Dataset schema (per-record fields, with type and observed length/value range):

| Field | Type | Range / values |
| --- | --- | --- |
| sha | null | n/a |
| last_modified | null | n/a |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5–122 |
| tags | listlengths | 1–1.84k |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
| tokens_length | listlengths | 1–723 |
| input_texts | listlengths | 1–61 |
| embeddings | listlengths | 768–768 |
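As a quick sanity check, here is a minimal sketch of loading such a dataset and inspecting the fields above with the `datasets` library; the repository id `user/model-cards-dataset` is a hypothetical placeholder, not the actual dataset name:

```python
from datasets import load_dataset

# Hypothetical repository id -- substitute the real dataset path.
ds = load_dataset("user/model-cards-dataset", split="train")

# The feature mapping should match the schema table above.
print(ds.features)

# Peek at one record's small scalar fields; the long text and
# embedding fields are skipped here for brevity.
row = ds[0]
print(row["id"], row["pipeline_tag"], row["created_at"])
```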
**Record 1** · sha: `null` · last_modified: `null` · library_name: `transformers`
# CodeTrans model for program synthesis

Pretrained model on a LISP-inspired programming-language DSL, using the T5-small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans).

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It was trained with single-task training on the program synthesis dataset.

## Intended uses & limitations

The model can be used to generate LISP-inspired DSL code from natural-language task descriptions.

### How to use

Here is how to use this model to generate LISP-inspired DSL code with the Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_program_synthese"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese", skip_special_tokens=True),
    device=0  # first GPU; drop this argument to run on CPU
)

tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```

Run this example in the [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/program%20synthesis/small_model.ipynb).

## Training data

The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Evaluation results

For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):

Test results:

| Language / Model      | LISP      |
| --------------------- | :-------: |
| CodeTrans-ST-Small    | 89.43     |
| CodeTrans-ST-Base     | 89.65     |
| CodeTrans-TF-Small    | 90.30     |
| CodeTrans-TF-Base     | 90.24     |
| CodeTrans-TF-Large    | 90.21     |
| CodeTrans-MT-Small    | 82.88     |
| CodeTrans-MT-Base     | 86.99     |
| CodeTrans-MT-Large    | 90.27     |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base  | 90.30     |
| CodeTrans-MT-TF-Large | 90.17     |
| State of the art      | 85.80     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
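`AutoModelWithLMHead` is deprecated in recent versions of `transformers`; as a hedged alternative, the same generation can be done directly with `AutoModelForSeq2SeqLM` and `generate`. The `max_length=128` value is an illustrative assumption, not taken from the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/code_trans_t5_small_program_synthese"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

description = ("you are given an array of numbers a and a number b , "
               "compute the difference of elements in a and b")

# Encode the natural-language description and generate DSL code.
inputs = tokenizer(description, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)  # max_length is an assumption
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```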
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
summarization
SEBIS/code_trans_t5_small_program_synthese
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
- tokens_length: `[46, 114]`
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
- embeddings: 768-dimensional float vector (values omitted)
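Each record's `embeddings` field is a 768-dimensional vector, so records can be compared by cosine similarity. A minimal sketch, with random placeholder vectors standing in for two real record embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Placeholders: in practice these are the "embeddings" fields of two records.
emb_a = np.random.rand(768)
emb_b = np.random.rand(768)
print(cosine_similarity(emb_a, emb_b))
```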
**Record 2** · sha: `null` · last_modified: `null` · library_name: `transformers`
# CodeTrans model for program synthesis

Pretrained model on a LISP-inspired programming-language DSL, using the T5-small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans).

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

## Intended uses & limitations

The model can be used to generate LISP-inspired DSL code from natural-language task descriptions.

### How to use

Here is how to use this model to generate LISP-inspired DSL code with the Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask", skip_special_tokens=True),
    device=0  # first GPU; drop this argument to run on CPU
)

tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```

Run this example in the [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/program%20synthesis/small_model.ipynb).

## Training data

The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

## Evaluation results

For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):

Test results:

| Language / Model      | LISP      |
| --------------------- | :-------: |
| CodeTrans-ST-Small    | 89.43     |
| CodeTrans-ST-Base     | 89.65     |
| CodeTrans-TF-Small    | 90.30     |
| CodeTrans-TF-Base     | 90.24     |
| CodeTrans-TF-Large    | 90.21     |
| CodeTrans-MT-Small    | 82.88     |
| CodeTrans-MT-Base     | 86.99     |
| CodeTrans-MT-Large    | 90.27     |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base  | 90.30     |
| CodeTrans-MT-TF-Large | 90.17     |
| State of the art      | 85.80     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
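For reference, a minimal sketch of the optimizer setup the training-procedure section describes, using the `Adafactor` implementation that ships with `transformers`; settings beyond the relative-step inverse square root schedule are assumptions:

```python
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# With lr=None and relative_step=True, Adafactor derives a step-dependent
# learning rate that decays with the inverse square root of the step count.
# scale_parameter and warmup_init are assumptions, not stated in the card.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    relative_step=True,
    scale_parameter=True,
    warmup_init=True,
)
```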
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
summarization
SEBIS/code_trans_t5_small_program_synthese_multitask
[ "transformers", "pytorch", "tf", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
- tokens_length: `[46, 63, 143]`
[ "passage: TAGS\n#transformers #pytorch #tf #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 440,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
- embeddings: 768-dimensional float vector (values omitted)
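Each record's `metadata` field is stored as a JSON string rather than a nested object; a minimal sketch of parsing it, with the record represented as a plain Python dict:

```python
import json

# A record's raw metadata field, as shown in the field lists above.
row = {
    "metadata": '{"tags": ["summarization"], "widget": '
                '[{"text": "you are given an array of numbers a and a number b , '
                'compute the difference of elements in a and b"}]}'
}

meta = json.loads(row["metadata"])
print(meta["tags"])               # ['summarization']
print(meta["widget"][0]["text"])  # the widget example prompt
```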
**Record 3** · sha: `null` · last_modified: `null` · library_name: `transformers`
# CodeTrans model for program synthesis

Pretrained model on a LISP-inspired programming-language DSL, using the T5-small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans).

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the program synthesis task for the LISP-inspired DSL code.

## Intended uses & limitations

The model can be used to generate LISP-inspired DSL code from natural-language task descriptions.

### How to use

Here is how to use this model to generate LISP-inspired DSL code with the Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_multitask_finetune", skip_special_tokens=True),
    device=0  # first GPU; drop this argument to run on CPU
)

tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```

Run this example in the [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/program%20synthesis/small_model.ipynb).

## Training data

The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256) and using only the dataset containing LISP-inspired DSL data.

## Evaluation results

For the code documentation tasks, the different models achieve the following results on the different programming languages (BLEU scores):

Test results:

| Language / Model      | LISP      |
| --------------------- | :-------: |
| CodeTrans-ST-Small    | 89.43     |
| CodeTrans-ST-Base     | 89.65     |
| CodeTrans-TF-Small    | 90.30     |
| CodeTrans-TF-Base     | 90.24     |
| CodeTrans-TF-Large    | 90.21     |
| CodeTrans-MT-Small    | 82.88     |
| CodeTrans-MT-Base     | 86.99     |
| CodeTrans-MT-Large    | 90.27     |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base  | 90.30     |
| CodeTrans-MT-TF-Large | 90.17     |
| State of the art      | 85.80     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
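The evaluation table reports corpus-level BLEU; a minimal sketch of computing such a score with `sacrebleu`, where the hypothesis and reference programs are illustrative placeholders rather than real model outputs:

```python
import sacrebleu

# Placeholder data: generated DSL programs and their references.
hypotheses = ["( map a ( lambda x ( - x b ) ) )"]
references = [["( map a ( lambda x ( - x b ) ) )"]]  # one reference stream

# corpus_bleu takes the hypotheses plus a list of reference streams,
# each aligned element-wise with the hypotheses.
score = sacrebleu.corpus_bleu(hypotheses, references)
print(score.score)  # 100.0 for an exact match
```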
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
summarization
SEBIS/code_trans_t5_small_program_synthese_multitask_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
- tokens_length: `[46, 63, 88, 112]`
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.11078450083732605, 0.0674130767583847, -0.0019478125032037497, 0.11080477386713028, 0.06499253958463669, 0.014756771735846996, 0.09246762841939926, 0.0780290961265564, -0.02419649064540863, 0.052982147783041, 0.07932750135660172, 0.027230143547058105, 0.04203030839562416, 0.1868920773267746, 0.04613323137164116, -0.14625222980976105, 0.0013399474555626512, 0.026733241975307465, -0.06437971442937851, 0.11087104678153992, 0.09171970188617706, -0.09162771701812744, 0.06655843555927277, -0.04695114493370056, -0.13858959078788757, 0.050906844437122345, -0.05691763758659363, -0.02356698550283909, 0.07078372687101364, 0.052370015531778336, 0.09084945917129517, -0.016314754262566566, 0.09611938893795013, -0.18415531516075134, -0.007611411157995462, 0.024508969858288765, 0.03271222859621048, 0.02186356857419014, 0.09106453508138657, 0.060439229011535645, 0.14585474133491516, -0.021319707855582237, 0.026651281863451004, 0.05441378802061081, -0.05924410745501518, -0.08439338207244873, -0.05602261424064636, 0.06697985529899597, 0.144474595785141, 0.09115125238895416, -0.016753315925598145, 0.011948097497224808, -0.06632519513368607, 0.10233451426029205, 0.12175959348678589, -0.18885329365730286, -0.029942147433757782, 0.06618079543113708, 0.06822255253791809, 0.050553902983665466, -0.03373464196920395, -0.0316673219203949, 0.09870781749486923, 0.025904033333063126, 0.042696088552474976, -0.07867813110351562, -0.048985160887241364, -0.0028634737245738506, -0.08733680844306946, -0.0554695799946785, 0.13975906372070312, 0.027853552252054214, -0.055402301251888275, -0.11654311418533325, -0.03807802498340607, -0.17294703423976898, 0.0385696180164814, 0.023614982143044472, 0.027403803542256355, -0.013604982756078243, 0.08238185942173004, -0.010867114178836346, -0.10901404172182083, -0.1226811334490776, 0.061458371579647064, 0.07055502384901047, 0.07646383345127106, 0.019929222762584686, 0.002577986801043153, 0.09361493587493896, 0.05521915480494499, -0.031388524919748306, -0.012774071656167507, 0.008926009759306908, -0.1248743012547493, 0.0546795018017292, -0.050664424896240234, -0.0814305767416954, -0.04143615439534187, 0.0658642128109932, 0.005303435027599335, 0.05139372497797012, 0.14746542274951935, 0.010074830614030361, -0.02184479497373104, 0.19151686131954193, 0.0010382583132013679, -0.06686850637197495, -0.007382353767752647, 0.026127146556973457, -0.010841146111488342, -0.015750249847769737, -0.0823633000254631, -0.030421379953622818, -0.019089728593826294, 0.07161156088113785, -0.12458911538124084, 0.02648170478641987, -0.04659580811858177, -0.028705541044473648, 0.05531591922044754, -0.14765343070030212, 0.025057367980480194, -0.0034814467653632164, -0.047285065054893494, -0.07588697224855423, 0.0828818753361702, -0.09669432044029236, -0.12428894639015198, 0.01746419630944729, -0.03630794584751129, -0.025987576693296432, -0.13209199905395508, -0.12136267125606537, -0.004249648656696081, -0.10599546134471893, 0.008710285648703575, -0.08926068991422653, -0.14270253479480743, -0.039747145026922226, 0.055741846561431885, -0.004657852463424206, -0.013527327217161655, -0.06137794628739357, 0.0114351911470294, -0.011330904439091682, -0.018388032913208008, 0.03798763081431389, -0.01813078671693802, 0.1024915799498558, 0.08184687793254852, 0.04871193692088127, 0.006855973973870277, 0.043606653809547424, -0.06168553605675697, 0.06289069354534149, -0.03831314668059349, 0.0875137597322464, -0.024602945894002914, 0.07256408035755157, -0.0767923891544342, -0.07711736857891083, 0.06569013744592667, 
0.059934794902801514, 0.06604533642530441, 0.038021866232156754, -0.13336026668548584, 0.01036225724965334, 0.13008610904216766, -0.09862183779478073, -0.11385989189147949, 0.08276661485433578, -0.007165058515965939, 0.046286843717098236, 0.0675736665725708, 0.14632859826087952, 0.1576310247182846, -0.0774587020277977, -0.025644300505518913, 0.059580959379673004, 0.08503846824169159, -0.09999407082796097, 0.08026769012212753, 0.038589634001255035, 0.02690044231712818, 0.03984583914279938, -0.01236681453883648, 0.07445068657398224, -0.008708223700523376, -0.0504705049097538, -0.015943489968776703, -0.09864462167024612, -0.052519962191581726, -0.005155874416232109, 0.035528723150491714, -0.04256204888224602, -0.07625923305749893, 0.02571265958249569, 0.16646727919578552, -0.11205265671014786, 0.03966045007109642, -0.08463342487812042, -0.03072439134120941, -0.09489942342042923, 0.019251642748713493, -0.12567223608493805, 0.029210185632109642, 0.04092399403452873, -0.03168579563498497, 0.07660260796546936, 0.08414055407047272, 0.0006304063135758042, 0.03953269496560097, -0.021990815177559853, -0.04583023488521576, -0.055499110370874405, -0.06886488199234009, -0.13096953928470612, -0.03570841625332832, -0.12504716217517853, -0.04785597324371338, -0.06134680658578873, -0.15880785882472992, 0.020464181900024414, -0.03856165334582329, 0.028852514922618866, 0.006438408512622118, -0.03874735161662102, 0.04077960550785065, 0.06522595137357712, -0.04230273887515068, -0.07939212024211884, 0.027400638908147812, 0.004039433319121599, -0.11295843869447708, -0.016448024660348892, -0.11494386941194534, -0.0440266951918602, 0.06542345136404037, 0.04260270297527313, -0.06965770572423935, 0.008744505234062672, -0.027698783203959465, -0.05334688350558281, 0.020514316856861115, -0.07469632476568222, 0.13101734220981598, 0.0026850749272853136, 0.15131156146526337, -0.14233161509037018, -0.07078082114458084, -0.03064807876944542, 0.016234442591667175, -0.00218068971298635, 0.17262472212314606, 0.046988654881715775, -0.032213497906923294, 0.03738782927393913, 0.02434130199253559, -0.020108496770262718, 0.1334298700094223, -0.015590489841997623, -0.10883782804012299, 0.015288573689758778, 0.10406084358692169, -0.01264361385256052, 0.1015818789601326, -0.07441016286611557, -0.022451309487223625, 0.006656634621322155, 0.004170908592641354, 0.041810303926467896, -0.13854211568832397, 0.01315988227725029, 0.05676177516579628, -0.06736787408590317, -0.027668075636029243, -0.012785614468157291, -0.05862564221024513, 0.042290300130844116, -0.025832844898104668, 0.0006595010636374354, 0.001286341343075037, -0.030707914382219315, -0.08591195195913315, 0.17930550873279572, -0.08542721718549728, -0.18270717561244965, -0.18658004701137543, 0.014119680039584637, -0.05658261850476265, 0.018474122509360313, 0.04563739895820618, -0.1284889280796051, -0.061172522604465485, -0.10070008784532547, 0.13332386314868927, -0.113389752805233, 0.0014151191571727395, 0.001764866290614009, 0.03331444412469864, 0.03595753014087677, -0.1833305060863495, 0.04817066341638565, -0.01583811640739441, -0.00270865554921329, -0.0021219009067863226, -0.026316482573747635, 0.09779561311006546, 0.11775678396224976, -0.06360901147127151, 0.019259465858340263, 0.001384356408379972, 0.20167647302150726, -0.03881895914673805, 0.0205726008862257, 0.20191895961761475, 0.004175583831965923, 0.03529244661331177, 0.05062253400683403, 0.020411403849720955, -0.13753166794776917, 0.05006973072886467, 0.04657865688204765, -0.03632291778922081, -0.26480066776275635, 
-0.014662007801234722, -0.06768614798784256, 0.026686297729611397, 0.10487101972103119, 0.0621584914624691, -0.1431376188993454, 0.04316691309213638, -0.011010592803359032, 0.1636897176504135, -0.05253055691719055, 0.04313734546303749, 0.01783089153468609, -0.00626917090266943, -0.002001746091991663, -0.08893819153308868, 0.002162389690056443, 0.06176150590181351, 0.09451726078987122, 0.22712388634681702, -0.06834758818149567, 0.2246677577495575, 0.02972993813455105, 0.08155365288257599, -0.00925465114414692, 0.14642548561096191, -0.1059608981013298, 0.021953219547867775, 0.013715480454266071, -0.026840195059776306, -0.08223604410886765, 0.08152748644351959, -0.03228602185845375, 0.024907084181904793, -0.06887245178222656, 0.02838767133653164, -0.0077375927940011024, 0.17308276891708374, 0.04086815565824509, -0.19533437490463257, -0.06864652037620544, 0.024761425331234932, -0.0756947472691536, -0.11641082167625427, 0.060253798961639404, 0.19010721147060394, 0.00030490668723359704, -0.0026522555854171515, -0.008835500106215477, 0.13527843356132507, -0.0687752515077591, -0.0358131006360054, 0.04134676232933998, 0.04884057864546776, 0.04171907901763916, 0.1318609118461609, -0.23379379510879517, 0.10502611100673676, 0.021258695051074028, 0.07873101532459259, -0.04080605506896973, 0.07795333862304688, -0.07126659154891968, -0.011202818714082241, 0.11721408367156982, 0.007693445775657892, -0.06272132694721222, -0.21582718193531036, -0.07915037870407104, 0.010117890313267708, 0.08052893728017807, -0.04954819008708, 0.08058546483516693, 0.009351354092359543, 0.05084044486284256, -0.026530887931585312, -0.09963877499103546, -0.07415976375341415, -0.167103573679924, 0.006895384285598993, 0.019540760666131973, -0.03697632625699043, -0.03824213892221451, 0.012304889038205147, -0.04432576894760132, 0.23519399762153625, -0.14667634665966034, -0.10998831689357758, -0.09367365390062332, 0.03984326496720314, 0.14818410575389862, -0.08705287426710129, 0.01698249578475952, 0.03530678525567055, 0.02053794078528881, -0.03735405579209328, -0.0759957879781723, 0.03612688183784485, -0.05026094615459442, -0.08679252862930298, -0.03650062903761864, 0.08874604851007462, 0.0012297389330342412, 0.02891027182340622, -0.018444247543811798, -0.08393500745296478, -0.03641417995095253, -0.12080079317092896, -0.060920704156160355, 0.015368029475212097, 0.035385072231292725, -0.013734675012528896, -0.12435859441757202, 0.1290196180343628, -0.016823317855596542, -0.09694492816925049, 0.04901118949055672, 0.21792545914649963, -0.06353750824928284, 0.05136142671108246, 0.12016957253217697, -0.07997416704893112, -0.1671380251646042, -0.0661819651722908, 0.05970819294452667, 0.07826996594667435, -0.022090943530201912, -0.18140381574630737, 0.06474296003580093, -0.0020998790860176086, 0.02931295335292816, 0.05289090797305107, -0.28079092502593994, -0.14443828165531158, 0.05866353586316109, 0.09327808767557144, 0.015009751543402672, -0.12362759560346603, -0.041889019310474396, -0.05966883897781372, -0.057563185691833496, 0.04635816439986229, 0.11619078367948532, 0.12386336922645569, -0.039068553596735, 0.024922305718064308, 0.03289079666137695, -0.02277737483382225, 0.10354986786842346, 0.028799233958125114, 0.12255732715129852, -0.03716110438108444, 0.03254925459623337, 0.06347054243087769, -0.05906815826892853, 0.1579003483057022, -0.14930565655231476, 0.07219769805669785, -0.2376452088356018, -0.06697741150856018, -0.015977002680301666, -0.02204047329723835, -0.04356516897678375, -0.06497664749622345, -0.09119690954685211, 
0.002580564934760332, 0.04797006770968437, -0.022086182609200478, 0.030077151954174042, -0.027641115710139275, -0.06294383108615875, 0.09622935950756073, 0.08492789417505264, -0.010783388279378414, -0.07901359349489212, 0.0037652249448001385, 0.030362965539097786, 0.09157659113407135, -0.17060810327529907, 0.000707500206772238, 0.12620824575424194, -0.0003210935683455318, 0.09059976786375046, 0.007910571061074734, -0.0845770463347435, 0.021120883524417877, 0.07692841440439224, -0.06412798166275024, -0.10756436735391617, -0.023425202816724777, -0.030735984444618225, -0.07568801194429398, 0.035506755113601685, 0.08674241602420807, -0.06439335644245148, -0.019016003236174583, -0.022404801100492477, 0.003646070836111903, -0.06996486335992813, 0.19019341468811035, 0.02558368444442749, 0.07294659316539764, -0.07054996490478516, 0.08510386943817139, 0.10166839510202408, -0.11529696732759476, 0.022832974791526794, 0.136299729347229, -0.0691685602068901, -0.043044596910476685, 0.056247562170028687, 0.07689512521028519, -0.022621477022767067, -0.061158668249845505, -0.07107195258140564, -0.060550756752491, 0.047012194991111755, 0.017761999741196632, 0.0705106183886528, 0.08844034373760223, -0.022621171548962593, 0.0135835325345397, -0.09751727432012558, 0.08182135224342346, 0.07617419958114624, 0.04793839156627655, -0.16161207854747772, 0.1509684920310974, 0.017786575481295586, 0.10323917865753174, -0.002196753164753318, 0.04462287947535515, -0.06328760087490082, 0.048495106399059296, -0.012440432794392109, 0.01739172264933586, -0.010959168896079063, 0.04395156726241112, -0.016774266958236694, 0.038064710795879364, -0.009202688001096249, 0.05507190153002739, -0.07250960916280746, -0.0277860127389431, -0.03117639757692814, 0.033476147800683975, -0.06152262166142464, -0.0431910939514637, -0.006111674942076206, -0.07788139581680298, 0.10097389668226242, -0.052060917019844055, -0.03232334554195404, -0.008029607124626637, -0.0346563421189785, 0.09823707491159439, 0.021700169891119003, 0.05393427610397339, -0.03140049800276756, 0.005324389785528183, 0.02356189489364624, 0.02209956757724285, -0.01776416227221489, -0.0239563025534153, 0.040850453078746796, -0.15542076528072357, -0.06355392187833786, -0.06911828368902206, -0.061938755214214325, -0.06290673464536667, 0.08259619772434235, 0.07595878094434738, 0.07602126896381378, 0.06891679018735886, -0.002200962509959936, 0.0022370624355971813, -0.1352785974740982, -0.024877207353711128, 0.06364800035953522, 0.0029769674874842167, -0.11124292761087418, -0.07654090970754623, 0.05200810357928276, -0.027350250631570816, 0.13513988256454468, -0.01855052076280117, -0.0016452129930257797, -0.03644837439060211, -0.05408287048339844, 0.006397332530468702, 0.010945386253297329, 0.19351325929164886, -0.10828530043363571, 0.03324837610125542, -0.001117083476856351, -0.014675220474600792, 0.04146585613489151, 0.17357729375362396, 0.06810422241687775, 0.12875570356845856, 0.07836750894784927, 0.09987498819828033, -0.042330946773290634, -0.033167507499456406, -0.1618579924106598, 0.03285054862499237, 0.01765032671391964, 0.06432320922613144, -0.05814974382519722, 0.14163018763065338, 0.08839886635541916, -0.10909455269575119, 0.0820067971944809, 0.038126762956380844, -0.09488674253225327, -0.046777524054050446, -0.10814755409955978, -0.03516211360692978, -0.07746963948011398, 0.01615835539996624, -0.09415196627378464, 0.02351340837776661, 0.0324571430683136, 0.04160148277878761, -0.0182289220392704, 0.1581168919801712, -0.02817072719335556, -0.02787400782108307, 
0.020388519391417503, 0.01161064114421606, 0.020403774455189705, 0.07810039818286896, -0.0018803534330800176, 0.06703461706638336, -0.04238547757267952, 0.05261294171214104, 0.03085148148238659, -0.013408686965703964, 0.02355191670358181, 0.011126605793833733, 0.006105300970375538, -0.04576542228460312, 0.015268195420503616, 0.09425612539052963, 0.18522918224334717, 0.02052268572151661, -0.06898615509271622, -0.05831240117549896, 0.13645587861537933, -0.05963094159960747, -0.03218596428632736, -0.07691358029842377, 0.09839612245559692, 0.047664858400821686, 0.026688750833272934, 0.01057126559317112, -0.07540536671876907, -0.053495995700359344, 0.23572680354118347, 0.04569287225604057, -0.021256400272250175, -0.03445333242416382, 0.001206941087730229, 0.0009060442680492997, -0.0785297229886055, 0.15521296858787537, 0.006579397711902857, 0.20035813748836517, -0.0030557201243937016, -0.004645754117518663, -0.04484547674655914, -0.025563111528754234, -0.029472507536411285, 0.19277547299861908, -0.046255797147750854, 0.006748587358742952, -0.07521231472492218, -0.012766384519636631, 0.019675809890031815, -0.11631142348051071, 0.134176567196846, -0.0611351802945137, -0.06254604458808899, 0.026445796713232994, 0.07261747121810913, -0.031363051384687424, 0.03552476689219475, -0.03920168802142143, 0.04997275397181511, 0.07424319535493851, -0.027394291013479233, -0.11730792373418808, -0.10568960756063461, 0.05129726231098175, -0.026074998080730438, 0.16201439499855042, 0.024119749665260315, 0.08645211905241013, 0.06084521859884262, 0.005188295152038336, -0.1022956445813179, 0.102561354637146, 0.03605519235134125, 0.012813572771847248, 0.0770268514752388, 0.13825978338718414, -0.044338200241327286, 0.13255847990512848, -0.03409203141927719, -0.020563287660479546, -0.004231987055391073, -0.001294742221944034, -0.010369446128606796, -0.13136419653892517, -0.004928532987833023, -0.09099975228309631, 0.1300048828125, 0.16786067187786102, -0.04110115021467209, -0.032144367694854736, -0.047918688505887985, 0.0780797228217125, -0.03210814297199249, 0.06589147448539734, 0.00733848474919796, -0.16334031522274017, 0.016092849895358086, 0.04796462878584862, 0.0380910187959671, -0.16661351919174194, -0.045843761414289474, -0.039201851934194565, -0.05175621062517166, -0.08906912058591843, 0.14897461235523224, 0.07444028556346893, 0.017750388011336327, -0.036042142659425735, -0.18892274796962738, -0.019870242103934288, 0.04909444972872734, -0.14621427655220032, -0.11406607180833817 ]
null
null
transformers
# CodeTrans model for program synthesis

## Table of Contents
- [Model Details](#model-details)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [Training](#training)
- [Evaluation](#evaluation)
- [Environmental Impact](#environmental-impact)
- [Citation Information](#citation-information)

## Model Details
- **Model Description:** This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the program synthesis task for lisp-inspired DSL code.
- **Developed by:** [Ahmed Elnaggar](https://www.linkedin.com/in/prof-ahmed-elnaggar/), [Wei Ding](https://www.linkedin.com/in/wei-ding-92561270/)
- **Model Type:** Summarization
- **Language(s):** English
- **License:** Unknown
- **Resources for more information:**
  - [Research Paper](https://arxiv.org/pdf/2104.02443.pdf)
  - [GitHub Repo](https://github.com/agemagician/CodeTrans)

## How to Get Started With the Model

Here is how to use this model to generate lisp-inspired DSL code using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/transfer%20learning%20fine-tuning/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Uses

#### Direct Use

The model could be used to generate lisp-inspired DSL code given a human language description of the task.

## Risks, Limitations and Biases

As detailed in this model’s [publication](https://arxiv.org/pdf/2104.02443.pdf), this model makes use of the dataset [One Billion Word Language Model Benchmark corpus](https://www.researchgate.net/publication/259239818_One_Billion_Word_Benchmark_for_Measuring_Progress_in_Statistical_Language_Modeling) in order to gather the self-supervised English data samples.

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)). Language models pretrained on text corpora such as the One Billion Word Language Model Benchmark corpus have been examined further: [Ngo, Helen & Araújo et al. (2021)](https://www.researchgate.net/publication/355582954_No_News_is_Good_News_A_Critique_of_the_One_Billion_Word_Benchmark) report that models trained on this corpus
> “generate text in the linguistic style of news, without any grounding in the real world.
In addition to potential harms from models which are inadvertently optimized for generating fake news.”

The same publication continues to warn that the One Billion Word Language Model Benchmark corpus

> contains sentences which contain words commonly found on blocklists. While these sentences may have plausibly been used in expository contexts within the article, the destructive sentence-level preprocessing and shuffling applied to lm1b [One Billion Word Language Model Benchmark corpus] removes all long-range structure from the text and makes it infeasible to track the context and intent of individual examples. [Ngo, Helen & Araújo et al. (2021)](https://www.researchgate.net/publication/355582954_No_News_is_Good_News_A_Critique_of_the_One_Billion_Word_Benchmark)

## Training

#### Training Data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

The authors provide additional notes about the vocabulary in the [associated paper](https://arxiv.org/pdf/2104.02443.pdf):

> We used the SentencePiece model (Kudo, 2018) to construct the vocabulary for this research, as well as to decode and encode the input/output.

## Training procedure

#### Preprocessing

##### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

###### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp-inspired DSL data.

## Evaluation

#### Results

For the program synthesis task, the different models achieve the following results (in BLEU score):

Test results :

|   Language / Model   |      LISP      |
| -------------------- | :------------: |
|   CodeTrans-ST-Small    |      89.43     |
|   CodeTrans-ST-Base     |      89.65     |
|   CodeTrans-TF-Small    |      90.30     |
|   CodeTrans-TF-Base     |      90.24     |
|   CodeTrans-TF-Large    |      90.21     |
|   CodeTrans-MT-Small    |      82.88     |
|   CodeTrans-MT-Base     |      86.99     |
|   CodeTrans-MT-Large    |      90.27     |
|   CodeTrans-MT-TF-Small |    **90.31**   |
|   CodeTrans-MT-TF-Base  |      90.30     |
|   CodeTrans-MT-TF-Large |      90.17     |
|   State of the art   |      85.80     |

## Environmental Impact

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). We present the hardware type based on the [associated paper](https://arxiv.org/pdf/2105.09680.pdf).

- **Hardware Type:** Nvidia RTX 8000 GPUs
- **Hours used:** Unknown
- **Cloud Provider:** GCP TPU v2-8 and v3-8.
- **Compute Region:** Unknown
- **Carbon Emitted:** Unknown

## Citation Information

```bibtex
@misc{elnaggar2021codetrans,
    title={CodeTrans: Towards Cracking the Language of Silicon's Code Through Self-Supervised Deep Learning and High Performance Computing},
    author={Ahmed Elnaggar and Wei Ding and Llion Jones and Tom Gibbs and Tamas Feher and Christoph Angerer and Silvia Severini and Florian Matthes and Burkhard Rost},
    year={2021},
    eprint={2104.02443},
    archivePrefix={arXiv},
    primaryClass={cs.SE}
}
```
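The card above describes AdaFactor-based training but includes no training code. Below is a minimal, hedged sketch of how a comparable fine-tuning optimizer could be configured with the `transformers` library's Adafactor implementation; the flags shown are library options and assumptions, not the authors' reported configuration.

```python
# Illustrative sketch only (not the authors' training code): configuring the
# Adafactor optimizer from the `transformers` library for this checkpoint.
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune"
)

# relative_step/warmup_init give the time-dependent, inverse-square-root-style
# learning-rate behaviour described in the card; all flags here are assumptions.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```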
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
summarization
SEBIS/code_trans_t5_small_program_synthese_transfer_learning_finetune
[ "transformers", "pytorch", "tf", "jax", "t5", "feature-extraction", "summarization", "arxiv:2104.02443", "arxiv:1910.09700", "arxiv:2105.09680", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[ "2104.02443", "1910.09700", "2105.09680" ]
[]
TAGS #transformers #pytorch #tf #jax #t5 #feature-extraction #summarization #arxiv-2104.02443 #arxiv-1910.09700 #arxiv-2105.09680 #endpoints_compatible #has_space #text-generation-inference #region-us
CodeTrans model for program synthesis
=====================================

Table of Contents
-----------------

* Model Details
* How to Get Started With the Model
* Uses
* Risks, Limitations and Biases
* Training
* Evaluation
* Environmental Impact
* Citation Information

Model Details
-------------

* Model Description: This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the program synthesis task for lisp-inspired DSL code.
* Developed by: Ahmed Elnaggar, Wei Ding
* Model Type: Summarization
* Language(s): English
* License: Unknown
* Resources for more information:
	+ Research Paper
	+ GitHub Repo

How to Get Started With the Model
---------------------------------

Here is how to use this model to generate lisp-inspired DSL code using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Uses
----

#### Direct Use

The model could be used to generate lisp-inspired DSL code given a human language description of the task.

Risks, Limitations and Biases
-----------------------------

As detailed in this model’s publication, this model makes use of the dataset One Billion Word Language Model Benchmark corpus in order to gather the self-supervised English data samples.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)). Language models pretrained on text corpora such as the One Billion Word Language Model Benchmark corpus have been examined further: Ngo, Helen & Araújo et al. (2021) report that models trained on this corpus

> “generate text in the linguistic style of news, without any grounding in the real world. In addition to potential harms from models which are inadvertently optimized for generating fake news.”

The same publication continues to warn that the One Billion Word Language Model Benchmark corpus

> contains sentences which contain words commonly found on blocklists. While these sentences may have plausibly been used in expository contexts within the article, the destructive sentence-level preprocessing and shuffling applied to lm1b [One Billion Word Language Model Benchmark corpus] removes all long-range structure from the text and makes it infeasible to track the context and intent of individual examples.

Ngo, Helen & Araújo et al. (2021)

Training
--------

#### Training Data

The supervised training tasks datasets can be downloaded on Link

The authors provide additional notes about the vocabulary in the associated paper:

> We used the SentencePiece model (Kudo, 2018) to construct the vocabulary for this research, as well as to decode and encode the input/output.

Training procedure
------------------

#### Preprocessing

##### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
###### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp-inspired DSL data.

Evaluation
----------

#### Results

For the program synthesis task, the different models achieve the following results (in BLEU score):

Test results :

Environmental Impact
--------------------

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

* Hardware Type: Nvidia RTX 8000 GPUs
* Hours used: Unknown
* Cloud Provider: GCP TPU v2-8 and v3-8.
* Compute Region: Unknown
* Carbon Emitted: Unknown
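The Environmental Impact fields above are mostly Unknown. As a worked illustration of what the Machine Learning Impact calculator computes, here is a back-of-envelope sketch; every input figure is a hypothetical placeholder, not a value reported for CodeTrans.

```python
# Back-of-envelope emissions estimate (all inputs are hypothetical placeholders,
# since the card reports hours and emissions as Unknown).
def co2_estimate_kg(power_watts: float, hours: float, kg_co2_per_kwh: float) -> float:
    energy_kwh = power_watts * hours / 1000.0  # device energy in kWh
    return energy_kwh * kg_co2_per_kwh         # apply grid carbon intensity

# e.g. a single 260 W accelerator running 100 h on a 0.4 kgCO2/kWh grid:
print(co2_estimate_kg(260, 100, 0.4))  # -> 10.4
```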
[ "#### Direct Use\n\n\nThe model could be used to generate lisp inspired DSL code given the human language description tasks.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nAs detailed in this model’s publication, this model makes use of the data-set One Billion Word Language Model Benchmark corpus in order to gather the self-supervised English data samples.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).\nAs such, it should be noted that language models that are pretrained from text corpus such as the One Billion Word Word Language Model Benchmark corpus have been further explored (e.g by Ngo, Helen & Araújo et al(2021) reports that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> “generate text in the linguistic style of news, without any grounding in the real world. In addition to potential harms from models which are inadvertently optimized for generating fake news.”\n> \n> \n> \n\n\nThe aforementioned publication continues to warn that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> contains sentences which contain words commonly found on blocklists. While these sentences may have plausibly been used in expository contexts within the article, the destructive sentence-level preprocessing and shuffling applied to lm1b [One Billion Word Word Language Model Benchmark corpus] removes all long-range structure from the text and makes it infeasible to track the context and intent of individual examples.\n> \n> \n> \n\n\nNgo, Helen & Araújo et al(2021)\n\n\nTraining\n--------", "#### Training Data\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nThe authors provide additionally notes about the vocabulary used, in the associated paper:\n\n\n\n> \n> We used the SentencePiece model (Kudo, 2018) to construct the vocabulary for this research, as well as to decode and encode the input/output.\n> \n> \n> \n\n\nTraining procedure\n------------------", "#### Preprocessing", "##### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "###### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation\n----------", "#### Results\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.\n\n\n* Hardware Type: Nvidia RTX 8000 GPUs\n* Hours used: Unknown\n* Cloud Provider: GCC TPU v2-8 and v3-8.\n* Compute Region: Unknown\n* Carbon Emitted: Unknown" ]
[ "TAGS\n#transformers #pytorch #tf #jax #t5 #feature-extraction #summarization #arxiv-2104.02443 #arxiv-1910.09700 #arxiv-2105.09680 #endpoints_compatible #has_space #text-generation-inference #region-us \n", "#### Direct Use\n\n\nThe model could be used to generate lisp inspired DSL code given the human language description tasks.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nAs detailed in this model’s publication, this model makes use of the data-set One Billion Word Language Model Benchmark corpus in order to gather the self-supervised English data samples.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).\nAs such, it should be noted that language models that are pretrained from text corpus such as the One Billion Word Word Language Model Benchmark corpus have been further explored (e.g by Ngo, Helen & Araújo et al(2021) reports that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> “generate text in the linguistic style of news, without any grounding in the real world. In addition to potential harms from models which are inadvertently optimized for generating fake news.”\n> \n> \n> \n\n\nThe aforementioned publication continues to warn that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> contains sentences which contain words commonly found on blocklists. While these sentences may have plausibly been used in expository contexts within the article, the destructive sentence-level preprocessing and shuffling applied to lm1b [One Billion Word Word Language Model Benchmark corpus] removes all long-range structure from the text and makes it infeasible to track the context and intent of individual examples.\n> \n> \n> \n\n\nNgo, Helen & Araújo et al(2021)\n\n\nTraining\n--------", "#### Training Data\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nThe authors provide additionally notes about the vocabulary used, in the associated paper:\n\n\n\n> \n> We used the SentencePiece model (Kudo, 2018) to construct the vocabulary for this research, as well as to decode and encode the input/output.\n> \n> \n> \n\n\nTraining procedure\n------------------", "#### Preprocessing", "##### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "###### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation\n----------", "#### Results\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\nEnvironmental Impact\n--------------------\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.\n\n\n* Hardware Type: Nvidia RTX 8000 GPUs\n* Hours used: Unknown\n* Cloud Provider: GCC TPU v2-8 and v3-8.\n* Compute Region: Unknown\n* Carbon Emitted: Unknown" ]
[ 79, 380, 84, 5, 87, 62, 123 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #t5 #feature-extraction #summarization #arxiv-2104.02443 #arxiv-1910.09700 #arxiv-2105.09680 #endpoints_compatible #has_space #text-generation-inference #region-us \n#### Direct Use\n\n\nThe model could be used to generate lisp inspired DSL code given the human language description tasks.\n\n\nRisks, Limitations and Biases\n-----------------------------\n\n\nAs detailed in this model’s publication, this model makes use of the data-set One Billion Word Language Model Benchmark corpus in order to gather the self-supervised English data samples.\n\n\nSignificant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).\nAs such, it should be noted that language models that are pretrained from text corpus such as the One Billion Word Word Language Model Benchmark corpus have been further explored (e.g by Ngo, Helen & Araújo et al(2021) reports that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> “generate text in the linguistic style of news, without any grounding in the real world. In addition to potential harms from models which are inadvertently optimized for generating fake news.”\n> \n> \n> \n\n\nThe aforementioned publication continues to warn that the One Billion Word Word Language Model Benchmark corpus\n\n\n\n> \n> contains sentences which contain words commonly found on blocklists. While these sentences may have plausibly been used in expository contexts within the article, the destructive sentence-level preprocessing and shuffling applied to lm1b [One Billion Word Word Language Model Benchmark corpus] removes all long-range structure from the text and makes it infeasible to track the context and intent of individual examples.\n> \n> \n> \n\n\nNgo, Helen & Araújo et al(2021)\n\n\nTraining\n--------" ]
[ 0.0052181752398610115, 0.1618373841047287, -0.004518742673099041, 0.0670597180724144, 0.0656760185956955, -0.014495009556412697, 0.08652421832084656, 0.07516150176525116, -0.08342234045267105, 0.032811734825372696, 0.0243645291775465, 0.06049541383981705, 0.014714195393025875, 0.0009224918321706355, 0.07572025060653687, -0.18992185592651367, 0.08081895112991333, -0.08049782365560532, 0.11344615370035172, 0.09016035497188568, 0.07024114578962326, -0.014059833250939846, 0.07507609575986862, 0.02503865584731102, -0.024757144972682, 0.014971088618040085, -0.02091902308166027, -0.044006314128637314, 0.08952806144952774, 0.0867822915315628, 0.009729013778269291, -0.012430382892489433, -0.012386485934257507, -0.1609003096818924, 0.0156947523355484, 0.081857830286026, -0.00011647095379885286, 0.037247274070978165, 0.05272269248962402, -0.04664124920964241, 0.13402187824249268, -0.05398048833012581, 0.02039174921810627, 0.05063951388001442, -0.14834919571876526, -0.13021664321422577, -0.07865563035011292, -0.030190609395503998, -0.02251610718667507, 0.12091868370771408, -0.060577407479286194, 0.18421177566051483, 0.03200256824493408, 0.019118355587124825, 0.1882372498512268, -0.2140977829694748, -0.0017767418175935745, 0.03736133500933647, 0.002987558487802744, 0.048121362924575806, -0.007532568648457527, 0.047514937818050385, 0.034718383103609085, 0.052344340831041336, 0.00336700351908803, -0.07409597933292389, -0.015157131478190422, -0.06283994764089584, -0.07424908131361008, -0.019768359139561653, 0.15498723089694977, -0.012611191719770432, -0.08746496587991714, -0.15650853514671326, -0.02952035702764988, 0.12853944301605225, 0.010722284205257893, -0.023975437507033348, -0.017364246770739555, -0.0038334669079631567, 0.12416059523820877, -0.11719196289777756, -0.05913986265659332, 0.019622357562184334, -0.04773597791790962, 0.1657203733921051, 0.029113972559571266, 0.04876949265599251, -0.02163652703166008, 0.0929398164153099, -0.04179947450757027, -0.03851926326751709, 0.012299751862883568, -0.12331429123878479, -0.04613769054412842, 0.047985561192035675, -0.05060992389917374, -0.08777174353599548, -0.018377909436821938, -0.011602794751524925, -0.035938896238803864, 0.028151508420705795, -0.06757169961929321, 0.030589431524276733, 0.1162208542227745, 0.011341501027345657, -0.1283346712589264, -0.07191658765077591, 0.04787886515259743, 0.02289518713951111, 0.03318296745419502, 0.009206424467265606, -0.03173764422535896, 0.000378892058506608, -0.04006767272949219, 0.054404184222221375, 0.06656889617443085, 0.0564054399728775, -0.0569995678961277, -0.053336743265390396, 0.11648327112197876, -0.08471141010522842, -0.0536019429564476, -0.024095308035612106, -0.029233505949378014, 0.003468880197033286, 0.10142294317483902, -0.01841788738965988, -0.1149659976363182, -0.05872653052210808, -0.09266139566898346, -0.03295815736055374, -0.057584717869758606, -0.12981976568698883, 0.005287291947752237, -0.10211484879255295, -0.0210175272077322, -0.09824492782354355, -0.1956803947687149, -0.0701037272810936, 0.014478208497166634, -0.0497276671230793, -0.013640259392559528, -0.05174495279788971, -0.028032982721924782, -0.0761432945728302, 0.006465699523687363, 0.037690065801143646, -0.022594138979911804, 0.03525057062506676, -0.05720137059688568, 0.025759562849998474, -0.05544444918632507, 0.024050768464803696, -0.12391290068626404, 0.007791758980602026, -0.17552195489406586, 0.17080365121364594, -0.03343099728226662, -0.00896019209176302, -0.0348021425306797, -0.04216615483164787, -0.1409427374601364, 
0.09522473067045212, 0.008934768848121166, 0.12298429012298584, -0.18845580518245697, -0.02514101006090641, 0.14239302277565002, -0.11601348221302032, -0.010315310209989548, 0.08792414516210556, -0.08236850798130035, 0.172424778342247, 0.11515148729085922, 0.11122176796197891, 0.020090388134121895, 0.03700440376996994, -0.032727520912885666, -0.07886765152215958, -0.025068135932087898, 0.15106350183486938, -0.01413209829479456, -0.08109875023365021, 0.02043730579316616, 0.04878688603639603, -0.023520493879914284, -0.054257139563560486, 0.009582036174833775, -0.03150291368365288, 0.030150674283504486, -0.003159481333568692, 0.021592114120721817, -0.07408704608678818, -0.034477438777685165, -0.029513122513890266, -0.09080925583839417, 0.025317097082734108, 0.05536068230867386, 0.009785079397261143, 0.09258376061916351, -0.029086893424391747, -0.05466781184077263, -0.06212809681892395, 0.044858478009700775, -0.24061469733715057, -0.028575794771313667, 0.026655040681362152, -0.14872382581233978, 0.17521753907203674, 0.07598096132278442, -0.02250807359814644, 0.027382608503103256, -0.0250097569078207, 0.05494997277855873, -0.025117095559835434, -0.040864814072847366, -0.06784425675868988, -0.1290769875049591, 0.02880069613456726, -0.077863909304142, 0.01469044853001833, -0.07812526077032089, 0.014577122405171394, -0.06592373549938202, 0.022300971671938896, 0.05392696335911751, -0.07260197401046753, 0.06322579085826874, 0.016008540987968445, 0.012246459722518921, -0.029489874839782715, -0.007317804265767336, -0.004645470529794693, -0.04685116931796074, 0.07306116074323654, -0.2188444584608078, -0.06366217881441116, 0.10642574727535248, -0.06524073332548141, -0.059860214591026306, 0.038531407713890076, -0.009703726507723331, -0.014700675383210182, -0.10838188976049423, -0.045097921043634415, 0.22165662050247192, 0.018951939418911934, 0.05693673714995384, -0.13529616594314575, -0.026353292167186737, -0.010845760814845562, -0.05596550926566124, -0.011907268315553665, 0.05204375833272934, -0.07682493329048157, -0.12901519238948822, 0.011021818965673447, 0.044659681618213654, -0.0003221825463697314, 0.16977693140506744, 0.0891229510307312, -0.07306283712387085, -0.031558576971292496, -0.053021062165498734, -0.032392945140600204, 0.0704764574766159, -0.07912649214267731, 0.022438349202275276, 0.052858855575323105, 0.057072822004556656, 0.08169768750667572, -0.035485800355672836, 0.019041121006011963, 0.006344485562294722, -0.042315736413002014, 0.007613995112478733, 0.003617306239902973, 0.021436240524053574, 0.11131207644939423, -0.021288808435201645, 0.08554331213235855, -0.020086027681827545, -0.04323180764913559, -0.09159693121910095, 0.13167054951190948, -0.10961133986711502, -0.33335497975349426, -0.08479291200637817, 0.13638457655906677, -0.09257637709379196, 0.04539745673537254, 0.04987648129463196, -0.03982992097735405, -0.07969998568296432, -0.11112203449010849, 0.09440066665410995, 0.03950329124927521, -0.08740366250276566, -0.06206951662898064, 0.034542281180620193, 0.008239533752202988, -0.1288728415966034, -0.007750969845801592, -0.03218163549900055, -0.03600312024354935, -0.04749353975057602, 0.01241262722760439, 0.05080614984035492, 0.01808779127895832, 0.0721738338470459, -0.03686611354351044, 0.014017226174473763, 0.15658095479011536, -0.023129543289542198, 0.02743985317647457, 0.11844054609537125, -0.07132194936275482, 0.015298238024115562, -0.005787726491689682, -0.007590038701891899, -0.04658995941281319, 0.07065387070178986, 0.11829736083745956, -0.06554795801639557, 
-0.20386499166488647, -0.0789206251502037, 0.01721053011715412, -0.05247117578983307, -0.025614703074097633, 0.045747868716716766, -0.007135515566915274, 0.04740811884403229, -0.05408916622400284, -0.00505090644583106, 0.07502520829439163, 0.06332731246948242, 0.06641048938035965, -0.07764258235692978, 0.07601208984851837, -0.0711916983127594, -0.04314028099179268, 0.0560833178460598, 0.03591063991189003, 0.18176576495170593, -0.00282737473025918, 0.16085612773895264, 0.09667056053876877, 0.009590272791683674, 0.015001201070845127, 0.04981924220919609, 0.036221399903297424, 0.03451019525527954, -0.0764753445982933, -0.07661042362451553, -0.01903141289949417, 0.10453972220420837, -0.007953677326440811, 0.01910010725259781, -0.002054263837635517, -0.0020766083616763353, 0.13665741682052612, 0.1089797392487526, 0.001915464410558343, -0.11431806534528732, -0.004178135190159082, 0.06718481332063675, -0.06561136245727539, -0.09264547377824783, 0.04161708801984787, 0.033799782395362854, -0.12051352113485336, -0.00948011502623558, 0.03181976079940796, 0.07340265810489655, -0.17854580283164978, 0.04135100543498993, -0.06093272939324379, -0.00025886541698127985, -0.0379563607275486, 0.060211505740880966, -0.18369938433170319, 0.11328886449337006, 0.025093799456954002, -0.015712102875113487, -0.0865808054804802, -0.014256652444601059, -0.008573222905397415, -0.10932256281375885, 0.16295364499092102, 0.0011859501246362925, 0.0633549690246582, 0.05998557060956955, -0.046797141432762146, 0.035735003650188446, 0.054153021425008774, -0.07517319917678833, 0.08084633946418762, 0.011017786338925362, 0.03195912018418312, -0.03540754318237305, 0.08186789602041245, -0.1488586813211441, -0.14733265340328217, 0.038459084928035736, -0.06968539953231812, -0.02049749344587326, -0.05867382511496544, -0.015743598341941833, -0.06109702214598656, 0.1357627511024475, -0.0435064323246479, -0.11620214581489563, -0.07002100348472595, -0.004358635284006596, 0.12497960031032562, -0.028189726173877716, -0.016511987894773483, 0.011790775693953037, 0.015523760579526424, -0.054425884038209915, -0.06468719989061356, 0.053560350090265274, -0.03869498893618584, -0.104683056473732, -0.0640314519405365, 0.0854528471827507, 0.1316155642271042, 0.04084712639451027, 0.0111836027354002, -0.028206851333379745, 0.037262555211782455, -0.1284082978963852, -0.049263257533311844, 0.08102633059024811, 0.02222472056746483, 0.11287400126457214, -0.0902000144124031, -0.06000898778438568, -0.12256047129631042, 0.016976214945316315, 0.08510123938322067, 0.14044053852558136, -0.009500505402684212, 0.11729999631643295, 0.23699872195720673, -0.13284549117088318, -0.1563682109117508, 0.010850788094103336, -0.007590144407004118, -0.010113882832229137, 0.05546633526682854, -0.14528220891952515, 0.0761086493730545, 0.06588534265756607, 0.02955368347465992, 0.04451437667012215, -0.25407442450523376, -0.1216680258512497, 0.1411704570055008, -0.016091283410787582, 0.1723860502243042, -0.10366170108318329, -0.03321028873324394, -0.022827893495559692, -0.05632883310317993, 0.19058552384376526, -0.16490545868873596, 0.0804903581738472, 0.061948880553245544, 0.015730546787381172, 0.039294756948947906, -0.008300891146063805, 0.1557687371969223, 0.09793239086866379, 0.13188105821609497, -0.05598462000489235, -0.025697175413370132, 0.10023325681686401, -0.005248357076197863, 0.1555255502462387, 0.11324499547481537, 0.060621507465839386, -0.05080146715044975, -0.07791533321142197, -0.13804106414318085, 0.03504744544625282, -0.05416733771562576, -0.06904050707817078, 
-0.01489576231688261, 0.07770010828971863, 0.026498517021536827, -0.004926373716443777, 0.03274527192115784, -0.14231309294700623, -0.008208521641790867, 0.17229971289634705, 0.11270198225975037, 0.058169979602098465, -0.0055862851440906525, 0.0198868028819561, -0.03381798788905144, 0.0567375086247921, -0.020925013348460197, 0.033420488238334656, 0.0711623951792717, -0.005757029168307781, 0.09575533866882324, -0.010255791246891022, -0.17576651275157928, 0.023389993235468864, 0.023268837481737137, -0.131449356675148, -0.06340108066797256, -0.021386750042438507, -0.007825789041817188, -0.06660272181034088, -0.06312986463308334, 0.20065288245677948, -0.0044642225839197636, -0.05240244045853615, 0.008672141470015049, 0.029193436726927757, -0.04841006547212601, 0.06525680422782898, -0.010630344972014427, 0.007882124744355679, -0.10580061376094818, 0.1760319173336029, 0.02553587220609188, -0.05181608721613884, 0.008198314346373081, 0.07868120074272156, -0.08868914842605591, -0.04071967303752899, -0.15931858122348785, 0.10655238479375839, -0.08882536739110947, -0.008206065744161606, -0.011990071274340153, -0.11223196983337402, -0.010959911160171032, 0.07173725962638855, 0.026673782616853714, 0.08146891742944717, -0.05642084777355194, -0.016486188396811485, -0.012778443284332752, 0.05126114934682846, 0.12327331304550171, -0.029119230806827545, -0.03044511377811432, 0.07605527341365814, -0.005884020123630762, 0.019612662494182587, -0.026177382096648216, -0.012956240214407444, -0.06439493596553802, -0.0010004200739786029, -0.22966542840003967, -0.004514587111771107, -0.10167194157838821, -0.00911364983767271, -0.022991785779595375, -0.0034276198130100965, 0.027552997693419456, -0.036491166800260544, -0.06044348329305649, -0.0301971435546875, -0.012842615135014057, 0.03911396116018295, -0.0755600556731224, -0.008749174885451794, 0.027615917846560478, -0.04398936778306961, 0.09650745987892151, 0.06364157795906067, -0.06197066232562065, 0.07029811292886734, -0.011317060329020023, 0.08040323108434677, 0.008085538633167744, -0.0184419397264719, -0.019377274438738823, -0.10658270120620728, -0.015929972752928734, -0.03191874548792839, 0.034061968326568604, -0.01778293587267399, -0.005512434057891369, -0.11321637779474258, 0.013486357405781746, 0.020790817216038704, 0.005328824277967215, -0.0031858510337769985, -0.02980807237327099, 0.08310326933860779, 0.039446886628866196, 0.07467625290155411, -0.00036414156784303486, 0.025468967854976654, -0.1299503594636917, 0.0253541711717844, -0.01202769111841917, 0.0035199886187911034, 0.025757718831300735, -0.033758822828531265, 0.022683583199977875, 0.0036403192207217216, 0.1980578601360321, -0.00881732814013958, -0.03523293510079384, 0.015274659730494022, 0.04642961174249649, -0.11513633280992508, 0.012041911482810974, -0.041299473494291306, -0.003418706590309739, -0.019088763743638992, -0.08027149736881256, -0.0015289201401174068, -0.009941484779119492, -0.014179153367877007, 0.12432724237442017, 0.14157770574092865, 0.24666064977645874, 0.09799004346132278, 0.04519921913743019, -0.0407588817179203, 0.07712595164775848, 0.11501161009073257, 0.05830001085996628, -0.012056916020810604, -0.044412821531295776, -0.10242389887571335, 0.12155750393867493, -0.09376609325408936, 0.12271924316883087, 0.07835548371076584, -0.03918465971946716, -0.06654635071754456, -0.2004159837961197, -0.0182833019644022, 0.0881485790014267, -0.045849572867155075, -0.16522431373596191, 0.02197623997926712, 0.07027696818113327, 0.02190719172358513, -0.04497535526752472, 0.13091549277305603, 
-0.06304370611906052, -0.18603681027889252, 0.10150253772735596, 0.027869854122400284, 0.04491173103451729, 0.06849202513694763, 0.030740173533558846, 0.02803978882730007, 0.12900547683238983, -0.00624306034296751, 0.08685442805290222, 0.04158874973654747, 0.01615248993039131, -0.10229993611574173, -0.051288776099681854, -0.04941748455166817, 0.026086021214723587, 0.04705768823623657, 0.25909140706062317, 0.04043136537075043, -0.09577177464962006, -0.00007177139195846394, 0.21876411139965057, -0.0518915168941021, -0.06589772552251816, -0.1427081674337387, 0.18012407422065735, 0.0024427182506769896, 0.05231114849448204, 0.01340787298977375, -0.06281948834657669, 0.07318902760744095, 0.13650955259799957, 0.26990437507629395, -0.08442666381597519, -0.008764160796999931, 0.015280213207006454, 0.0021343042608350515, 0.008206901140511036, -0.003391415113583207, 0.03641648590564728, 0.30193355679512024, -0.07515647262334824, 0.14592710137367249, -0.021937649697065353, 0.03050183318555355, 0.005023130681365728, 0.08382242918014526, 0.01304428931325674, 0.004277517087757587, -0.017366977408528328, 0.05420726165175438, -0.1650681495666504, -0.21229417622089386, -0.06299903243780136, -0.026791108772158623, -0.09728683531284332, -0.007138007320463657, -0.16949701309204102, 0.03647516295313835, 0.09644326567649841, 0.015865372493863106, -0.015076260082423687, 0.16867341101169586, -0.027684789150953293, -0.05477143079042435, -0.11991064250469208, 0.059965673834085464, 0.01246810145676136, 0.07259120047092438, 0.014226640574634075, 0.019229933619499207, 0.05318518355488777, -0.0225041713565588, -0.09430167078971863, 0.09449662268161774, -0.05740267038345337, 0.04982801154255867, -0.0022930221166461706, 0.08515104651451111, 0.04612421244382858, 0.11066886782646179, 0.08222059160470963, -0.021118534728884697, 0.06545740365982056, 0.027435271069407463, -0.07484357059001923, -0.09200476109981537, 0.08081068098545074, -0.10181526094675064, 0.14181764423847198, 0.1421928107738495, -0.01422150619328022, 0.021907439455389977, -0.01498581562191248, 0.0467553474009037, -0.00669229868799448, 0.04432963579893112, -0.0022010852117091417, -0.11406956613063812, 0.07038022577762604, -0.017212169244885445, 0.011048243381083012, -0.29444217681884766, -0.02230430208146572, 0.07598325610160828, -0.04019616171717644, 0.02504538744688034, 0.05869213119149208, 0.00852176547050476, 0.04566633701324463, -0.0009818875696510077, -0.03285597264766693, 0.031383857131004333, 0.11315212398767471, -0.08180610090494156, -0.10049186646938324 ]
null
null
transformers
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization csharp dataset.

## Intended uses & limitations

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp", skip_special_tokens=True),
    device=0
)

tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTime ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/csharp/small_model.ipynb).
## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

|   Language / Model   |     Python     |       SQL      |       C#       |
| -------------------- | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |      8.45      |      17.55     |      19.74     |
|   CodeTrans-ST-Base     |      9.12      |      15.00     |      18.65     |
|   CodeTrans-TF-Small    |      10.06     |      17.71     |      20.40     |
|   CodeTrans-TF-Base     |      10.94     |      17.66     |      21.12     |
|   CodeTrans-TF-Large    |      12.41     |      18.40     |      21.43     |
|   CodeTrans-MT-Small    |      13.11     |      19.15     |      22.39     |
|   CodeTrans-MT-Base     |    **13.37**   |      19.24     |      23.20     |
|   CodeTrans-MT-Large    |      13.24     |      19.40     |    **23.57**   |
|   CodeTrans-MT-TF-Small |      12.10     |      18.25     |      22.03     |
|   CodeTrans-MT-TF-Base  |      10.64     |      16.91     |      21.40     |
|   CodeTrans-MT-TF-Large |      12.14     |    **19.98**   |      21.10     |
|   CODE-NN   |       --       |      18.40     |      20.50     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
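The card stresses that inputs work best when tokenized, but the exact tokenizer that produced examples like the one above is not specified here. The sketch below is one plausible, assumed pre-tokenization step (spacing out punctuation to match the example's style), not the CodeTrans preprocessing pipeline.

```python
import re

# Hypothetical pre-tokenizer: space out punctuation so raw C# roughly matches
# the whitespace-separated style of the example input above. Illustrative only.
def rough_tokenize_csharp(code: str) -> str:
    spaced = re.sub(r"([(){}\[\];,.<>=+\-*/])", r" \1 ", code)
    return " ".join(spaced.split())

raw = "public static int Add(int a, int b) { return a + b; }"
print(rough_tokenize_csharp(raw))
# public static int Add ( int a , int b ) { return a + b ; }
```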
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_csharp
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization csharp
====================================================

Pretrained model on programming language csharp using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization csharp dataset.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 116 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.13123203814029694, -0.008956597186625004, 0.00020162582222837955, 0.037732165306806564, 0.13945803046226501, 0.018874723464250565, 0.10457854717969894, 0.0261867456138134, -0.022193020209670067, -0.020739253610372543, 0.09670757502317429, 0.10125600546598434, 0.028636222705245018, 0.16784468293190002, -0.02935093827545643, -0.17355330288410187, 0.008417794480919838, 0.06609396636486053, -0.1789551079273224, 0.13523977994918823, 0.11265981942415237, -0.06236467510461807, 0.09623076766729355, -0.003968840464949608, -0.23009300231933594, 0.04039261117577553, 0.0031363123562186956, -0.08276820927858353, 0.1303713172674179, 0.09641513973474503, 0.14144468307495117, 0.05152898281812668, 0.021603241562843323, -0.20230989158153534, 0.04325537011027336, -0.047463346272706985, -0.01463493425399065, 0.060126710683107376, 0.014784903265535831, -0.07332537323236465, 0.17133700847625732, -0.008990704081952572, 0.024205265566706657, 0.06082039698958397, -0.11177872866392136, -0.04121687635779381, -0.02060530334711075, 0.04331837594509125, 0.08951739221811295, 0.07351015508174896, 0.017166588455438614, 0.11220233142375946, -0.15452758967876434, 0.10990844666957855, 0.121117003262043, -0.19233538210391998, -0.018852418288588524, 0.12731221318244934, 0.07105888426303864, -0.06293974071741104, -0.046443574130535126, 0.020158056169748306, 0.07198718935251236, -0.012033877894282341, 0.033333729952573776, -0.12511542439460754, -0.15923812985420227, 0.06512438505887985, -0.08091646432876587, -0.06917539983987808, 0.28465816378593445, -0.0016199630917981267, -0.04149272292852402, -0.045294057577848434, -0.0465354360640049, -0.004027132876217365, -0.01719321869313717, 0.01745600253343582, -0.004193696193397045, -0.01550307311117649, -0.05809586122632027, 0.00010702516738092527, -0.11291486769914627, -0.13162855803966522, -0.002948643174022436, 0.10380139201879501, -0.013520038686692715, 0.03077540546655655, -0.12845033407211304, 0.10414222627878189, 0.08420919626951218, -0.06221863627433777, 0.014226243831217289, -0.0664677694439888, -0.07944709807634354, -0.023278208449482918, -0.06963761150836945, -0.1507025957107544, 0.07618466019630432, 0.12372492253780365, -0.024889793246984482, 0.052999675273895264, 0.054256416857242584, 0.07178044319152832, 0.043758612126111984, 0.21571312844753265, -0.004146764520555735, -0.045659247785806656, 0.050198864191770554, -0.017638027667999268, -0.041040703654289246, -0.009566240012645721, -0.08823888748884201, -0.045232344418764114, 0.0036057166289538145, 0.1271134614944458, -0.12038273364305496, 0.08089222759008408, -0.08244302123785019, -0.03654423728585243, 0.04542718455195427, -0.1214352548122406, -0.03076154924929142, 0.036837443709373474, -0.04019305109977722, -0.01478677336126566, 0.12265938520431519, -0.06742089986801147, -0.07732218503952026, 0.003918325994163752, -0.08457840979099274, -0.008914501406252384, -0.08853184431791306, -0.0719878301024437, 0.010636093094944954, 0.04106469824910164, 0.0518663115799427, -0.13056527078151703, -0.1307675987482071, 0.004022459499537945, 0.06043970584869385, 0.0278252512216568, 0.041521064937114716, -0.08630545437335968, -0.020289793610572815, -0.02257663942873478, -0.00799438077956438, 0.0032045424450188875, -0.07488317042589188, 0.07562487572431564, 0.05860411375761032, 0.04752984270453453, -0.091040700674057, 0.03855659067630768, -0.11415112763643265, 0.07316187769174576, -0.18699833750724792, 0.08504616469144821, -0.06132514774799347, 0.11700348556041718, -0.09698625653982162, -0.052647292613983154, 0.034735988825559616, 
0.048883333802223206, 0.05019489675760269, 0.14076697826385498, -0.1252911537885666, -0.06724702566862106, 0.1455478072166443, -0.12401778250932693, -0.20878256857395172, 0.06365574151277542, -0.06767822802066803, 0.14891469478607178, 0.05922044441103935, 0.14538449048995972, 0.10685122013092041, -0.07666298002004623, 0.03149011731147766, 0.08308225870132446, -0.04327763244509697, -0.06364164501428604, 0.07858297973871231, 0.05627429485321045, -0.13091415166854858, 0.04518222063779831, -0.02610265277326107, 0.11235949397087097, -0.04516289010643959, -0.0535501129925251, -0.00836742389947176, -0.07254233211278915, 0.041959356516599655, -0.012740490958094597, 0.08010826259851456, 0.006673901807516813, -0.009549591690301895, 0.0441821925342083, 0.10775227099657059, -0.1286570131778717, 0.0033758420031517744, -0.10995881259441376, 0.07639767974615097, -0.10026267915964127, 0.026968643069267273, -0.19485126435756683, -0.0636608824133873, 0.008580470457673073, 0.03689424693584442, 0.06493890285491943, 0.002264447743073106, -0.005637839436531067, 0.015278877690434456, -0.004199759569019079, 0.001299548428505659, -0.005445033311843872, -0.016563190147280693, -0.038507673889398575, -0.10585558414459229, -0.06163530424237251, -0.03834415227174759, 0.08589842915534973, -0.1831098049879074, 0.0022936961613595486, 0.08342964947223663, 0.06822273135185242, 0.007395375054329634, 0.029701806604862213, 0.03331851586699486, 0.060742560774087906, -0.05073525011539459, -0.015284999273717403, 0.05637101083993912, 0.01706780306994915, -0.16559089720249176, 0.06411978602409363, -0.08366049081087112, 0.02968372404575348, 0.12438499182462692, -0.1429111361503601, -0.08760049194097519, -0.07992863655090332, -0.035817455500364304, -0.02691299095749855, 0.017198186367750168, -0.0385514572262764, 0.19451797008514404, 0.00035416774335317314, 0.1716557741165161, -0.11050571501255035, -0.03452304005622864, -0.037205904722213745, 0.0013118014903739095, 0.038696832954883575, 0.12340717762708664, 0.10736952722072601, -0.19430525600910187, 0.05343209207057953, 0.10319674015045166, -0.010857881046831608, 0.17761708796024323, -0.05718233063817024, -0.06565245985984802, -0.012484605424106121, 0.062444355338811874, -0.015765199437737465, 0.1764148324728012, -0.15241515636444092, -0.04032861813902855, 0.018015189096331596, -0.039236776530742645, 0.09135834127664566, -0.1328551471233368, -0.0012394689256325364, 0.02591458521783352, -0.036575574427843094, -0.14946697652339935, 0.045893143862485886, 0.00453641451895237, 0.030141733586788177, 0.0003600831551011652, -0.022327378392219543, 0.04468446969985962, -0.03178735077381134, -0.12557096779346466, 0.2417794018983841, -0.07763394713401794, -0.24457862973213196, -0.18827946484088898, 0.09298067539930344, -0.011194185353815556, 0.010438009165227413, 0.03458482027053833, -0.05003215745091438, -0.05416709929704666, -0.032220859080553055, 0.14316105842590332, -0.02439987286925316, -0.022071275860071182, -0.02912469021975994, 0.051619309931993484, -0.016815420240163803, -0.17592032253742218, -0.0009147602249868214, -0.0032090824097394943, 0.02165476232767105, 0.01869140937924385, -0.16299943625926971, 0.11902669817209244, 0.13587261736392975, -0.050365373492240906, 0.032253433018922806, -0.050930969417095184, 0.2244417518377304, -0.07741253823041916, -0.08181952685117722, 0.11455787718296051, -0.12301001697778702, 0.004729738458991051, 0.03171410784125328, 0.012896325439214706, -0.11343178153038025, 0.03762403130531311, -0.05487211048603058, -0.0793062150478363, -0.2513212263584137, 
-0.10861719399690628, -0.08298659324645996, 0.08492682129144669, 0.029914436861872673, 0.024953268468379974, -0.04652306064963341, 0.0803573951125145, 0.05229179933667183, 0.11668484658002853, -0.005035310052335262, 0.07002533972263336, 0.06261711567640305, 0.015954548493027687, 0.013550273142755032, -0.11150378733873367, -0.048948053270578384, 0.03948980197310448, 0.06576565653085709, 0.1961737424135208, 0.012177766300737858, 0.20979191362857819, 0.05779226869344711, 0.03837276250123978, 0.06351422518491745, 0.1725957840681076, -0.07381229102611542, 0.0018642153590917587, -0.002225287724286318, -0.031080706045031548, -0.1436709761619568, 0.04682239517569542, 0.00037397030973806977, 0.03827211260795593, -0.13782820105552673, -0.05525169149041176, 0.06587815284729004, 0.13412578403949738, -0.011927900835871696, -0.2633565366268158, -0.12768355011940002, 0.017533788457512856, -0.05598778277635574, -0.042832836508750916, 0.051647480577230453, 0.11993594467639923, -0.13262319564819336, -0.0022232704795897007, -0.03861941397190094, 0.15743213891983032, -0.07157079875469208, 0.009694796986877918, -0.05949211120605469, -0.0576402023434639, 0.005766133312135935, 0.13331928849220276, -0.17785578966140747, 0.22520506381988525, 0.015352297574281693, 0.0053343623876571655, -0.05876089632511139, 0.016627099364995956, 0.007904344238340855, 0.10965391993522644, 0.09921040385961533, -0.02337172068655491, -0.061257608234882355, -0.18262311816215515, 0.008963484317064285, 0.0798078179359436, 0.08689610660076141, -0.05400550365447998, 0.060697197914123535, -0.022633284330368042, 0.013415521942079067, -0.003459843108430505, -0.07995136082172394, -0.08075843006372452, -0.1123850867152214, -0.0013790351804345846, -0.032469287514686584, 0.08457957953214645, -0.03659914433956146, 0.0030011851340532303, 0.043397143483161926, 0.1715821921825409, -0.07351231575012207, -0.06292938441038132, -0.11383823305368423, 0.016762780025601387, 0.12365713715553284, -0.08161772787570953, 0.049303922802209854, -0.0022671690676361322, 0.04244229570031166, 0.0015013479860499501, -0.1427748054265976, 0.07262444496154785, -0.07005642354488373, -0.04064960032701492, -0.027790963649749756, 0.1432553231716156, -0.011645873077213764, 0.011579630896449089, 0.05319203436374664, -0.04870441555976868, -0.04383443295955658, -0.14384087920188904, -0.1161133423447609, -0.04881985858082771, 0.033696502447128296, 0.08961587399244308, -0.11715870350599289, 0.029529159888625145, -0.0052720955573022366, -0.008179075084626675, 0.22145691514015198, 0.12941068410873413, -0.04822239652276039, 0.032253071665763855, 0.11485449969768524, -0.07899985462427139, -0.2672637403011322, 0.00554982665926218, -0.027966562658548355, 0.08542685955762863, 0.004609330091625452, -0.14212585985660553, 0.10311803221702576, -0.019678859040141106, 0.03752434626221657, 0.0248886626213789, -0.27122822403907776, -0.11012888699769974, 0.11492665112018585, 0.11522994935512543, 0.08460487425327301, -0.12322203069925308, -0.05659342557191849, -0.09564956277608871, -0.19379974901676178, 0.16790954768657684, -0.11451821029186249, 0.10256178677082062, -0.0052902293391525745, 0.04399357736110687, 0.018239352852106094, -0.0466342531144619, 0.12278670817613602, 0.04201126843690872, 0.08341233432292938, -0.008543005213141441, -0.10032416135072708, 0.17040584981441498, -0.01867067441344261, 0.11220073699951172, -0.08824165910482407, 0.08504272997379303, -0.17839470505714417, -0.035222869366407394, -0.021953612565994263, 0.045664578676223755, -0.017060687765479088, -0.0458071306347847, 
-0.0899902731180191, -0.004116120282560587, 0.02494785375893116, 0.015931516885757446, 0.11334220319986343, -0.048291075974702835, 0.008058705367147923, 0.075996033847332, 0.16127347946166992, -0.03099261224269867, -0.059409335255622864, 0.047708481550216675, 0.007134800776839256, 0.11628848314285278, -0.22146719694137573, 0.0920521542429924, 0.1318332701921463, 0.03741365671157837, 0.10753737390041351, 0.09304569661617279, -0.023421760648489, 0.04363929107785225, 0.08725660294294357, -0.1357865333557129, -0.05474546551704407, -0.04738108441233635, -0.0857178196310997, -0.006271661259233952, 0.08465132862329483, 0.14221243560314178, -0.046860672533512115, -0.008055498823523521, 0.0024764672853052616, -0.05325205996632576, -0.12805134057998657, 0.1257031410932541, 0.04701004549860954, 0.06731367856264114, -0.10047090798616409, 0.06846020370721817, 0.04896087571978569, -0.1326725035905838, -0.04175238683819771, 0.08562297374010086, -0.14532500505447388, -0.07317754626274109, -0.03527069091796875, 0.22880540788173676, -0.13740025460720062, -0.08184788376092911, -0.14144861698150635, -0.08186229318380356, -0.003932823892682791, 0.2584522068500519, 0.10829239338636398, 0.10322859138250351, -0.03385571390390396, 0.0019451250554993749, -0.09132213145494461, 0.042857758700847626, 0.07591713219881058, 0.023432858288288116, -0.07887592166662216, 0.09357251226902008, 0.00319359521381557, 0.1446934938430786, -0.06400734186172485, -0.03318054601550102, -0.1857101321220398, 0.07843168824911118, -0.12199167162179947, 0.0625651478767395, -0.06233898922801018, 0.01578798145055771, 0.015040429309010506, 0.00789504125714302, -0.03267901763319969, 0.04242558777332306, -0.09054365754127502, 0.023289857432246208, 0.00795282144099474, 0.0683976262807846, -0.08418512344360352, 0.005268055479973555, 0.08498978614807129, -0.06469041854143143, 0.11222489178180695, 0.045508116483688354, -0.05197535455226898, 0.11751148849725723, -0.21254323422908783, -0.021524062380194664, 0.04473503306508064, 0.025797277688980103, 0.03429370000958443, -0.02742066979408264, 0.037714213132858276, 0.02402804233133793, 0.03409062698483467, 0.002848838921636343, 0.09705730527639389, -0.10640370845794678, -0.10301754623651505, -0.05190474912524223, -0.09168237447738647, -0.040336158126592636, 0.012187041342258453, 0.03838389739394188, 0.09586066752672195, 0.08777391910552979, -0.02548164874315262, 0.030919259414076805, -0.06417835503816605, -0.02023305743932724, 0.033302366733551025, -0.06298302114009857, -0.09378224611282349, -0.10228195786476135, 0.03519894555211067, -0.06515844166278839, 0.22104088962078094, -0.05954306200146675, 0.14334720373153687, -0.0032816773746162653, 0.021011536940932274, 0.06729423999786377, 0.06942413002252579, 0.25972533226013184, 0.003210689639672637, 0.04785655066370964, -0.05112087354063988, 0.0737200677394867, 0.032204076647758484, 0.040531326085329056, 0.10899107158184052, 0.07629990577697754, -0.04868442565202713, 0.1256798654794693, 0.013728396035730839, 0.036458540707826614, -0.01961296610534191, -0.08703368902206421, 0.04004120081663132, 0.03366030752658844, -0.03384467959403992, 0.13189849257469177, 0.11843791604042053, -0.09360231459140778, 0.07891932129859924, 0.014241448603570461, -0.10595059394836426, -0.0477767214179039, 0.011364496313035488, -0.05199091136455536, -0.14163370430469513, 0.017065810039639473, -0.10497719049453735, -0.07921837270259857, 0.09231778979301453, 0.030452562496066093, -0.03864917531609535, 0.2344799041748047, -0.015705531463027, -0.057589076459407806, 
0.055594127625226974, -0.009082184173166752, 0.013506078161299229, -0.00326861091889441, 0.05332007631659508, 0.00806565210223198, -0.0376075878739357, 0.006016661413013935, 0.035378217697143555, -0.02274213172495365, 0.01673859916627407, -0.05813964456319809, -0.033100858330726624, -0.040699802339076996, 0.051609866321086884, -0.005623572506010532, 0.01865442469716072, 0.01872788369655609, -0.01674646884202957, -0.014885497279465199, 0.17298458516597748, -0.04632297903299332, -0.07044569402933121, -0.15173625946044922, 0.16863836348056793, 0.021283699199557304, 0.05570198595523834, 0.005619633477181196, -0.06654762476682663, -0.024794938042759895, 0.2670011818408966, 0.19833798706531525, -0.07275871187448502, 0.01861605793237686, 0.0016477067256346345, 0.02291586995124817, 0.0051046451553702354, 0.13921625912189484, 0.015283497981727123, 0.2089303731918335, -0.027987966313958168, -0.10493713617324829, -0.05486796796321869, -0.056696806102991104, 0.025577057152986526, 0.11138981580734253, 0.01211540400981903, -0.05688358098268509, -0.04853428900241852, 0.08696472644805908, -0.16312535107135773, -0.11187069863080978, 0.04942057654261589, -0.15237054228782654, -0.06189565360546112, -0.06392054259777069, 0.0063404967077076435, -0.012452790513634682, 0.035046182572841644, -0.04529409483075142, -0.028907887637615204, 0.07406366616487503, 0.026327837258577347, -0.16229596734046936, -0.08163175731897354, 0.05851346254348755, -0.08085206151008606, 0.1312771737575531, -0.028510650619864464, 0.1402822881937027, 0.08955686539411545, 0.06442246586084366, -0.006923523265868425, 0.022352125495672226, 0.08128198236227036, -0.017734695225954056, 0.05185769498348236, 0.06326262652873993, -0.026739347726106644, 0.15325099229812622, -0.042331263422966, -0.11646414548158646, 0.05835287272930145, -0.025329411029815674, -0.006933559197932482, -0.11488287150859833, -0.04259075969457626, -0.09701775014400482, 0.10439196228981018, 0.15575550496578217, -0.045356858521699905, 0.003151016077026725, -0.05199483036994934, 0.10708107054233551, 0.02051066979765892, -0.013226657174527645, -0.07017267495393753, -0.153564915060997, -0.012761110439896584, 0.010145648382604122, -0.01851622387766838, -0.19314135611057281, -0.012671343982219696, -0.0687597468495369, 0.0052902125753462315, -0.028680037707090378, 0.12796460092067719, 0.10686971247196198, 0.03130332753062248, -0.022534204646945, -0.15449628233909607, -0.010270664468407631, 0.06629201769828796, -0.1293625831604004, -0.1517159342765808 ]
null
null
transformers
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

## Intended uses & limitations

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask", skip_special_tokens=True),
    device=0
)

tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/csharp/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model      | Python    | SQL       | C#        |
| --------------------- | :-------: | :-------: | :-------: |
| CodeTrans-ST-Small    | 8.45      | 17.55     | 19.74     |
| CodeTrans-ST-Base     | 9.12      | 15.00     | 18.65     |
| CodeTrans-TF-Small    | 10.06     | 17.71     | 20.40     |
| CodeTrans-TF-Base     | 10.94     | 17.66     | 21.12     |
| CodeTrans-TF-Large    | 12.41     | 18.40     | 21.43     |
| CodeTrans-MT-Small    | 13.11     | 19.15     | 22.39     |
| CodeTrans-MT-Base     | **13.37** | 19.24     | 23.20     |
| CodeTrans-MT-Large    | 13.24     | 19.40     | **23.57** |
| CodeTrans-MT-TF-Small | 12.10     | 18.25     | 22.03     |
| CodeTrans-MT-TF-Base  | 10.64     | 16.91     | 21.40     |
| CodeTrans-MT-TF-Large | 12.14     | **19.98** | 21.10     |
| CODE-NN               | --        | 18.40     | 20.50     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
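The pretraining paragraph above names AdaFactor with an inverse square root learning rate schedule. A minimal sketch of that schedule in the T5 style follows; the 10,000-step warmup value is an assumption carried over from the original T5 setup and is not stated in this card:

```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1/sqrt(warmup_steps) during warmup, then decays as 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))

for step in (1, 10_000, 100_000, 300_000):
    print(step, round(inverse_sqrt_lr(step), 6))
```

Pairing this decay with AdaFactor keeps the effective step size shrinking smoothly over the 300,000 pretraining steps.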
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization csharp
====================================================

Pretrained model on programming language csharp using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 62, 146 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.14716871082782745, 0.003122931346297264, -0.0005814392934553325, 0.12836116552352905, 0.11194460093975067, 0.012520133517682552, 0.06723432242870331, 0.047790125012397766, -0.04537753388285637, 0.0321488194167614, 0.04797714203596115, 0.015340760350227356, 0.049171749502420425, 0.20309801399707794, 0.012146653607487679, -0.16186153888702393, -0.004980219062417746, 0.0270236786454916, -0.09132011234760284, 0.1267899125814438, 0.09621420502662659, -0.07850413769483566, 0.04484795778989792, -0.04036704823374748, -0.21380454301834106, 0.04855429753661156, -0.003512176452204585, -0.07454591989517212, 0.1065671294927597, 0.06753619760274887, 0.13691073656082153, 0.003300242591649294, 0.036200426518917084, -0.12634894251823425, 0.010311024263501167, 0.023669365793466568, 0.03517436236143112, 0.02654523216187954, 0.06091590225696564, 0.026249611750245094, 0.15719449520111084, -0.0016717201797291636, 0.0419880710542202, 0.06402812153100967, -0.06438755989074707, -0.09951832890510559, -0.02750346250832081, 0.041555844247341156, 0.05885433778166771, 0.0944596529006958, -0.007895620539784431, 0.0976518988609314, -0.15154170989990234, 0.11411510407924652, 0.10053445398807526, -0.24994339048862457, -0.01048631127923727, 0.09380349516868591, 0.06390778720378876, 0.06484128534793854, -0.04292723163962364, -0.04727964475750923, 0.08511606603860855, 0.046884071081876755, 0.043051671236753464, -0.07605196535587311, -0.07922767847776413, 0.025579379871487617, -0.10203459858894348, -0.07470016181468964, 0.21304325759410858, 0.011419651098549366, -0.07343516498804092, -0.07383857667446136, -0.048197243362665176, -0.13287535309791565, 0.027674831449985504, 0.04353924095630646, 0.0005551658687181771, -0.022403595969080925, -0.003381431568413973, 0.0415714830160141, -0.09737831354141235, -0.13697288930416107, 0.016559649258852005, 0.08769514411687851, 0.05045682191848755, 0.03685056418180466, -0.06081671640276909, 0.11269373446702957, 0.03940483182668686, -0.04000554978847504, -0.015651805326342583, -0.03085212968289852, -0.12080220133066177, 0.03953735530376434, -0.06500927358865738, -0.18199865520000458, -0.008879574947059155, 0.02214742638170719, -0.032198723405599594, 0.05617298558354378, 0.05463317036628723, 0.026437751948833466, 0.02210298366844654, 0.207722470164299, 0.008779655210673809, -0.0956140011548996, 0.05258717015385628, 0.04572002962231636, -0.041117943823337555, -0.025998927652835846, -0.061971381306648254, -0.0730498731136322, 0.052587639540433884, 0.10028183460235596, -0.14048780500888824, 0.047419141978025436, -0.06810354441404343, -0.04209281504154205, 0.01961209625005722, -0.16594882309436798, -0.006811073515564203, 0.02902277186512947, -0.055671706795692444, -0.024282194674015045, 0.09173321723937988, -0.15254494547843933, -0.14435173571109772, -0.014335129410028458, -0.07956422120332718, -0.04912548139691353, -0.14805765450000763, -0.1330026239156723, -0.010362600907683372, -0.03952570632100105, -0.004639113321900368, -0.07435446232557297, -0.13120146095752716, -0.015699991956353188, 0.020085996016860008, 0.010850561782717705, 0.0015431599458679557, -0.07420773804187775, -0.010075308382511139, -0.009893174283206463, -0.0318441204726696, -0.0007532582385465503, -0.044668182730674744, 0.11000385880470276, 0.08872267603874207, 0.03586620092391968, -0.018173281103372574, 0.048691827803850174, -0.06962726265192032, 0.07103856652975082, -0.125516876578331, 0.1102699413895607, -0.07517730444669724, 0.07481922209262848, -0.037131570279598236, -0.10453386604785919, 0.041567113250494, 
0.0505707785487175, 0.058510832488536835, 0.05884352698922157, -0.14800050854682922, -0.03734142333269119, 0.1966003030538559, -0.12869109213352203, -0.10730305314064026, 0.1002393290400505, -0.04226243123412132, 0.02995540015399456, 0.08404231071472168, 0.1195535808801651, 0.1302480250597, -0.03851236402988434, -0.00017508064047433436, 0.059673208743333817, 0.04085549712181091, -0.11611641198396683, 0.0901469737291336, 0.03859173133969307, -0.08446300774812698, 0.05810809135437012, -0.017480632290244102, 0.09751252830028534, -0.016117578372359276, -0.03883803263306618, -0.04228900000452995, -0.07122371345758438, 0.0007642467389814556, -0.0012011281214654446, 0.06262979656457901, -0.06555107980966568, -0.06980511546134949, 0.05694900453090668, 0.1577279269695282, -0.13357053697109222, 0.0000276240571110975, -0.09038916975259781, 0.05348917841911316, -0.07263205945491791, 0.016969120129942894, -0.17027251422405243, 0.004350072704255581, 0.07116834074258804, -0.025011571124196053, 0.06199340522289276, 0.09476811438798904, 0.007447654381394386, 0.06395969539880753, 0.0013659870019182563, -0.00788034312427044, -0.09512533247470856, -0.05720749869942665, -0.06138337031006813, -0.05888856202363968, -0.10121850669384003, -0.04235474392771721, 0.024953525513410568, -0.17635387182235718, 0.011663946323096752, 0.03437187895178795, 0.030507683753967285, 0.0037055225111544132, -0.015590177848935127, 0.01859308034181595, 0.07190940529108047, -0.04975612834095955, -0.039676446467638016, 0.02681954950094223, 0.020035112276673317, -0.07935735583305359, -0.023480072617530823, -0.10522250831127167, -0.010173379443585873, 0.11451926827430725, 0.04491162300109863, -0.0808088630437851, 0.004546315409243107, -0.023977573961019516, -0.03853596746921539, 0.022991245612502098, -0.06508468836545944, 0.17106659710407257, 0.004938832484185696, 0.20396451652050018, -0.15832479298114777, -0.03130552917718887, -0.021574296057224274, 0.03249094635248184, 0.04392940551042557, 0.12372086197137833, 0.028420785441994667, -0.1081518828868866, 0.054265014827251434, 0.012517854571342468, -0.055921684950590134, 0.2143661230802536, -0.054372284561395645, -0.10341986268758774, 0.020120572298765182, 0.1019987240433693, -0.019013049080967903, 0.14985665678977966, -0.1474277228116989, -0.024679312482476234, 0.01250211801379919, 0.0007782615139149129, 0.06836956739425659, -0.1422612965106964, 0.009613743983209133, 0.027158468961715698, -0.06753706187009811, -0.10701250284910202, -0.016544468700885773, -0.004996671341359615, 0.03674980625510216, 0.003827051492407918, -0.01937153749167919, 0.01895090937614441, -0.04250609502196312, -0.10087109357118607, 0.22500309348106384, -0.11015121638774872, -0.23762427270412445, -0.2114255279302597, 0.12401429563760757, -0.04793047532439232, 0.0032088023144751787, 0.01365574635565281, -0.08799514919519424, -0.06641551852226257, -0.06502949446439743, 0.17233127355575562, -0.07295048236846924, -0.011478578671813011, -0.010317104868590832, 0.05693233758211136, 0.014551502652466297, -0.20804259181022644, 0.03306104242801666, -0.014395325444638729, -0.022608233615756035, -0.006157143972814083, -0.09406162798404694, 0.08718068152666092, 0.1740209013223648, -0.06150940805673599, 0.021636715158820152, -0.009121131151914597, 0.1943322867155075, -0.036791689693927765, -0.06317390501499176, 0.12465284764766693, -0.011622593738138676, 0.01381502766162157, 0.020962027832865715, 0.002577922772616148, -0.08906670659780502, 0.04686150699853897, -0.001555564347654581, -0.03353507071733475, -0.2690137028694153, 
-0.03126714751124382, -0.08329907059669495, 0.04433668032288551, 0.03652626648545265, 0.045807160437107086, -0.07549073547124863, 0.030391797423362732, 0.0306076742708683, 0.12374389916658401, -0.007008042652159929, 0.0387861430644989, 0.06677431613206863, 0.00009734181367093697, 0.017778702080249786, -0.09943369776010513, -0.00733463279902935, 0.07690666615962982, 0.08742862194776535, 0.26938578486442566, -0.09798069298267365, 0.2229558676481247, 0.03866694122552872, 0.06788904219865799, 0.05584597960114479, 0.1613408923149109, -0.1030058041214943, 0.024875856935977936, 0.002147435210645199, -0.018720274791121483, -0.1253475397825241, 0.027377832680940628, -0.023093707859516144, 0.06282217055559158, -0.12044815719127655, -0.04147326573729515, 0.009164649061858654, 0.19014708697795868, 0.029317911714315414, -0.22811877727508545, -0.11986798793077469, 0.012022103182971478, -0.09625109285116196, -0.09185779839754105, 0.06125401705503464, 0.22350606322288513, -0.08201511204242706, -0.018506774678826332, -0.01800304651260376, 0.12593193352222443, -0.04965010657906532, -0.03772783279418945, -0.03812750428915024, 0.06208961084485054, 0.019925830885767937, 0.12629495561122894, -0.21738511323928833, 0.14474113285541534, -0.00508866598829627, 0.058287084102630615, -0.04207778349518776, 0.06498081237077713, -0.03585081547498703, 0.05277850106358528, 0.045281603932380676, -0.011736128479242325, -0.021837254986166954, -0.19386453926563263, -0.01263787504285574, 0.03038889355957508, 0.05656089633703232, 0.02766626887023449, 0.06931228190660477, -0.0044850450940430164, 0.029566138982772827, -0.003545992076396942, -0.12515799701213837, -0.06060884892940521, -0.1173449233174324, -0.004913325887173414, -0.03810300678014755, -0.015242185443639755, -0.06222565844655037, -0.023449676111340523, 0.06155649945139885, 0.18237099051475525, -0.1018989235162735, -0.09040825068950653, -0.08736763894557953, 0.03594031184911728, 0.14538344740867615, -0.07629784196615219, 0.06909093260765076, -0.005159282591193914, 0.048766233026981354, -0.0001845879014581442, -0.09819363802671432, 0.05679216608405113, -0.034463074058294296, -0.07937245815992355, -0.02191736362874508, 0.12012902647256851, 0.001560809905640781, 0.023273056373000145, -0.004613361321389675, -0.06926855444908142, -0.0329403355717659, -0.12588326632976532, -0.1256924420595169, -0.022020651027560234, 0.011321005411446095, 0.06609195470809937, -0.12317057698965073, -0.028676919639110565, -0.004994292743504047, -0.023945508524775505, 0.13942621648311615, 0.15436303615570068, -0.07221544533967972, 0.04540126770734787, 0.12017407268285751, -0.045497242361307144, -0.18571707606315613, 0.0026554467622190714, 0.05893915891647339, 0.1151668056845665, -0.04600290209054947, -0.17487755417823792, 0.054416246712207794, 0.019588693976402283, 0.038663964718580246, 0.03848033770918846, -0.3326947093009949, -0.12383601069450378, 0.058731868863105774, 0.12744377553462982, 0.053115878254175186, -0.10198508948087692, -0.04375750944018364, -0.0600273460149765, -0.12333578616380692, 0.10672388970851898, -0.004092438146471977, 0.13696865737438202, -0.03938312828540802, 0.030124027281999588, 0.02768363617360592, -0.04851287230849266, 0.0779346451163292, 0.028797756880521774, 0.09315802156925201, -0.02768486738204956, 0.026151379570364952, 0.14226306974887848, -0.0255197174847126, 0.1598401665687561, -0.1340373307466507, 0.10356750339269638, -0.20007330179214478, -0.07462106645107269, -0.06942655146121979, 0.01304218452423811, -0.03623248636722565, -0.031633708626031876, 
-0.0791935920715332, 0.020449474453926086, 0.001954158768057823, -0.01309376209974289, 0.020093198865652084, -0.03445376083254814, -0.019991043955087662, 0.08691902458667755, 0.10945359617471695, -0.019399231299757957, -0.08910683542490005, 0.0353495292365551, 0.04300400987267494, 0.09667084366083145, -0.19559884071350098, 0.027932072058320045, 0.12106137722730637, 0.018849128857254982, 0.11577785015106201, 0.055095285177230835, -0.1044473648071289, 0.037634506821632385, 0.0920645147562027, -0.08048012852668762, -0.08265174180269241, -0.02204325795173645, -0.06964504718780518, -0.06531628966331482, 0.06061752885580063, 0.08972586691379547, -0.041255418211221695, -0.021141860634088516, -0.025751836597919464, -0.03751017525792122, -0.10239648073911667, 0.1997833400964737, 0.05912390723824501, 0.08380265533924103, -0.07939936220645905, 0.07100308686494827, 0.07839352637529373, -0.10180610418319702, -0.000300737185170874, 0.18397119641304016, -0.11246732622385025, -0.04115859046578407, 0.022659024223685265, 0.1475159376859665, -0.04360967129468918, -0.05160887539386749, -0.1242641732096672, -0.08136255294084549, 0.03850838541984558, 0.17104066908359528, 0.08271323144435883, 0.11537808924913406, -0.050648607313632965, -0.001849212683737278, -0.08429031074047089, 0.070891834795475, 0.0746040791273117, 0.03237529098987579, -0.11333519220352173, 0.15251906216144562, 0.0421135388314724, 0.10987678170204163, -0.031393181532621384, -0.015347521752119064, -0.10114812105894089, 0.05326468497514725, -0.08766523748636246, 0.03074061870574951, -0.01930437609553337, 0.045614324510097504, -0.02260413020849228, 0.005983496550470591, -0.023865433409810066, 0.06789834052324295, -0.0870591327548027, -0.00604479992762208, -0.011042828671634197, 0.03712403029203415, -0.055387482047080994, -0.005452956538647413, 0.03703559562563896, -0.09250721335411072, 0.12672406435012817, -0.008967279456555843, -0.031841978430747986, 0.08532166481018066, -0.05439790338277817, 0.0398629866540432, 0.010596547275781631, 0.05535946413874626, 0.006048357579857111, 0.042030155658721924, 0.0823880210518837, 0.03295397013425827, 0.04269585385918617, 0.010182319208979607, 0.08614300936460495, -0.13638333976268768, -0.11478070914745331, -0.04442660138010979, -0.09623423963785172, -0.06205097958445549, 0.08808418363332748, 0.07168539613485336, 0.09448152035474777, 0.08918734639883041, -0.027921484783291817, 0.01654835045337677, -0.13350379467010498, -0.05836864560842514, 0.022969959303736687, -0.030765598639845848, -0.0903957262635231, -0.05893262103199959, 0.050100166350603104, -0.029087549075484276, 0.1475364714860916, -0.021098744124174118, 0.053440600633621216, -0.025178777053952217, -0.033949654549360275, 0.04815605282783508, 0.04040152207016945, 0.23564451932907104, -0.052312564104795456, 0.03442522510886192, 0.0021883088629692793, 0.007446111645549536, 0.0022400771267712116, 0.11621735990047455, 0.11295884847640991, 0.13383856415748596, -0.017970599234104156, 0.10569550096988678, 0.023324960842728615, -0.009342937730252743, -0.08707447350025177, 0.004705715924501419, -0.003727180417627096, 0.06485273689031601, -0.06290869414806366, 0.1758628785610199, 0.06825430691242218, -0.10825739800930023, 0.10361076891422272, 0.026210695505142212, -0.12999694049358368, -0.04563083499670029, -0.0025923559442162514, -0.030326364561915398, -0.1387026309967041, 0.028956713154911995, -0.11930114030838013, -0.023117827251553535, 0.08265786617994308, 0.05179280787706375, -0.06358295679092407, 0.1902027577161789, 0.017842980101704597, 
-0.056918926537036896, 0.058165837079286575, 0.007768065202981234, 0.0354086235165596, 0.04458410665392876, 0.01469369512051344, 0.044113244861364365, -0.032954610884189606, 0.041294779628515244, 0.015097343362867832, -0.0229805801063776, 0.0057532284408807755, -0.010819355957210064, -0.0019222156843170524, -0.024033427238464355, 0.02259652130305767, 0.051771849393844604, 0.144887313246727, 0.027011947706341743, -0.07093818485736847, -0.039864860475063324, 0.16015255451202393, -0.04310060665011406, -0.08010663092136383, -0.1382703334093094, 0.16256287693977356, 0.0425904355943203, 0.022838186472654343, 0.023067429661750793, -0.09074468165636063, -0.05116523057222366, 0.19980822503566742, 0.08320406824350357, -0.02464725449681282, -0.02104247733950615, -0.0009502252214588225, -0.006087906192988157, -0.04538482800126076, 0.20218569040298462, 0.024485427886247635, 0.25316759943962097, 0.00706323329359293, -0.02178456261754036, -0.06098693609237671, -0.03013521432876587, -0.011823990382254124, 0.1455788016319275, -0.052934423089027405, -0.031381506472826004, -0.07420482486486435, 0.005891358945518732, 0.005989819299429655, -0.09251102060079575, 0.08602955937385559, -0.13371209800243378, -0.09333264082670212, -0.035435959696769714, 0.02429538406431675, -0.03902064263820648, 0.020479224622249603, -0.02654731273651123, 0.03696952387690544, 0.08395393192768097, -0.01607217639684677, -0.12219025194644928, -0.14259344339370728, 0.08447044342756271, -0.07218383252620697, 0.14654451608657837, -0.007283774204552174, 0.1296505331993103, 0.09447666257619858, 0.056316617876291275, -0.04891807585954666, 0.09969312697649002, 0.0445781908929348, 0.024825911968946457, 0.04638386890292168, 0.12114699184894562, -0.04423218220472336, 0.15914388000965118, -0.056746914982795715, -0.04166962578892708, 0.005300274584442377, -0.0821501836180687, -0.005318015348166227, -0.1555594801902771, -0.015040193684399128, -0.10412123799324036, 0.1066165640950203, 0.19389167428016663, -0.04233203083276749, -0.031152918934822083, -0.0850336030125618, 0.08792784810066223, -0.02125108428299427, 0.0517229400575161, -0.03242916613817215, -0.19571471214294434, 0.0020298634190112352, -0.006712211761623621, 0.020215407013893127, -0.24106499552726746, -0.018578609451651573, -0.04031556844711304, -0.023677553981542587, -0.06973836570978165, 0.15739576518535614, 0.07881081849336624, 0.04574787989258766, -0.030161820352077484, -0.126664936542511, -0.03265252709388733, 0.06128351390361786, -0.14039866626262665, -0.12583820521831512 ]
null
null
transformers
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the source code summarization task for the csharp code snippets.

## Intended uses & limitations

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/csharp/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
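As a quick sanity check on that fine-tuning budget, the stated numbers imply roughly 307K csharp sequences processed in total (a simple product of steps and batch size):

```python
steps = 1_200       # fine-tuning steps stated above
batch_size = 256    # sequences per step
print(steps * batch_size)  # 307200 csharp sequences of length <= 512
```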
## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model      | Python    | SQL       | C#        |
| --------------------- | :-------: | :-------: | :-------: |
| CodeTrans-ST-Small    | 8.45      | 17.55     | 19.74     |
| CodeTrans-ST-Base     | 9.12      | 15.00     | 18.65     |
| CodeTrans-TF-Small    | 10.06     | 17.71     | 20.40     |
| CodeTrans-TF-Base     | 10.94     | 17.66     | 21.12     |
| CodeTrans-TF-Large    | 12.41     | 18.40     | 21.43     |
| CodeTrans-MT-Small    | 13.11     | 19.15     | 22.39     |
| CodeTrans-MT-Base     | **13.37** | 19.24     | 23.20     |
| CodeTrans-MT-Large    | 13.24     | 19.40     | **23.57** |
| CodeTrans-MT-TF-Small | 12.10     | 18.25     | 22.03     |
| CodeTrans-MT-TF-Base  | 10.64     | 16.91     | 21.40     |
| CodeTrans-MT-TF-Large | 12.14     | **19.98** | 21.10     |
| CODE-NN               | --        | 18.40     | 20.50     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
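The table above reports BLEU. A minimal sketch of scoring generated documentation against references with the `sacrebleu` package is shown below; the exact BLEU configuration the CodeTrans authors used is not specified here, so treat the library defaults as an assumption:

```python
import sacrebleu  # pip install sacrebleu

# hypotheses: one generated summary per function.
# references: shape [num_reference_sets][num_hypotheses].
hypotheses = ["parse a unix timestamp into a local datetime"]
references = [["converts a unix time stamp to a local date time"]]

result = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {result.score:.2f}")
```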
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_csharp_multitask_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization csharp
====================================================

Pretrained model on programming language csharp using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the source code summarization task for the csharp code snippets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 62, 88, 112 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.10569002479314804, 0.0557570718228817, -0.0015333949122577906, 0.10467126965522766, 0.05633319169282913, 0.016534700989723206, 0.06759271025657654, 0.07972719520330429, -0.0690406784415245, 0.06566348671913147, 0.07529717683792114, -0.032039839774370193, 0.06346393376588821, 0.19018158316612244, 0.03630910813808441, -0.17245139181613922, -0.007620539050549269, 0.0348951555788517, -0.05238794535398483, 0.11123747378587723, 0.09339763224124908, -0.09309200197458267, 0.058864250779151917, -0.03671187534928322, -0.13533267378807068, 0.04966088756918907, -0.050600554794073105, -0.04666532576084137, 0.08947955816984177, 0.06460477411746979, 0.12367761880159378, -0.014132851734757423, 0.0817800983786583, -0.18583160638809204, 0.00311944168061018, 0.019396938383579254, 0.035386234521865845, 0.028062233701348305, 0.07506042718887329, 0.06287065893411636, 0.15321779251098633, -0.017916498705744743, 0.032014306634664536, 0.05031265318393707, -0.0645957812666893, -0.057275962084531784, -0.045486077666282654, 0.09237754344940186, 0.1367294043302536, 0.0836804062128067, -0.012619353830814362, 0.022723084315657616, -0.078513003885746, 0.0887836217880249, 0.11936908960342407, -0.21842288970947266, -0.030104897916316986, 0.073946513235569, 0.0826878473162651, 0.06684699654579163, -0.05697019025683403, -0.042911019176244736, 0.09880079329013824, 0.03745550289750099, 0.042493898421525955, -0.08178423345088959, -0.07004990428686142, -0.00766170397400856, -0.07708612084388733, -0.06530332565307617, 0.12738661468029022, 0.04277748987078667, -0.05473370850086212, -0.09023966640233994, -0.049571793526411057, -0.20139054954051971, 0.04538548365235329, 0.028104813769459724, 0.01277904398739338, -0.0209212526679039, 0.04331784322857857, -0.0006638948107138276, -0.1146542951464653, -0.11262621730566025, 0.016370583325624466, 0.061982493847608566, 0.0710289478302002, 0.026205219328403473, -0.009506016038358212, 0.09029672294855118, 0.01615632139146328, -0.037906426936388016, -0.01239009853452444, 0.007222939282655716, -0.1357903629541397, 0.025048334151506424, -0.03967280313372612, -0.07824229449033737, -0.02694072388112545, 0.07683916389942169, -0.04779965057969093, 0.06322421878576279, 0.14506201446056366, 0.010208256542682648, -0.005990875884890556, 0.20571018755435944, 0.01017648447304964, -0.09941809624433517, -0.003878186224028468, 0.03278721868991852, -0.018655454739928246, -0.011572424322366714, -0.07140126079320908, -0.045390013605356216, 0.00045985550968907773, 0.06839552521705627, -0.13191670179367065, 0.014037210494279861, -0.038534607738256454, -0.01841307431459427, 0.0802786573767662, -0.1339527815580368, 0.015970628708600998, 0.01622866280376911, -0.0428493469953537, -0.062442053109407425, 0.06820717453956604, -0.10947206616401672, -0.12354273349046707, 0.022533146664500237, -0.04874749481678009, -0.03093412145972252, -0.12346260249614716, -0.11180063337087631, -0.004781187046319246, -0.07645338028669357, 0.010386024601757526, -0.10134417563676834, -0.08133477717638016, -0.021061034873127937, 0.03367098048329353, 0.0016937603941187263, -0.014582461677491665, -0.04688766971230507, 0.013040028512477875, -0.01017722301185131, -0.021422449499368668, 0.02033182606101036, -0.031384848058223724, 0.09509208798408508, 0.07820697128772736, 0.050811320543289185, 0.004702756647020578, 0.030999576672911644, -0.056118760257959366, 0.06983525305986404, -0.0692969560623169, 0.05798834562301636, -0.02008179761469364, 0.05721770226955414, -0.08858641237020493, -0.08205786347389221, 0.05572635307908058, 
0.046464934945106506, 0.054163090884685516, 0.018019771203398705, -0.1114480048418045, 0.01885797083377838, 0.15256428718566895, -0.09483697265386581, -0.12703828513622284, 0.10118178278207779, -0.002382936654612422, -0.0019058962352573872, 0.05435272306203842, 0.11673223227262497, 0.14611497521400452, -0.08912775665521622, -0.03380350396037102, 0.08014590293169022, 0.06483884155750275, -0.0835021436214447, 0.10797715932130814, 0.0169904213398695, 0.04193277657032013, 0.016989318653941154, 0.02783283032476902, 0.07270749658346176, -0.012918784283101559, -0.0415322408080101, -0.01008241530507803, -0.09179714322090149, -0.042503613978624344, -0.01644570380449295, 0.03621075674891472, -0.04389820992946625, -0.06155557185411453, 0.013052241876721382, 0.16833306849002838, -0.10773120075464249, 0.0320860892534256, -0.08898360282182693, -0.028294378891587257, -0.11385449767112732, 0.008051762357354164, -0.1289728283882141, 0.020963620394468307, 0.047507643699645996, -0.059369880706071854, 0.06608324497938156, 0.078892283141613, -0.004276552703231573, 0.045460425317287445, -0.03558680787682533, -0.040178827941417694, -0.053349655121564865, -0.05450667813420296, -0.12650880217552185, -0.0207557063549757, -0.11110842227935791, -0.03261718526482582, -0.04710453376173973, -0.1603146642446518, 0.019468162208795547, -0.011078123934566975, 0.029041502624750137, -0.001273231115192175, -0.019842999055981636, 0.031116656959056854, 0.05970008671283722, -0.043828342109918594, -0.08016276359558105, 0.018470415845513344, 0.023357706144452095, -0.12290365248918533, -0.027583546936511993, -0.12653769552707672, -0.05590595304965973, 0.0664583072066307, 0.0749431923031807, -0.08947762101888657, -0.00028298344113864005, -0.03474834933876991, -0.0589115284383297, -0.016486506909132004, -0.06789939850568771, 0.1579716056585312, 0.005343139637261629, 0.1711275726556778, -0.14234963059425354, -0.04987909272313118, -0.0219559445977211, 0.005598396062850952, 0.012073639780282974, 0.15139392018318176, 0.030894866213202477, -0.07592335343360901, 0.03028966300189495, 0.013430853374302387, -0.053054165095090866, 0.15399256348609924, -0.013800887390971184, -0.11451052129268646, 0.015570716001093388, 0.09893649071455002, -0.011793753132224083, 0.1323625147342682, -0.06379281729459763, -0.014841591939330101, 0.0006939777522347867, 0.013655273243784904, 0.04580725356936455, -0.1367873102426529, 0.029761498793959618, 0.06241437420248985, -0.056173115968704224, -0.05829227343201637, -0.030916625633835793, -0.048487864434719086, 0.03736375644803047, -0.0018586654914543033, -0.009267712011933327, -0.003704790258780122, -0.027967089787125587, -0.09225605428218842, 0.20297135412693024, -0.0834764689207077, -0.17793044447898865, -0.1867542862892151, 0.05202697589993477, -0.049834977835416794, 0.013258530758321285, 0.045707110315561295, -0.09971613436937332, -0.051981810480356216, -0.0956389382481575, 0.12181230634450912, -0.10716132074594498, 0.012988794595003128, -0.00799758080393076, 0.032691504806280136, 0.03301737830042839, -0.1693451851606369, 0.042376499623060226, -0.004507908597588539, -0.00384967471472919, 0.0034192109014838934, -0.059128232300281525, 0.10164594650268555, 0.1419333666563034, -0.0881015807390213, 0.010755782946944237, -0.007166930008679628, 0.17874178290367126, -0.05342979356646538, 0.02419426292181015, 0.18422874808311462, -0.0008070718613453209, 0.038000643253326416, 0.04123128578066826, 0.013388808816671371, -0.09267381578683853, 0.05572144314646721, 0.03364207223057747, -0.03257550299167633, 
-0.25638318061828613, -0.0014587375335395336, -0.06211418658494949, 0.04252003878355026, 0.11129868030548096, 0.04910064861178398, -0.12665273249149323, 0.04033335670828819, -0.01320845726877451, 0.14971670508384705, -0.04715708643198013, 0.04939286410808563, -0.006797247566282749, 0.01353235263377428, -0.0017623647581785917, -0.09566191583871841, 0.0029907014686614275, 0.0709187239408493, 0.10407009720802307, 0.22667090594768524, -0.0604647733271122, 0.21926634013652802, 0.018678933382034302, 0.06658891588449478, 0.02599962055683136, 0.13372814655303955, -0.11012542247772217, 0.005979347042739391, 0.008336360566318035, -0.016461994498968124, -0.07584479451179504, 0.061435896903276443, -0.0043007852509617805, 0.04685703292489052, -0.07570560276508331, 0.0283596720546484, 0.006182577461004257, 0.18773344159126282, 0.044445380568504333, -0.18512049317359924, -0.10549552738666534, 0.007584619335830212, -0.09584187716245651, -0.11565133184194565, 0.06477317959070206, 0.20717772841453552, -0.03511917218565941, -0.010810903273522854, -0.010636535473167896, 0.13360239565372467, -0.06614629179239273, -0.030795786529779434, 0.022979767993092537, 0.0431697778403759, 0.01938447915017605, 0.1292627453804016, -0.21754534542560577, 0.10563749819993973, 0.01717825047671795, 0.08524490147829056, -0.030702145770192146, 0.07243658602237701, -0.05086209625005722, 0.01111151184886694, 0.08736467361450195, 0.002229205099865794, -0.0764603391289711, -0.20679427683353424, -0.05904465913772583, 0.01901896297931671, 0.06781032681465149, -0.019468003883957863, 0.0856461375951767, 0.006521779578179121, 0.05357953906059265, -0.028304727748036385, -0.12435522675514221, -0.06151721253991127, -0.14240805804729462, -0.0010688947513699532, 0.001518854172900319, -0.015910793095827103, -0.036935608834028244, 0.012885598465800285, -0.01390676386654377, 0.1912098526954651, -0.15203338861465454, -0.10038428753614426, -0.08975154906511307, 0.04909365996718407, 0.1404155194759369, -0.09626125544309616, 0.02980107255280018, 0.018178174272179604, 0.051041893661022186, -0.03951689600944519, -0.05421234294772148, 0.03507372364401817, -0.05257801711559296, -0.09140080958604813, -0.027223704382777214, 0.08873064070940018, -0.006938625127077103, 0.04308127611875534, -0.010390102863311768, -0.08428489416837692, -0.04556576535105705, -0.12915508449077606, -0.08170025050640106, -0.020824812352657318, 0.0431375578045845, -0.003951662685722113, -0.09352026879787445, 0.08574575930833817, -0.007326233666390181, -0.08903847634792328, 0.07576945424079895, 0.2158280462026596, -0.06729890406131744, 0.01487661525607109, 0.1240699514746666, -0.05443248152732849, -0.1680869311094284, -0.06986595690250397, 0.058202262967824936, 0.08714611083269119, -0.023134887218475342, -0.14687760174274445, 0.06419889628887177, 0.024128280580043793, 0.02991459146142006, 0.01476029772311449, -0.2806425988674164, -0.13284961879253387, 0.05394325405359268, 0.08394531160593033, 0.009420441463589668, -0.12230249494314194, -0.042437437921762466, -0.05817437916994095, -0.08975548297166824, 0.03313364088535309, 0.06250854581594467, 0.13127104938030243, -0.039588283747434616, 0.009358297102153301, 0.022586317732930183, -0.03458569943904877, 0.10978473722934723, 0.01750936731696129, 0.09313440322875977, -0.019374415278434753, 0.01804296113550663, 0.08855323493480682, -0.059283193200826645, 0.14588922262191772, -0.14666081964969635, 0.08324325829744339, -0.23203174769878387, -0.05748952925205231, -0.020472075790166855, -0.016961168497800827, -0.045426879078149796, 
-0.05490613728761673, -0.09113316982984543, 0.001695521641522646, 0.05400821194052696, -0.02389976568520069, 0.05622311308979988, -0.03829165920615196, -0.06055760011076927, 0.07495685666799545, 0.08541901409626007, -0.02326996996998787, -0.10918371379375458, 0.009559674188494682, 0.027417736127972603, 0.08839459717273712, -0.19386348128318787, 0.016382532194256783, 0.13103966414928436, 0.004724806174635887, 0.09574786573648453, 0.019666938111186028, -0.07181762903928757, 0.04374092072248459, 0.07587994635105133, -0.04507025331258774, -0.08297494053840637, -0.016606764867901802, -0.024846594780683517, -0.08961599320173264, 0.04379003494977951, 0.07851945608854294, -0.054126255214214325, -0.0243271142244339, -0.020161708816885948, -0.012854170054197311, -0.07409578561782837, 0.18851688504219055, 0.035236526280641556, 0.08712177723646164, -0.055635206401348114, 0.07870349287986755, 0.10826888680458069, -0.14288349449634552, 0.0019056977471336722, 0.14720533788204193, -0.08277136087417603, -0.037274155765771866, 0.06193127855658531, 0.07387499511241913, -0.05979689955711365, -0.0673968493938446, -0.09507875144481659, -0.06284750252962112, 0.03058469481766224, 0.053322236984968185, 0.0664038136601448, 0.08175276219844818, -0.022803684696555138, 0.013999403454363346, -0.09757291525602341, 0.08954381942749023, 0.06203140690922737, 0.050493136048316956, -0.1375422477722168, 0.16726012527942657, 0.022500302642583847, 0.10660660266876221, -0.0008113415096886456, 0.04043209180235863, -0.07313419133424759, 0.04389893636107445, -0.046884872019290924, 0.028042461723089218, -0.01795792393386364, 0.0413343720138073, -0.019354524090886116, 0.04052139073610306, -0.0192659143358469, 0.05624782294034958, -0.05666851997375488, -0.020800277590751648, -0.03314223140478134, 0.03828107938170433, -0.050660472363233566, -0.017700282856822014, 0.0027361339889466763, -0.07785632461309433, 0.10797804594039917, -0.05917022377252579, -0.011565075255930424, -0.002461919793859124, -0.0024995733983814716, 0.07900957018136978, 0.0055568949319422245, 0.052544593811035156, -0.016632230952382088, 0.016280096024274826, 0.039976075291633606, 0.019316352903842926, -0.01640247367322445, -0.015450524166226387, 0.0345412977039814, -0.14236536622047424, -0.07968761771917343, -0.08325105160474777, -0.06067020818591118, -0.0756472647190094, 0.07611022144556046, 0.07805201411247253, 0.07005342096090317, 0.07494588941335678, -0.01923608034849167, 0.004281087778508663, -0.12372151017189026, -0.03361373022198677, 0.057099226862192154, -0.0001515009644208476, -0.09113636612892151, -0.06608643382787704, 0.05305737629532814, -0.0400175116956234, 0.11806635558605194, -0.041727520525455475, 0.046669092029333115, -0.015412885695695877, -0.0604880228638649, 0.0016953629674389958, 0.019863126799464226, 0.21160945296287537, -0.09845259040594101, 0.01661093719303608, 0.004044800531119108, -0.005288413260132074, 0.030673198401927948, 0.16986960172653198, 0.08868908882141113, 0.11664406210184097, 0.061819955706596375, 0.11115166544914246, -0.045777350664138794, -0.031180255115032196, -0.1525319665670395, 0.028524227440357208, 0.013790338300168514, 0.050331614911556244, -0.039380986243486404, 0.09355677664279938, 0.12882578372955322, -0.11062786728143692, 0.07389642298221588, 0.015869617462158203, -0.10186964273452759, -0.03447393327951431, -0.04392283782362938, -0.0403556190431118, -0.07821206003427505, 0.01871093176305294, -0.09819809347391129, 0.013667014427483082, 0.07821521162986755, 0.044015150517225266, -0.0190524123609066, 0.1805608868598938, 
-0.025202222168445587, -0.043358784168958664, 0.0201879795640707, 0.020432105287909508, 0.053011395037174225, 0.09917473047971725, 0.005378710106015205, 0.08033004403114319, -0.07065533846616745, 0.06471546739339828, 0.01619747281074524, 0.0017967100720852613, 0.018146121874451637, 0.006033561658114195, -0.0029295897111296654, -0.0385444276034832, 0.0014447064604610205, 0.0821799635887146, 0.17302674055099487, 0.026305556297302246, -0.04515991359949112, -0.050286803394556046, 0.14793868362903595, -0.05789501965045929, -0.06845051795244217, -0.11448167264461517, 0.1264253407716751, 0.06677360087633133, 0.026989437639713287, 0.012714732438325882, -0.0870981439948082, -0.04457441717386246, 0.2175401747226715, 0.022888347506523132, -0.022007998079061508, -0.03181589022278786, 0.008299916982650757, -0.0029742864426225424, -0.045833196491003036, 0.1438988298177719, 0.012187073938548565, 0.20101897418498993, -0.009784701280295849, -0.016550932079553604, -0.04273271933197975, -0.02046782523393631, -0.0368027426302433, 0.18815749883651733, -0.038237135857343674, 0.025923732668161392, -0.07574581354856491, -0.010877118445932865, 0.04245654493570328, -0.12844406068325043, 0.11588200181722641, -0.07927172631025314, -0.06759384274482727, 0.03306535258889198, 0.07481091469526291, -0.0242801234126091, 0.03144006431102753, -0.02821154147386551, 0.046656157821416855, 0.06820737570524216, -0.02262008748948574, -0.09839329868555069, -0.10803808271884918, 0.05201394483447075, -0.04904768615961075, 0.15522843599319458, 0.024217434227466583, 0.09775746613740921, 0.08421525359153748, 0.017615899443626404, -0.08334463834762573, 0.1124056875705719, 0.042633164674043655, 0.007632571738213301, 0.07283363491296768, 0.13596199452877045, -0.03713404759764671, 0.13295407593250275, -0.026582835242152214, -0.034132715314626694, -0.013899141922593117, -0.01884864829480648, 0.0036882886197417974, -0.1363873928785324, 0.00007941915100673214, -0.06518424302339554, 0.13760744035243988, 0.17332878708839417, -0.04635898023843765, -0.029521280899643898, -0.03081389330327511, 0.06642696261405945, -0.015201623551547527, 0.09107818454504013, -0.00970395840704441, -0.164105623960495, 0.025369463488459587, -0.026375817134976387, 0.03617575764656067, -0.1668652594089508, -0.037451617419719696, -0.039118945598602295, -0.046448685228824615, -0.08288635313510895, 0.13958129286766052, 0.07143554836511612, 0.02218463271856308, -0.038678642362356186, -0.1728237271308899, -0.03263590857386589, 0.04410237446427345, -0.1557368040084839, -0.1271871030330658 ]
null
null
transformers
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.


## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.

## Intended uses & limitations

The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_csharp_transfer_learning_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/csharp/small_model.ipynb).
## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)


## Training procedure

### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
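The pre-training and fine-tuning recipe above can be approximated with the stock AdaFactor implementation that ships with `transformers`. The following is a minimal sketch under that assumption; the exact hyperparameter values for this checkpoint are not published in this card, so the flags below are illustrative rather than the authors' configuration.

```python
# A minimal sketch of an AdaFactor setup with the inverse square root schedule
# described above. Hyperparameter values are illustrative assumptions, not the
# exact configuration used to train this checkpoint.
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_small_source_code_summarization_csharp_transfer_learning_finetune"
)

# With lr=None and relative_step=True, Adafactor applies its built-in
# inverse square root learning rate schedule.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the internal schedule to Trainer-style loops
```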
## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
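For readers who want to score their own fine-tuned checkpoints against numbers like those above, here is a minimal BLEU sketch assuming the `sacrebleu` package. The prediction and reference strings are invented placeholders, and the original evaluation's exact BLEU settings are not restated in this card, so scores are only roughly comparable.

```python
# A minimal sketch of corpus-level BLEU scoring, assuming the sacrebleu package.
# The prediction and reference strings below are made-up placeholders.
import sacrebleu

predictions = ["parses a unix timestamp into a local datetime ."]
references = [["converts a unix time stamp to a local datetime object ."]]  # one reference stream

bleu = sacrebleu.corpus_bleu(predictions, references)
print(f"BLEU: {bleu.score:.2f}")
```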
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_csharp_transfer_learning_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization csharp
====================================================


Pretrained model on programming language csharp using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.


Model description
-----------------


This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.


Intended uses & limitations
---------------------------


The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks.
It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.


### How to use


Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:


Run this example in colab notebook.


Training data
-------------


The supervised training tasks datasets can be downloaded on Link


Training procedure
------------------


### Transfer-learning Pretraining


The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.


### Fine-tuning


This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.


Evaluation results
------------------


For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):


Test results :


> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 62, 87, 112 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.09317005425691605, 0.06870707869529724, -0.0009642562363296747, 0.09938507527112961, 0.045993622392416, 0.018630441278219223, 0.06088770553469658, 0.08224665373563766, -0.07148037105798721, 0.05865119770169258, 0.08654073625802994, -0.04939483478665352, 0.06439561396837234, 0.1839064359664917, 0.023444799706339836, -0.17792588472366333, -0.0066194087266922, 0.03251013904809952, -0.046593278646469116, 0.11376940459012985, 0.10185159742832184, -0.09387258440256119, 0.06207505613565445, -0.030618654564023018, -0.133965864777565, 0.055300530046224594, -0.05142961069941521, -0.05068915709853172, 0.09368675947189331, 0.06673495471477509, 0.1362481415271759, -0.01421897392719984, 0.08408446609973907, -0.19838182628154755, 0.0017920759273692966, 0.015395860187709332, 0.04194055497646332, 0.03223991021513939, 0.06046796962618828, 0.061662983149290085, 0.1501304656267166, -0.021773697808384895, 0.04038925841450691, 0.03962621092796326, -0.062088001519441605, -0.05018571391701698, -0.05393068492412567, 0.0871090367436409, 0.121234230697155, 0.09067129343748093, -0.00655090156942606, 0.03170458599925041, -0.07770119607448578, 0.09186316281557083, 0.12787185609340668, -0.22494739294052124, -0.02890942431986332, 0.09389881044626236, 0.09157778322696686, 0.06064991652965546, -0.0739908516407013, -0.03739389404654503, 0.10219154506921768, 0.03357347846031189, 0.05354700982570648, -0.07951021939516068, -0.07204070687294006, 0.00565725564956665, -0.06461681425571442, -0.059527553617954254, 0.16050773859024048, 0.03985538333654404, -0.050406526774168015, -0.09500780701637268, -0.048102766275405884, -0.20329631865024567, 0.04410766810178757, 0.014406204223632812, 0.019559647887945175, -0.014673559926450253, 0.019153300672769547, -0.002424301113933325, -0.11505549401044846, -0.10710012167692184, -0.013320962898433208, 0.07368914783000946, 0.062747523188591, 0.026255307719111443, -0.00818893313407898, 0.08364792913198471, 0.011919212527573109, -0.042065538465976715, -0.02667909488081932, 0.008979083970189095, -0.12719838321208954, 0.018601806834340096, -0.042497191578149796, -0.08299259841442108, -0.02803976647555828, 0.08505044132471085, -0.05780661851167679, 0.06546664237976074, 0.13136914372444153, 0.010322201065719128, 0.003184707136824727, 0.2345481514930725, 0.022860394790768623, -0.12743248045444489, 0.0010439739562571049, 0.024235136806964874, -0.013167567551136017, -0.008774934336543083, -0.06946337223052979, -0.03531313315033913, 0.012309742160141468, 0.06086874008178711, -0.1361413449048996, 0.025244349613785744, -0.03361036628484726, -0.013503706082701683, 0.05823834240436554, -0.13828419148921967, 0.022617163136601448, 0.008067677728831768, -0.051354553550481796, -0.050704874098300934, 0.07412131130695343, -0.10818959027528763, -0.10696689039468765, 0.02289978228509426, -0.05330466106534004, -0.0359916016459465, -0.12734074890613556, -0.11924450099468231, -0.013028561137616634, -0.035649966448545456, 0.008320744149386883, -0.1016714945435524, -0.09696900844573975, -0.008972001262009144, 0.033595502376556396, 0.007412545382976532, -0.0014138647820800543, -0.052039049565792084, 0.006757454015314579, -0.002383855637162924, -0.02099970541894436, 0.01864020712673664, -0.03358827531337738, 0.08926660567522049, 0.0772148072719574, 0.040642961859703064, -0.0022854956332594156, 0.027489300817251205, -0.06297702342271805, 0.06502040475606918, -0.05903048813343048, 0.05408663675189018, -0.02408565767109394, 0.04875018820166588, -0.08218146860599518, -0.08533160388469696, 0.04000154510140419, 
0.05050107464194298, 0.05157534405589104, 0.014478483237326145, -0.13272292912006378, 0.019872039556503296, 0.13388720154762268, -0.10139540582895279, -0.12876427173614502, 0.09181495755910873, -0.014231152832508087, 0.001651777420192957, 0.05556397885084152, 0.11159726232290268, 0.12479985505342484, -0.09083453565835953, -0.038424037396907806, 0.08235063403844833, 0.06011452525854111, -0.05946856364607811, 0.1032537892460823, 0.02362016588449478, 0.04451737552881241, 0.02736581675708294, 0.017071297392249107, 0.06368347257375717, -0.01070308592170477, -0.03616537153720856, -0.01683005876839161, -0.08560669422149658, -0.04806560277938843, -0.016343271359801292, 0.03228629380464554, -0.05582815781235695, -0.06287229806184769, 0.024860238656401634, 0.1684940606355667, -0.11092212796211243, 0.030959665775299072, -0.08779887109994888, -0.046364426612854004, -0.09112140536308289, 0.0011369206476956606, -0.11730971932411194, 0.012475721538066864, 0.049118999391794205, -0.04112720116972923, 0.06498952209949493, 0.0811866745352745, -0.003979085944592953, 0.03294006735086441, -0.04231437295675278, -0.037455830723047256, -0.053629644215106964, -0.047750815749168396, -0.11740054190158844, -0.019319161772727966, -0.09692110866308212, -0.030157839879393578, -0.0389225110411644, -0.16863039135932922, 0.009747575968503952, -0.012087970040738583, 0.03354275971651077, -0.0011948277242481709, -0.018445665016770363, 0.016764190047979355, 0.05155043676495552, -0.05831396207213402, -0.08556481450796127, 0.0020974527578800917, 0.03158651292324066, -0.12880012392997742, -0.03145497664809227, -0.12706731259822845, -0.05427645146846771, 0.07640282809734344, 0.08224822580814362, -0.09540268033742905, 0.002012021141126752, -0.037016384303569794, -0.05310186371207237, -0.019606513902544975, -0.07050392031669617, 0.17704176902770996, 0.0031278745736926794, 0.1793850213289261, -0.14463794231414795, -0.0579705610871315, -0.03685537353157997, -0.0027311842422932386, 0.0024334429763257504, 0.13774603605270386, 0.0075597562827169895, -0.1043199673295021, 0.041266314685344696, -0.0011915898649021983, -0.0602075569331646, 0.15870343148708344, -0.006570524536073208, -0.10091336071491241, 0.006860706023871899, 0.08904800564050674, -0.015389818698167801, 0.15293031930923462, -0.0511871874332428, -0.01514055673032999, 0.0013615584466606379, 0.014371495693922043, 0.05203799530863762, -0.13357393443584442, 0.03248045966029167, 0.06694668531417847, -0.05514630675315857, -0.050703052431344986, -0.0404493622481823, -0.04657580330967903, 0.03879217058420181, 0.0009484888869337738, -0.005178360268473625, -0.013618109747767448, -0.030019115656614304, -0.09352583438158035, 0.20950329303741455, -0.10224442183971405, -0.2161414623260498, -0.1775767058134079, 0.048812903463840485, -0.048244625329971313, 0.010544250719249249, 0.03593595325946808, -0.09010244905948639, -0.06863919645547867, -0.10754004120826721, 0.11224771291017532, -0.12861661612987518, 0.010137194767594337, 0.0007631859625689685, 0.03507942333817482, 0.023921599611639977, -0.17813098430633545, 0.03802879527211189, -0.01239051390439272, 0.0027938629500567913, -0.0025229647289961576, -0.06912805140018463, 0.09669651091098785, 0.13496261835098267, -0.089571513235569, 0.015724923461675644, -0.009777229279279709, 0.16122563183307648, -0.055730171501636505, 0.025296533480286598, 0.16802066564559937, -0.0010291205253452063, 0.031854577362537384, 0.04050736874341965, 0.01068959105759859, -0.08648320287466049, 0.0599573515355587, 0.027221256867051125, -0.03281426057219505, 
-0.24472366273403168, -0.010150030255317688, -0.06800646334886551, 0.033517107367515564, 0.1092880368232727, 0.05035707354545593, -0.11633763462305069, 0.02895412966609001, -0.010351782664656639, 0.14358043670654297, -0.032832201570272446, 0.060996171087026596, -0.0017297073500230908, 0.005984077230095863, 0.012133367359638214, -0.09345345199108124, 0.005253119859844446, 0.07391161471605301, 0.08878585696220398, 0.21609485149383545, -0.054374683648347855, 0.20045731961727142, 0.018204009160399437, 0.06761948764324188, 0.038073115050792694, 0.12171190232038498, -0.11213193088769913, 0.0014944049762561917, 0.003699464024975896, -0.012471429072320461, -0.07784110307693481, 0.06261707842350006, -0.019209997728466988, 0.06656905263662338, -0.06651025265455246, 0.023424161598086357, 0.011259444989264011, 0.17558352649211884, 0.03967949375510216, -0.19584986567497253, -0.11626746505498886, 0.009365427307784557, -0.10016386955976486, -0.11142146587371826, 0.05708971619606018, 0.20253123342990875, -0.038885828107595444, -0.0037882328033447266, -0.003193332813680172, 0.1279415786266327, -0.0612092986702919, -0.030943775549530983, 0.02254306711256504, 0.054227784276008606, 0.012217137962579727, 0.12643642723560333, -0.22974808514118195, 0.11115432530641556, 0.016223343089222908, 0.08360415697097778, -0.028664641082286835, 0.0714481770992279, -0.044982943683862686, 0.006195170804858208, 0.07062840461730957, 0.0016189871821552515, -0.09562584012746811, -0.19540239870548248, -0.04144088178873062, 0.018002087250351906, 0.07806732505559921, -0.00918557494878769, 0.0927068293094635, -0.009823044762015343, 0.06044992059469223, -0.012534367851912975, -0.11125335097312927, -0.057458117604255676, -0.1472383290529251, -0.00140483514405787, 0.005134948994964361, -0.0203317292034626, -0.04321229085326195, 0.01628866232931614, -0.0216420516371727, 0.22848254442214966, -0.1731415092945099, -0.08890042454004288, -0.09708236902952194, 0.07536090910434723, 0.14565744996070862, -0.09296565502882004, 0.03470205143094063, 0.015921158716082573, 0.07267701625823975, -0.038317784667015076, -0.062064964324235916, 0.01752479560673237, -0.057142361998558044, -0.08317041397094727, -0.026760270819067955, 0.09630105644464493, -0.013504660688340664, 0.039962220937013626, 0.004773869179189205, -0.07703372836112976, -0.05531705915927887, -0.13146919012069702, -0.09667500108480453, -0.005240192636847496, 0.03300532326102257, 0.003047282574698329, -0.10186415910720825, 0.08159638941287994, -0.004416321869939566, -0.08797245472669601, 0.08124983310699463, 0.19699394702911377, -0.07083617150783539, 0.01850513368844986, 0.10057084262371063, -0.061135005205869675, -0.1618940681219101, -0.0643901377916336, 0.05665840953588486, 0.09164077788591385, -0.021771758794784546, -0.13311518728733063, 0.07058247178792953, 0.03242845833301544, 0.03172098100185394, -0.01614312268793583, -0.26842591166496277, -0.1287180781364441, 0.05660014972090721, 0.07413770258426666, 0.022119900211691856, -0.11128546297550201, -0.041110821068286896, -0.06482263654470444, -0.0781601220369339, 0.03867723420262337, 0.06992039084434509, 0.13007335364818573, -0.0328146331012249, 0.026031194254755974, 0.024693800136446953, -0.03150321543216705, 0.10956460237503052, 0.0011458370136097074, 0.10053461045026779, -0.01888044737279415, 0.005219773855060339, 0.09628085047006607, -0.057528018951416016, 0.1484682708978653, -0.14822518825531006, 0.09739392250776291, -0.2062494158744812, -0.05416674539446831, -0.009607084095478058, -0.0044032009318470955, -0.042975254356861115, 
-0.05453396588563919, -0.1093108281493187, 0.009946041740477085, 0.05659322068095207, -0.026685062795877457, 0.061670199036598206, -0.03320319578051567, -0.06246829777956009, 0.0616840198636055, 0.08533383160829544, -0.027943264693021774, -0.08996519446372986, 0.016539093106985092, 0.027478251606225967, 0.0767056792974472, -0.18384680151939392, 0.021069485694169998, 0.1280662566423416, 0.01400038693100214, 0.10777591168880463, 0.02458643540740013, -0.07040545344352722, 0.033247221261262894, 0.07407350838184357, -0.04788276553153992, -0.06881812959909439, -0.012175005860626698, -0.005662634037435055, -0.08886153250932693, 0.032048147171735764, 0.08885003626346588, -0.05298798531293869, -0.016744013875722885, -0.014363565482199192, -0.014499706216156483, -0.064023457467556, 0.17896082997322083, 0.021425532177090645, 0.08013805747032166, -0.055229730904102325, 0.08229044079780579, 0.09999639540910721, -0.10673509538173676, 0.005477216560393572, 0.13540416955947876, -0.08091907948255539, -0.022566458210349083, 0.06151723861694336, 0.08873214572668076, -0.06525412946939468, -0.06394372135400772, -0.09671120345592499, -0.07196331024169922, 0.017131540924310684, 0.0574462004005909, 0.07013344019651413, 0.08882281929254532, -0.03141488507390022, 0.021108092740178108, -0.09646008908748627, 0.09449737519025803, 0.0638904944062233, 0.05131661146879196, -0.1318194568157196, 0.17125414311885834, 0.025967413559556007, 0.0793766900897026, -0.003800314152613282, 0.03637070208787918, -0.08055181801319122, 0.03977184370160103, -0.016478795558214188, 0.0360943078994751, -0.014215460047125816, 0.03982829675078392, -0.013335693627595901, 0.026341836899518967, -0.03485915809869766, 0.05429143086075783, -0.04752369225025177, -0.02313937619328499, -0.029761629179120064, 0.03398154675960541, -0.051167428493499756, -0.021323008462786674, 0.011852419935166836, -0.07799014449119568, 0.096906878054142, -0.05366472154855728, -0.003197131911292672, 0.008547089993953705, -0.028263011947274208, 0.07306419312953949, 0.012041513808071613, 0.06177417188882828, -0.016397086903452873, 0.0075005837716162205, 0.03825122117996216, 0.0336480513215065, -0.02196662873029709, -0.012689894065260887, 0.03760755434632301, -0.13523313403129578, -0.10638634115457535, -0.09468314796686172, -0.05553732067346573, -0.07363444566726685, 0.07870247960090637, 0.0943446084856987, 0.07538148760795593, 0.078126460313797, -0.02412075735628605, -0.0013381281169131398, -0.13192060589790344, -0.032292913645505905, 0.05181698873639107, -0.010694283992052078, -0.09548172354698181, -0.06199692189693451, 0.05412031337618828, -0.045993342995643616, 0.1476907730102539, -0.03170720115303993, 0.044550005346536636, -0.01347112376242876, -0.05562677979469299, 0.015383414924144745, 0.012836461886763573, 0.24158576130867004, -0.08895833045244217, 0.01527217123657465, 0.005544096697121859, -0.0010372813558205962, 0.03679829090833664, 0.13950544595718384, 0.09062645584344864, 0.13135559856891632, 0.051786553114652634, 0.10889677703380585, -0.050172798335552216, -0.039302367717027664, -0.12387488782405853, 0.034920625388622284, 0.00374586577527225, 0.03393223509192467, -0.027613554149866104, 0.10636871308088303, 0.1313600391149521, -0.13161209225654602, 0.08475668728351593, 0.021042749285697937, -0.09763921052217484, -0.03336513787508011, -0.04262685775756836, -0.05094005912542343, -0.0804537683725357, 0.018476970493793488, -0.11965771019458771, 0.012151562608778477, 0.06374043971300125, 0.03960515558719635, -0.01716015301644802, 0.15234319865703583, 
-0.04303263500332832, -0.06045522168278694, 0.01608843170106411, 0.027264641597867012, 0.04873059689998627, 0.07907818257808685, -0.0013324475148692727, 0.07755745947360992, -0.06568751484155655, 0.07190605252981186, 0.009512045420706272, 0.02733573503792286, 0.008729525841772556, 0.00842768233269453, -0.00007945473043946549, -0.040515776723623276, -0.009009961038827896, 0.07839285582304001, 0.14971522986888885, 0.028645027428865433, -0.0454227440059185, -0.051526475697755814, 0.16187600791454315, -0.05538605898618698, -0.07366453856229782, -0.12207626551389694, 0.13624493777751923, 0.06434249877929688, 0.026745053008198738, 0.012030347250401974, -0.08429349213838577, -0.03947235643863678, 0.23232491314411163, 0.02186943218111992, -0.03181290999054909, -0.031081870198249817, 0.00017832803132478148, -0.006155967712402344, -0.043304361402988434, 0.12742190062999725, 0.016283294185996056, 0.20372027158737183, 0.005029832478612661, -0.025122253224253654, -0.050468266010284424, -0.02159152366220951, -0.024635275825858116, 0.2003142088651657, -0.03431385010480881, 0.015740197151899338, -0.08707323670387268, -0.021722989156842232, 0.04964422062039375, -0.14059039950370789, 0.11198357492685318, -0.10465552657842636, -0.06593084335327148, 0.03697015717625618, 0.08032451570034027, -0.034342341125011444, 0.028776224702596664, -0.022691087797284126, 0.04940479248762131, 0.054169151932001114, -0.01797756552696228, -0.09766487777233124, -0.10504220426082611, 0.051488347351551056, -0.053477488458156586, 0.1595330536365509, 0.018198896199464798, 0.09504922479391098, 0.08962228894233704, 0.026889730244874954, -0.07111867517232895, 0.11474911868572235, 0.040430013090372086, 0.000054737894970458, 0.07876831293106079, 0.1337062418460846, -0.03802182152867317, 0.14825130999088287, -0.007500613108277321, -0.0376674085855484, -0.009337279014289379, -0.020835278555750847, -0.01536603458225727, -0.1336103230714798, -0.004251484293490648, -0.06266530603170395, 0.14075696468353271, 0.17536009848117828, -0.04169221967458725, -0.024525923654437065, -0.03137477859854698, 0.0755435973405838, -0.017569363117218018, 0.10253958404064178, 0.0033507177140563726, -0.16713307797908783, 0.020426196977496147, -0.03376457840204239, 0.015327820554375648, -0.165922150015831, -0.049206312745809555, -0.03964957594871521, -0.05110805109143257, -0.06623581051826477, 0.13860531151294708, 0.07999250292778015, 0.024952217936515808, -0.03980834782123566, -0.15104635059833527, -0.021333925426006317, 0.044003769755363464, -0.14416392147541046, -0.11877363920211792 ]
null
null
transformers
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.


## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization python dataset.

## Intended uses & limitations

The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python", skip_special_tokens=True),
    device=0
)

tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/python/small_model.ipynb).
## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)


## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
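Because the model works best on tokenized input, raw Python source can be pre-tokenized before it is passed to the pipeline. The sketch below uses the standard-library `tokenize` module as an approximation; the original project shipped its own tokenization scripts, so this `tokenize_python` helper is a hypothetical stand-in rather than the exact preprocessing used.

```python
# A minimal sketch of pre-tokenizing raw Python source with the standard
# library, as an approximation of the project's own tokenizer scripts.
import tokenize
from io import BytesIO

# Layout-only tokens that carry no lexical content for the summarizer.
SKIP = {tokenize.ENCODING, tokenize.ENDMARKER, tokenize.NEWLINE,
        tokenize.NL, tokenize.INDENT, tokenize.DEDENT}

def tokenize_python(code: str) -> str:
    tokens = [tok.string
              for tok in tokenize.tokenize(BytesIO(code.encode("utf-8")).readline)
              if tok.type not in SKIP]
    return " ".join(tokens)

print(tokenize_python("def add(a, b):\n    return a + b\n"))
# -> def add ( a , b ) : return a + b
```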
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_python
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization python
====================================================


Pretrained model on programming language python using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.


Model description
-----------------


This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization python dataset.


Intended uses & limitations
---------------------------


The model could be used to generate the description for the python function or be fine-tuned on other python code tasks.
It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.


### How to use


Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:


Run this example in colab notebook.


Training data
-------------


The supervised training tasks datasets can be downloaded on Link


Evaluation results
------------------


For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):


Test results :


> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 115 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.08860733360052109, 0.017160693183541298, -0.0006001046858727932, 0.04349924623966217, 0.16086836159229279, 0.020471887663006783, 0.08898057788610458, 0.05220251902937889, -0.020472129806876183, -0.04874681308865547, 0.0795515775680542, 0.15029504895210266, 0.028697984293103218, 0.15368743240833282, -0.018860433250665665, -0.22326116263866425, -0.0019136077025905252, 0.058567918837070465, -0.17546124756336212, 0.1351144164800644, 0.10726488381624222, -0.03355778381228447, 0.0886555165052414, 0.005819724407047033, -0.21633127331733704, 0.06482136249542236, 0.008466817438602448, -0.08171592652797699, 0.12604518234729767, 0.07849413901567459, 0.13425880670547485, 0.006683136336505413, 0.01804765872657299, -0.216693714261055, 0.03209909796714783, -0.03307691588997841, -0.007675210479646921, 0.03920748457312584, 0.0371880866587162, -0.08023171871900558, 0.21515198051929474, -0.016038790345191956, 0.05990903452038765, 0.05447465181350708, -0.11270024627447128, -0.11817799508571625, -0.02072261832654476, 0.0073150829412043095, 0.07860671728849411, 0.08722208440303802, 0.013742159120738506, 0.1410273313522339, -0.12554284930229187, 0.12932732701301575, 0.083123117685318, -0.1895856410264969, -0.017546553164720535, 0.11008241772651672, 0.06202330067753792, -0.07383744418621063, -0.034736569970846176, 0.0017029652372002602, 0.05661345273256302, 0.012250999920070171, 0.017544329166412354, -0.1223830133676529, -0.13123267889022827, 0.030336715281009674, -0.0941266268491745, -0.07532951980829239, 0.2824760377407074, -0.018633387982845306, -0.059198979288339615, -0.03255108743906021, -0.03956032544374466, 0.03380630537867546, -0.023355377838015556, 0.03897177800536156, -0.010274946689605713, -0.011012970469892025, -0.050432488322257996, 0.0015826828312128782, -0.08938825875520706, -0.09812258183956146, -0.008633090183138847, 0.14012575149536133, 0.0107431560754776, 0.03126245737075806, -0.16082540154457092, 0.10232760012149811, 0.07362513244152069, -0.05751664191484451, 0.030702704563736916, -0.05743110552430153, -0.05579058825969696, -0.022270534187555313, -0.05157099664211273, -0.1584625393152237, 0.06943129003047943, 0.13007105886936188, -0.025406375527381897, 0.05964367091655731, 0.025422757491469383, 0.05015762150287628, 0.059790078550577164, 0.18777750432491302, -0.00693871034309268, -0.04548337310552597, 0.05316467955708504, -0.04250086471438408, -0.04662277549505234, 0.006334397941827774, -0.06746023893356323, -0.0289921872317791, 0.01243736781179905, 0.1249786987900734, -0.07227800041437149, 0.09438866376876831, -0.0653599351644516, -0.04718368500471115, -0.05966254696249962, -0.13073180615901947, -0.02098878100514412, 0.013165566138923168, -0.06238408014178276, 0.022755172103643417, 0.12839080393314362, -0.07794041186571121, -0.10376103967428207, 0.024116959422826767, -0.07959560304880142, -0.004284413997083902, -0.08679584413766861, -0.10571932047605515, 0.007392845582216978, 0.07449834048748016, 0.0757002905011177, -0.1310863345861435, -0.15134213864803314, 0.015720106661319733, 0.0924772173166275, 0.023449905216693878, 0.025925206020474434, -0.07852624356746674, -0.00535770645365119, -0.02133731171488762, -0.008854633197188377, 0.0019050247501581907, -0.07941143959760666, 0.09706663340330124, 0.07188963145017624, 0.03705019876360893, -0.09157878905534744, 0.04741793870925903, -0.12035613507032394, 0.06442774832248688, -0.1458013504743576, 0.07575094699859619, -0.06460627168416977, 0.14339877665042877, -0.1130077913403511, -0.07184705883264542, 0.0561845488846302, 0.049657706171274185, 
0.05502849817276001, 0.12903976440429688, -0.09477569907903671, -0.06839010119438171, 0.1583215594291687, -0.11593794077634811, -0.19265897572040558, 0.0784149318933487, -0.0709158256649971, 0.19416972994804382, 0.0805300772190094, 0.16170109808444977, 0.1514970064163208, -0.08458569645881653, 0.04867082089185715, 0.08386710286140442, -0.03652823343873024, -0.036985062062740326, 0.06791453808546066, 0.05327065661549568, -0.17166964709758759, 0.04885312169790268, -0.008832509629428387, 0.1464223563671112, -0.03925495222210884, -0.04093072563409805, -0.006643296219408512, -0.05381792411208153, 0.0691559910774231, -0.011064846999943256, 0.08444885909557343, 0.008058233186602592, -0.03708968684077263, 0.10004451870918274, 0.13638536632061005, -0.12526825070381165, -0.0064178104512393475, -0.1121286079287529, 0.0894700363278389, -0.09734044224023819, 0.025917084887623787, -0.19431473314762115, -0.046134404838085175, -0.015173333697021008, 0.028875162824988365, 0.0682443305850029, -0.006542902905493975, 0.006266770884394646, -0.0009116624132730067, 0.013765418902039528, 0.004295320250093937, 0.004424958024173975, -0.020287955179810524, -0.03353036940097809, -0.08435585349798203, -0.05483469367027283, -0.040497247129678726, 0.07663778960704803, -0.17543925344944, 0.004053147044032812, 0.04662472754716873, 0.06030837446451187, 0.0154345091432333, 0.02114376425743103, 0.03650931641459465, 0.05976090952754021, -0.04915379732847214, -0.016418440267443657, 0.05283936485648155, 0.007030125707387924, -0.12435093522071838, 0.0072959293611347675, -0.08435618877410889, 0.05037618800997734, 0.1294691115617752, -0.14035363495349884, -0.07206306606531143, -0.030369795858860016, -0.020620523020625114, -0.017418358474969864, 0.015024960041046143, -0.026672478765249252, 0.19297149777412415, 0.004619013983756304, 0.17782364785671234, -0.0875970870256424, -0.023878654465079308, -0.035892002284526825, -0.01992938667535782, 0.038965240120887756, 0.12645477056503296, 0.0896444171667099, -0.19380724430084229, 0.057382069528102875, 0.06320872157812119, -0.021837064996361732, 0.1733696311712265, -0.0541255958378315, -0.04692446067929268, -0.004577385261654854, 0.0754409208893776, -0.010067467577755451, 0.14957746863365173, -0.1826048493385315, -0.019501011818647385, 0.01575937494635582, -0.02278163470327854, 0.09701230376958847, -0.11409316956996918, -0.003626129124313593, 0.03742143139243126, -0.02127113565802574, -0.17830781638622284, 0.035465266555547714, 0.020124640315771103, 0.03074602037668228, 0.0026577613316476345, -0.01352281030267477, 0.02794647589325905, -0.021136874333024025, -0.12572194635868073, 0.2491578757762909, -0.08376973122358322, -0.2632632255554199, -0.1874053031206131, -0.021319737657904625, -0.00724594434723258, -0.026840712875127792, 0.04753633216023445, -0.05369872972369194, -0.053751517087221146, -0.006508091930299997, 0.18552179634571075, -0.07967925816774368, -0.028225228190422058, -0.05319402739405632, 0.059541091322898865, 0.007007195148617029, -0.1842992901802063, 0.005940225441008806, 0.012327049858868122, 0.02335699088871479, 0.02222287282347679, -0.1712690144777298, 0.09472005069255829, 0.10449052602052689, -0.06062930077314377, 0.03250214084982872, -0.04849730804562569, 0.25012338161468506, -0.07301994413137436, -0.10496613383293152, 0.12341627478599548, -0.11269205063581467, 0.011493582278490067, 0.027205221354961395, 0.011213591322302818, -0.1225871816277504, 0.037817928940057755, -0.03910817578434944, -0.06773950904607773, -0.24173538386821747, -0.11996512860059738, 
-0.085337795317173, 0.10551639646291733, 0.07826950401067734, 0.023556429892778397, -0.08220475167036057, 0.07234030961990356, 0.05987526848912239, 0.12845364212989807, -0.003772375173866749, 0.0844821035861969, 0.08061183243989944, 0.015598076395690441, 0.002655432326719165, -0.10467396676540375, -0.05649431422352791, 0.033910930156707764, 0.0759735181927681, 0.19486801326274872, 0.017399916425347328, 0.14504337310791016, 0.04781779646873474, 0.0448528416454792, 0.04021444171667099, 0.1881457418203354, -0.09904874116182327, 0.027088135480880737, 0.004270048812031746, -0.024826165288686752, -0.13679499924182892, 0.019532954320311546, -0.013281641528010368, 0.0327470600605011, -0.14675423502922058, -0.07796474546194077, 0.0452897772192955, 0.09102129191160202, 0.012632477097213268, -0.2618206739425659, -0.121847003698349, 0.027405355125665665, -0.1012464314699173, -0.06588808447122574, 0.047864459455013275, 0.05661370977759361, -0.1505529135465622, 0.03073257952928543, -0.06570316106081009, 0.16508564352989197, -0.0706694945693016, 0.005766936112195253, -0.07534697651863098, -0.05815184488892555, -0.0035223239101469517, 0.14366312325000763, -0.17939773201942444, 0.22710204124450684, 0.0024475974496454, 0.01675938256084919, -0.05163611099123955, 0.0336214080452919, 0.00799607764929533, 0.09508217126131058, 0.09882908314466476, -0.020261544734239578, -0.023806411772966385, -0.1755320280790329, 0.004512449260801077, 0.08648830652236938, 0.05450724810361862, -0.024127047508955002, 0.08684056252241135, -0.04377390444278717, 0.03425481542944908, -0.01426686067134142, -0.10073710232973099, -0.0513342022895813, -0.1247512698173523, -0.011526756919920444, -0.08892286568880081, 0.07278735190629959, -0.022165371105074883, 0.010188303887844086, 0.0831073448061943, 0.18051114678382874, -0.06312538683414459, -0.08290383219718933, -0.11326812952756882, 0.06050252541899681, 0.11777324974536896, -0.08118961751461029, 0.04951071739196777, -0.007045508828014135, 0.00813821330666542, -0.0011113190557807684, -0.17356427013874054, 0.06602536141872406, -0.059750065207481384, 0.026008568704128265, -0.01330669317394495, 0.10638748109340668, -0.01514535490423441, 0.013344939798116684, 0.05369960889220238, -0.05262547731399536, -0.06104636192321777, -0.12308359146118164, -0.1400429755449295, -0.05781443044543266, 0.012929280288517475, 0.09109502285718918, -0.11951463669538498, 0.008990220725536346, -0.0377374030649662, 0.007569184992462397, 0.2442476749420166, 0.11543837189674377, -0.032934751361608505, 0.015565309673547745, 0.07849358767271042, -0.09029877930879593, -0.24147500097751617, -0.006027544848620892, -0.020900998264551163, 0.06906525045633316, 0.013295403681695461, -0.1768987476825714, 0.10133776813745499, -0.0061058844439685345, 0.04760036617517471, 0.044040363281965256, -0.28890058398246765, -0.09597942233085632, 0.1322752982378006, 0.1284269243478775, 0.09071438759565353, -0.12671618163585663, -0.03869784623384476, -0.0839013159275055, -0.18745283782482147, 0.16773207485675812, -0.09316420555114746, 0.0995231494307518, -0.0010118476347997785, 0.10928241908550262, 0.03057202510535717, -0.03671339899301529, 0.1119920089840889, 0.0005518021062016487, 0.06948105990886688, -0.016006460413336754, -0.09090334177017212, 0.0968228280544281, -0.019914543256163597, 0.13549986481666565, -0.09641063958406448, 0.07187813520431519, -0.25892916321754456, -0.03866172581911087, -0.022274304181337357, 0.04808279499411583, -0.006936606485396624, -0.05985216051340103, -0.07422732561826706, 0.0006161804194562137, 
0.043333832174539566, 0.015103396959602833, 0.11387009173631668, -0.0314040333032608, 0.034626513719558716, 0.05472413823008537, 0.13754375278949738, 0.004887157119810581, -0.1016491949558258, 0.048240143805742264, 0.01841137371957302, 0.1027987077832222, -0.24218516051769257, 0.07421375066041946, 0.12528111040592194, 0.050972938537597656, 0.10566013306379318, 0.08031799644231796, -0.03306876868009567, 0.04778425768017769, 0.09015640616416931, -0.12023556977510452, -0.09109016507863998, -0.03429708629846573, -0.04842875525355339, -0.022041091695427895, 0.06023596599698067, 0.14939413964748383, -0.05578511953353882, -0.022857677191495895, 0.004581274464726448, -0.035670481622219086, -0.14026644825935364, 0.13850167393684387, 0.034027792513370514, 0.06344247609376907, -0.09904337674379349, 0.05818779766559601, 0.05430431291460991, -0.12186232954263687, -0.03194190189242363, 0.11109478026628494, -0.13122810423374176, -0.0712808147072792, -0.021015189588069916, 0.20672917366027832, -0.1298903375864029, -0.05939750000834465, -0.10993314534425735, -0.06593629717826843, -0.013598504476249218, 0.2245892435312271, 0.10417208820581436, 0.08341331779956818, -0.05393289029598236, -0.011439240537583828, -0.11303990334272385, 0.03956504538655281, 0.10839527100324631, 0.010565546341240406, -0.09965001791715622, 0.103291355073452, 0.010540535673499107, 0.1300683617591858, -0.055073611438274384, -0.04934564232826233, -0.19019827246665955, 0.08822666853666306, -0.12401113659143448, 0.05795431509613991, -0.06799498945474625, 0.03683539479970932, 0.020310508087277412, 0.008551478385925293, -0.03148127347230911, 0.03134550154209137, -0.10205160081386566, 0.012378858402371407, 0.01258348673582077, 0.04073790833353996, -0.06195206940174103, 0.006533414591103792, 0.08948346972465515, -0.06827909499406815, 0.09641249477863312, 0.067596934735775, -0.04148447886109352, 0.13509239256381989, -0.17320875823497772, -0.045531321316957474, 0.04620177671313286, 0.019546983763575554, 0.05175615847110748, -0.04934793710708618, 0.044270798563957214, 0.006312513258308172, 0.05011676996946335, 0.010588791221380234, 0.10599978268146515, -0.12965014576911926, -0.10282568633556366, -0.04174760356545448, -0.10291401296854019, -0.03103168122470379, 0.01703834906220436, 0.006261806003749371, 0.08486995846033096, 0.12143983691930771, -0.029914304614067078, 0.04621277004480362, -0.06165456026792526, -0.026887070387601852, 0.018320202827453613, -0.0783783569931984, -0.03651583194732666, -0.0900096744298935, 0.02068103663623333, -0.04926157370209694, 0.1784042865037918, -0.018963752314448357, 0.13737033307552338, -0.006873725447803736, -0.017065400257706642, 0.06523091346025467, 0.04243475943803787, 0.26935556530952454, 0.0004803294432349503, 0.053976014256477356, -0.061314091086387634, 0.06323787569999695, 0.030852988362312317, 0.06958266347646713, 0.07888567447662354, 0.09836981445550919, -0.06759703904390335, 0.09070544689893723, 0.011735904030501842, 0.052228350192308426, -0.06338207423686981, -0.13303619623184204, 0.07450558245182037, 0.05836762115359306, -0.023491835221648216, 0.12429329752922058, 0.12008589506149292, -0.07156959176063538, 0.0875006914138794, 0.026261623948812485, -0.10057538002729416, -0.06023230776190758, 0.010142071172595024, -0.03854146599769592, -0.14936932921409607, 0.017069857567548752, -0.11235924810171127, -0.06877784430980682, 0.09101696312427521, 0.040867798030376434, -0.04140205308794975, 0.19975696504116058, -0.0035017330665141344, -0.09362971782684326, 0.044159941375255585, -0.011056154035031796, 
-0.008124743588268757, -0.004621696658432484, 0.06805405020713806, -0.02507658302783966, -0.031224019825458527, 0.016604207456111908, 0.04407299682497978, -0.06024627760052681, 0.029014304280281067, -0.0834360420703888, -0.032170966267585754, -0.04420490562915802, 0.06488708406686783, -0.0137086883187294, 0.06320251524448395, 0.02043900080025196, -0.03814668953418732, -0.02997918613255024, 0.21199791133403778, -0.042288873344659805, -0.08232685178518295, -0.15420730412006378, 0.2586973309516907, 0.012401950545608997, 0.05367332696914673, 0.0040296344086527824, -0.06351668387651443, -0.03483392670750618, 0.2828465700149536, 0.20360347628593445, -0.02398342825472355, 0.003981270361691713, -0.0021211616694927216, 0.018996773287653923, 0.005739368498325348, 0.14716868102550507, 0.032085321843624115, 0.22922013700008392, -0.03202621266245842, -0.07951317727565765, -0.049493011087179184, -0.046059370040893555, 0.004931811708956957, 0.09212339669466019, 0.03104843571782112, -0.05976893752813339, -0.05005800724029541, 0.12649843096733093, -0.17238354682922363, -0.09293278306722641, 0.019724639132618904, -0.1341848224401474, -0.0879984200000763, -0.07960796356201172, 0.0004695173993241042, -0.00675207981839776, 0.05912265181541443, -0.04915475472807884, -0.030734507367014885, 0.03470689430832863, 0.038533903658390045, -0.17329087853431702, -0.12782210111618042, 0.0919046401977539, -0.02243560552597046, 0.1275320053100586, -0.018699005246162415, 0.09960302710533142, 0.09278883785009384, 0.03922632709145546, -0.013473722152411938, 0.001996937207877636, 0.07286128401756287, 0.019155075773596764, 0.06017227843403816, 0.06770287454128265, -0.03759608045220375, 0.15057027339935303, -0.052972469478845596, -0.09899723529815674, 0.03345179557800293, -0.017214007675647736, 0.015420802868902683, -0.09657347202301025, -0.0441599078476429, -0.09929464757442474, 0.08386899530887604, 0.1964253932237625, -0.05659019574522972, 0.02232116274535656, -0.08100966364145279, 0.10818936675786972, 0.03005637787282467, -0.03967338800430298, -0.08678079396486282, -0.15878920257091522, -0.05021089315414429, -0.01930675283074379, -0.038411352783441544, -0.23377539217472076, 0.008615624159574509, -0.06250525265932083, 0.007294429931789637, -0.05177873373031616, 0.14214551448822021, 0.11666188389062881, 0.02086796797811985, -0.030802303925156593, -0.1490253359079361, -0.020312242209911346, 0.07490073889493942, -0.11771131306886673, -0.1360330432653427 ]
null
null
transformers
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

## Intended uses & limitations

The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_multitask", skip_special_tokens=True),
    device=0
)

tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/python/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results:

| Language / Model      | Python    | SQL       | C#        |
| --------------------- | :-------: | :-------: | :-------: |
| CodeTrans-ST-Small    | 8.45      | 17.55     | 19.74     |
| CodeTrans-ST-Base     | 9.12      | 15.00     | 18.65     |
| CodeTrans-TF-Small    | 10.06     | 17.71     | 20.40     |
| CodeTrans-TF-Base     | 10.94     | 17.66     | 21.12     |
| CodeTrans-TF-Large    | 12.41     | 18.40     | 21.43     |
| CodeTrans-MT-Small    | 13.11     | 19.15     | 22.39     |
| CodeTrans-MT-Base     | **13.37** | 19.24     | 23.20     |
| CodeTrans-MT-Large    | 13.24     | 19.40     | **23.57** |
| CodeTrans-MT-TF-Small | 12.10     | 18.25     | 22.03     |
| CodeTrans-MT-TF-Base  | 10.64     | 16.91     | 21.40     |
| CodeTrans-MT-TF-Large | 12.14     | **19.98** | 21.10     |
| CODE-NN               | --        | 18.40     | 20.50     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
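Since the pipeline above works best on the pre-tokenized, space-separated style shown in `tokenized_code`, here is a minimal sketch of one way to produce that format with Python's standard-library tokenizer. The card does not document the authors' exact preprocessing, so the helper name `tokenize_python` and its filtering choices are assumptions for illustration only:

```python
import tokenize
from io import BytesIO

def tokenize_python(source: str) -> str:
    # Assumed helper: space-joins the lexical tokens of a python snippet,
    # dropping layout-only tokens, to mimic the input style shown in the card.
    skip = {tokenize.ENCODING, tokenize.ENDMARKER, tokenize.NEWLINE,
            tokenize.NL, tokenize.INDENT, tokenize.DEDENT}
    tokens = [tok.string
              for tok in tokenize.tokenize(BytesIO(source.encode("utf-8")).readline)
              if tok.type not in skip]
    return " ".join(tokens)

print(tokenize_python("def add(a, b):\n    return a + b\n"))
# -> def add ( a , b ) : return a + b
```

The resulting string can then be passed to `pipeline([...])` just like the hand-tokenized example above.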
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_python_multitask
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization python
====================================================

Pretrained model on programming language python using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results:

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 61, 146 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 300,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.12343768775463104, 0.01144791953265667, -0.0012199233751744032, 0.13261960446834564, 0.11747273802757263, 0.01629338599741459, 0.06371849775314331, 0.052920226007699966, -0.04415426030755043, 0.0196013655513525, 0.04768093302845955, 0.025024909526109695, 0.04612921550869942, 0.19651909172534943, 0.0215471014380455, -0.16717791557312012, -0.017291858792304993, 0.02422225847840309, -0.07888845354318619, 0.12797832489013672, 0.09347483515739441, -0.06316622346639633, 0.04689131677150726, -0.04502507671713829, -0.2095458060503006, 0.059911057353019714, 0.0033407509326934814, -0.06180572882294655, 0.10444369912147522, 0.051773637533187866, 0.13781794905662537, -0.02366621606051922, 0.038929954171180725, -0.13099785149097443, 0.005601141601800919, 0.02668175846338272, 0.038857217878103256, 0.018731512129306793, 0.06731443852186203, 0.025866946205496788, 0.16385196149349213, -0.005190201103687286, 0.05856749415397644, 0.05947386100888252, -0.06894850730895996, -0.12331994622945786, -0.02295844629406929, 0.025924768298864365, 0.05417794734239578, 0.10059081763029099, -0.013563248328864574, 0.11226966977119446, -0.14111119508743286, 0.12820415198802948, 0.08056722581386566, -0.2441689819097519, -0.012577028013765812, 0.09153218567371368, 0.05378005653619766, 0.062320902943611145, -0.03882816433906555, -0.061556603759527206, 0.08186125755310059, 0.06064514070749283, 0.040330395102500916, -0.0842926874756813, -0.08006319403648376, 0.009367162361741066, -0.1034175232052803, -0.07752886414527893, 0.22139933705329895, -0.002751864492893219, -0.0859595462679863, -0.05302063375711441, -0.043277934193611145, -0.11505722254514694, 0.03159888833761215, 0.05332907289266586, -0.004393770359456539, -0.019816379994153976, -0.003734936937689781, 0.03868415951728821, -0.08586873859167099, -0.1284300535917282, 0.016824113205075264, 0.10054504126310349, 0.05698689818382263, 0.03703498840332031, -0.07616425305604935, 0.11742965877056122, 0.02825879119336605, -0.0411340594291687, -0.005755747202783823, -0.02315298654139042, -0.11193680763244629, 0.03468891978263855, -0.04886684566736221, -0.18298542499542236, -0.009480079635977745, 0.027512751519680023, -0.030442919582128525, 0.057122547179460526, 0.039285119622945786, 0.021613039076328278, 0.038391128182411194, 0.18447838723659515, 0.026811694726347923, -0.10227518528699875, 0.05519814044237137, 0.04076283797621727, -0.03825300931930542, -0.013979190960526466, -0.049874961376190186, -0.0718049556016922, 0.06835439056158066, 0.09946811944246292, -0.11674278229475021, 0.04965992644429207, -0.06457992643117905, -0.045649752020835876, -0.04640284925699234, -0.16731132566928864, -0.001176199410110712, 0.018238119781017303, -0.06538434326648712, -0.021776223555207253, 0.09824240207672119, -0.16190984845161438, -0.15406222641468048, -0.008224867284297943, -0.08252891153097153, -0.04289266839623451, -0.15357525646686554, -0.1534489393234253, -0.01928635500371456, -0.019620023667812347, 0.010353345423936844, -0.07657819241285324, -0.16043734550476074, -0.009649884887039661, 0.03561076149344444, 0.009627759456634521, -0.006146022584289312, -0.0712260976433754, -0.0046920981258153915, -0.014833921566605568, -0.03060104511678219, -0.0014594802632927895, -0.04792429506778717, 0.12237732112407684, 0.10120248049497604, 0.027965374290943146, -0.029039934277534485, 0.04880340397357941, -0.07879757881164551, 0.06983363628387451, -0.09829892963171005, 0.10104921460151672, -0.06909917294979095, 0.08579019457101822, -0.04378115013241768, -0.1114848181605339, 0.05034675449132919, 
0.05053596943616867, 0.06761261075735092, 0.054343607276678085, -0.1485883593559265, -0.03050336055457592, 0.19484029710292816, -0.12501509487628937, -0.10112851858139038, 0.11114047467708588, -0.04558124020695686, 0.06054893881082535, 0.09560850262641907, 0.12665143609046936, 0.14574278891086578, -0.032416924834251404, 0.00479460321366787, 0.05927035212516785, 0.04560590162873268, -0.09233582764863968, 0.08132495731115341, 0.04045030474662781, -0.10670669376850128, 0.06021810322999954, -0.005158680025488138, 0.1123606339097023, -0.011577446013689041, -0.031042972579598427, -0.04714987054467201, -0.06541265547275543, 0.014840449206531048, -0.002486167009919882, 0.05916493013501167, -0.07117968052625656, -0.08268533647060394, 0.08051817119121552, 0.17533841729164124, -0.12492044270038605, -0.01251843012869358, -0.0895569920539856, 0.05541297420859337, -0.06693749874830246, 0.01581784524023533, -0.16453684866428375, 0.01688024215400219, 0.0591111034154892, -0.023958083242177963, 0.06930838525295258, 0.10630718618631363, 0.013990831561386585, 0.057375453412532806, 0.011615184135735035, -0.006872185505926609, -0.10080600529909134, -0.06012040749192238, -0.06327296793460846, -0.052282240241765976, -0.09689342975616455, -0.04341728240251541, 0.01021505519747734, -0.1891573667526245, 0.013442798517644405, 0.013993547298014164, 0.018931346014142036, 0.008492475375533104, -0.021989937871694565, 0.022001231089234352, 0.06781939417123795, -0.058700017631053925, -0.039805758744478226, 0.021578656509518623, 0.016215592622756958, -0.056566931307315826, -0.07331068813800812, -0.10912033915519714, -0.0037841079756617546, 0.11506246030330658, 0.04840727522969246, -0.0814593955874443, 0.033476684242486954, -0.01490732654929161, -0.03382796794176102, 0.015274951234459877, -0.06535608321428299, 0.1701996624469757, 0.0029089341405779123, 0.203534796833992, -0.1468723565340042, -0.023298148065805435, -0.02639380656182766, 0.016472943127155304, 0.05802205950021744, 0.12432514876127243, 0.011229132302105427, -0.10860484838485718, 0.061203956604003906, -0.0024185250513255596, -0.07366777956485748, 0.21439138054847717, -0.04805680736899376, -0.09237056225538254, 0.020944083109498024, 0.11030350625514984, -0.015528291463851929, 0.15494802594184875, -0.16370107233524323, -0.01751014031469822, 0.010184075683355331, 0.007745887152850628, 0.06591292470693588, -0.131311297416687, 0.009460401721298695, 0.026345744729042053, -0.060142580419778824, -0.12411965429782867, -0.029104452580213547, -0.00012749060988426208, 0.036521848291158676, -0.0014715155120939016, -0.009355626069009304, 0.009736438281834126, -0.03615014627575874, -0.10567151010036469, 0.2274092733860016, -0.10819869488477707, -0.2386627197265625, -0.20413190126419067, 0.08506064862012863, -0.047694675624370575, -0.011746814474463463, 0.019808201119303703, -0.08445990085601807, -0.06456906348466873, -0.05262273922562599, 0.1986537128686905, -0.0996025949716568, -0.01274666003882885, -0.02566009946167469, 0.06142982840538025, 0.026023661717772484, -0.20930363237857819, 0.038962800055742264, -0.00453969044610858, -0.027250414714217186, -0.0004734841641038656, -0.09912922233343124, 0.07139886915683746, 0.15248647332191467, -0.06827298551797867, 0.016005726531147957, -0.003993200603872538, 0.21205392479896545, -0.03219674900174141, -0.07332804054021835, 0.12540629506111145, -0.013222318142652512, 0.012007620185613632, 0.01923215202987194, 0.0006977325538173318, -0.09589515626430511, 0.05360795557498932, 0.0027085018809884787, -0.03096184879541397, 
-0.2587586045265198, -0.03315930813550949, -0.08535810559988022, 0.059552572667598724, 0.05960465222597122, 0.04357733577489853, -0.10232522338628769, 0.034413304179906845, 0.03894772380590439, 0.1377677023410797, -0.010968279093503952, 0.050468482077121735, 0.06399615854024887, -0.003150769742205739, 0.012641914188861847, -0.0958797037601471, -0.008740118704736233, 0.0786978080868721, 0.09422097355127335, 0.27005791664123535, -0.09661602228879929, 0.18449945747852325, 0.032341863960027695, 0.06764672696590424, 0.04550957307219505, 0.16231006383895874, -0.11575204879045486, 0.036128588020801544, 0.005749273579567671, -0.011277667246758938, -0.13013409078121185, 0.016323542222380638, -0.03115662932395935, 0.07148610055446625, -0.11353062838315964, -0.04376133531332016, 0.0014303438365459442, 0.1654406040906906, 0.046141233295202255, -0.229924738407135, -0.12032559514045715, 0.02112646959722042, -0.1128498837351799, -0.10606196522712708, 0.05898118391633034, 0.2034154087305069, -0.08918245136737823, -0.012929264456033707, -0.028247641399502754, 0.12956157326698303, -0.04707014560699463, -0.031119124963879585, -0.05058310925960541, 0.062413182109594345, 0.010755088180303574, 0.13224352896213531, -0.22946199774742126, 0.13867558538913727, -0.006294545251876116, 0.06633021682500839, -0.03400599583983421, 0.07010070979595184, -0.036439232528209686, 0.04828617721796036, 0.04183530807495117, -0.0077719734981656075, -0.002441712422296405, -0.18414470553398132, -0.00792422704398632, 0.03627690672874451, 0.040526602417230606, 0.04421776905655861, 0.08137316256761551, -0.016493262723088264, 0.05189884081482887, -0.011958098039031029, -0.14264260232448578, -0.0559019111096859, -0.10850385576486588, -0.012193972244858742, -0.0637383684515953, -0.021570948883891106, -0.05529657006263733, -0.010796811431646347, 0.09407988935709, 0.1892557144165039, -0.10763746500015259, -0.09756273031234741, -0.08589756488800049, 0.06791190803050995, 0.135850727558136, -0.07307381182909012, 0.06233982741832733, -0.006647773087024689, 0.028874225914478302, 0.0006765873404219747, -0.10118883103132248, 0.04970591142773628, -0.03350978344678879, -0.050643209367990494, -0.016473036259412766, 0.09967073798179626, 0.001954741310328245, 0.028403757140040398, -0.0006765258149243891, -0.07340343296527863, -0.04659035801887512, -0.11758916079998016, -0.13609927892684937, -0.0440966859459877, -0.0034116823226213455, 0.07002614438533783, -0.1245027706027031, -0.04347310960292816, -0.017798082903027534, -0.015407050028443336, 0.14744551479816437, 0.14765983819961548, -0.06317658722400665, 0.02959834225475788, 0.10373316705226898, -0.0547885000705719, -0.17586541175842285, 0.007028260733932257, 0.06435294449329376, 0.11256930232048035, -0.036695532500743866, -0.19925685226917267, 0.05481153354048729, 0.02682565525174141, 0.04285771772265434, 0.06312764436006546, -0.3299003839492798, -0.11673631519079208, 0.07290583848953247, 0.13143494725227356, 0.06799415498971939, -0.1108294427394867, -0.03297976404428482, -0.05209064111113548, -0.11127923429012299, 0.10037798434495926, -0.014466577209532261, 0.13626888394355774, -0.040261100977659225, 0.0716550350189209, 0.03908190503716469, -0.04238488897681236, 0.07495953887701035, 0.008811946958303452, 0.09706763923168182, -0.03394550457596779, 0.022639766335487366, 0.10993873327970505, -0.026730719953775406, 0.17448650300502777, -0.14213211834430695, 0.10452350229024887, -0.2410377562046051, -0.06767332553863525, -0.06595595180988312, 0.01373207662254572, -0.032152846455574036, 
-0.04586805775761604, -0.07472874969244003, 0.021635815501213074, 0.004891241434961557, -0.00895626563578844, 0.02224547788500786, -0.02418653480708599, -0.010676223784685135, 0.0769769698381424, 0.1028636246919632, 0.00331358821131289, -0.11136786639690399, 0.04624045267701149, 0.05060088261961937, 0.09165364503860474, -0.20343613624572754, 0.023556670174002647, 0.11530052125453949, 0.033807918429374695, 0.11433914303779602, 0.049288634210824966, -0.10557975620031357, 0.04403122887015343, 0.09114545583724976, -0.0834721252322197, -0.0963229089975357, -0.011619419790804386, -0.04124528914690018, -0.08472545444965363, 0.044729799032211304, 0.09912103414535522, -0.04525820538401604, -0.025104664266109467, -0.02237224206328392, -0.02625795267522335, -0.11492030322551727, 0.2028103768825531, 0.05122952163219452, 0.08240767568349838, -0.07219333201646805, 0.061525579541921616, 0.07967956364154816, -0.07411620765924454, 0.006452403962612152, 0.1903316080570221, -0.10475493967533112, -0.040134914219379425, 0.03166210278868675, 0.14716343581676483, -0.03427138924598694, -0.0390128530561924, -0.1127464547753334, -0.07504500448703766, 0.02454446442425251, 0.1480976641178131, 0.08268775790929794, 0.10411231964826584, -0.057536400854587555, -0.001332843559794128, -0.09506847709417343, 0.07472818344831467, 0.08774611353874207, 0.03068031184375286, -0.12223747372627258, 0.15596599876880646, 0.0443539023399353, 0.100040502846241, -0.026186447590589523, -0.018284320831298828, -0.11227668076753616, 0.061737146228551865, -0.0873374417424202, 0.02643931657075882, -0.017923671752214432, 0.05101744458079338, -0.013497134670615196, 0.003328439313918352, -0.026115618646144867, 0.05975426733493805, -0.08908465504646301, -0.008542136289179325, -0.0018391730263829231, 0.026697302237153053, -0.041661590337753296, -0.003942844457924366, 0.039322737604379654, -0.09545408189296722, 0.11690376698970795, -0.002652791328728199, -0.024346301332116127, 0.09454968571662903, -0.03942868486046791, 0.029877575114369392, 0.010671097785234451, 0.05302508547902107, 0.010883847251534462, 0.030447032302618027, 0.0791943147778511, 0.023395061492919922, 0.06042547896504402, 0.015618319623172283, 0.10296210646629333, -0.14143280684947968, -0.1059628501534462, -0.03450034558773041, -0.10318407416343689, -0.05797765031456947, 0.09221424907445908, 0.04318518564105034, 0.09825088828802109, 0.10791222751140594, -0.03183409199118614, 0.02432754635810852, -0.13108552992343903, -0.06204451248049736, 0.017703784629702568, -0.030663415789604187, -0.06752203404903412, -0.054310142993927, 0.04039160907268524, -0.02757967635989189, 0.12230459600687027, 0.004094072617590427, 0.044416267424821854, -0.024657633155584335, -0.05159377679228783, 0.04023134708404541, 0.02249709889292717, 0.23608338832855225, -0.06263002008199692, 0.036392997950315475, -0.011375202797353268, 0.004790153354406357, 0.0032253246754407883, 0.11637160927057266, 0.10626565665006638, 0.14204156398773193, -0.036312710493803024, 0.08709607273340225, 0.01678333804011345, -0.007964937016367912, -0.10492852330207825, -0.010619601234793663, 0.017253810539841652, 0.07064903527498245, -0.05632956326007843, 0.18330025672912598, 0.076386459171772, -0.09546780586242676, 0.10645972192287445, 0.03093419410288334, -0.12925860285758972, -0.045011915266513824, 0.006439186632633209, -0.021136540919542313, -0.1361444592475891, 0.02861122414469719, -0.12383510917425156, -0.017317356541752815, 0.08449897915124893, 0.05311792716383934, -0.06503715366125107, 0.1737448126077652, 0.031968459486961365, 
-0.07450620830059052, 0.047163136303424835, 0.0074403611943125725, 0.024860557168722153, 0.03759557753801346, 0.021507011726498604, 0.026279764249920845, -0.03035951592028141, 0.05039772391319275, 0.018876399844884872, -0.0356057845056057, 0.0037949583493173122, -0.020004667341709137, 0.0013533257879316807, -0.02707773819565773, 0.025075992569327354, 0.044214483350515366, 0.1699836701154709, 0.029397055506706238, -0.08229060471057892, -0.044297024607658386, 0.1746893972158432, -0.042805176228284836, -0.08498319983482361, -0.1376015543937683, 0.19746053218841553, 0.027100779116153717, 0.020763112232089043, 0.020860770717263222, -0.09058846533298492, -0.049385957419872284, 0.19690543413162231, 0.08361057937145233, -0.010541117750108242, -0.02683970145881176, -0.008075849153101444, -0.005280429031699896, -0.05005645379424095, 0.20309706032276154, 0.02353372983634472, 0.2575187683105469, 0.007001836784183979, -0.0047205728478729725, -0.06494263559579849, -0.027745142579078674, -0.017919721081852913, 0.1273949295282364, -0.0425361730158329, -0.026095323264598846, -0.08426100015640259, 0.02006635256111622, 0.00024269166169688106, -0.08488184213638306, 0.06760555505752563, -0.12037889659404755, -0.10136651247739792, -0.03785407170653343, 0.02328154817223549, -0.039401281625032425, 0.03186362609267235, -0.03670506551861763, 0.03852907195687294, 0.056491024792194366, -0.01922120526432991, -0.12093843519687653, -0.16792061924934387, 0.10727531462907791, -0.03973265737295151, 0.14805084466934204, -0.008452480658888817, 0.12088818103075027, 0.09294296056032181, 0.0351969338953495, -0.049503546208143234, 0.08873483538627625, 0.03977806493639946, 0.05063780024647713, 0.04387197270989418, 0.12259458005428314, -0.047633130103349686, 0.1634155809879303, -0.058232810348272324, -0.026883915066719055, -0.016868356615304947, -0.06602828949689865, 0.0016354280523955822, -0.15332408249378204, -0.0164404958486557, -0.1044197827577591, 0.09631148725748062, 0.2085280418395996, -0.05024784430861473, -0.022375011816620827, -0.10042183101177216, 0.09191350638866425, -0.009978803806006908, 0.047869130969047546, -0.038798291236162186, -0.19250047206878662, -0.013605792075395584, -0.015032989904284477, 0.00414589699357748, -0.2668215334415436, -0.007351201958954334, -0.03727870061993599, -0.020816069096326828, -0.07756517827510834, 0.16476692259311676, 0.08163473755121231, 0.04271557554602623, -0.03482336923480034, -0.1339467465877533, -0.040068306028842926, 0.06061144918203354, -0.12749643623828888, -0.12541352212429047 ]
null
null
transformers
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the python code snippets.

## Intended uses & limitations

The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_multitask_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/python/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 600 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results:

| Language / Model      | Python    | SQL       | C#        |
| --------------------- | :-------: | :-------: | :-------: |
| CodeTrans-ST-Small    | 8.45      | 17.55     | 19.74     |
| CodeTrans-ST-Base     | 9.12      | 15.00     | 18.65     |
| CodeTrans-TF-Small    | 10.06     | 17.71     | 20.40     |
| CodeTrans-TF-Base     | 10.94     | 17.66     | 21.12     |
| CodeTrans-TF-Large    | 12.41     | 18.40     | 21.43     |
| CodeTrans-MT-Small    | 13.11     | 19.15     | 22.39     |
| CodeTrans-MT-Base     | **13.37** | 19.24     | 23.20     |
| CodeTrans-MT-Large    | 13.24     | 19.40     | **23.57** |
| CodeTrans-MT-TF-Small | 12.10     | 18.25     | 22.03     |
| CodeTrans-MT-TF-Base  | 10.64     | 16.91     | 21.40     |
| CodeTrans-MT-TF-Large | 12.14     | **19.98** | 21.10     |
| CODE-NN               | --        | 18.40     | 20.50     |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
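As a rough guide to reproducing numbers of this kind, BLEU between generated and reference descriptions can be computed with the `sacrebleu` package. This is a hedged sketch: the exact tokenization and smoothing behind the table above are not specified in the card, and the strings below are hypothetical, so scores from this snippet will not match the reported results:

```python
# pip install sacrebleu
import sacrebleu

# Hypothetical model outputs and aligned references; the card does not
# publish its evaluation script, so these are illustrative placeholders.
predictions = ["writes every line from the input file to the output file"]
references = [["write all lines of the input file into the output file"]]

bleu = sacrebleu.corpus_bleu(predictions, references)
print(f"BLEU = {bleu.score:.2f}")
```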
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_python_multitask_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization python
====================================================

Pretrained model on programming language python using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the python code snippets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 600 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results:

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 600 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 600 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 61, 88, 111 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 600 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.08028615266084671, 0.06961280107498169, -0.0015499700093641877, 0.1085950955748558, 0.059421904385089874, 0.026313384994864464, 0.04440094903111458, 0.10102859139442444, -0.03902478516101837, 0.06172378361225128, 0.07167991250753403, -0.03207889944314957, 0.06333550065755844, 0.17213074862957, 0.029219089075922966, -0.18723788857460022, -0.02300548553466797, 0.022901657968759537, -0.04611820727586746, 0.10925756394863129, 0.09155498445034027, -0.07259109616279602, 0.06764138489961624, -0.036602526903152466, -0.1183759868144989, 0.061136312782764435, -0.041547391563653946, -0.04334279149770737, 0.09106127917766571, 0.05035511776804924, 0.12204130738973618, -0.03564726188778877, 0.07727444916963577, -0.20341446995735168, -0.002702678320929408, 0.021150140091776848, 0.055252064019441605, 0.02157667465507984, 0.06931361556053162, 0.056774578988552094, 0.13935084640979767, -0.03496941551566124, 0.05521368235349655, 0.050228532403707504, -0.06201653182506561, -0.08143197000026703, -0.04911935329437256, 0.05633531138300896, 0.11129038780927658, 0.09613924473524094, -0.016258087009191513, 0.020445911213755608, -0.08061637729406357, 0.09047608822584152, 0.11109638214111328, -0.21949322521686554, -0.02522495575249195, 0.09865061193704605, 0.08069609105587006, 0.05484256520867348, -0.07538338750600815, -0.04034389182925224, 0.10346867144107819, 0.041393693536520004, 0.040208496153354645, -0.08122493326663971, -0.049819402396678925, -0.010327043011784554, -0.062436316162347794, -0.054565176367759705, 0.14051535725593567, 0.034940917044878006, -0.06110429763793945, -0.08976008743047714, -0.047516804188489914, -0.17748989164829254, 0.04266991466283798, 0.03896799683570862, 0.012606709264218807, -0.013020931743085384, 0.01511982548981905, 0.004962163511663675, -0.09417425841093063, -0.09475676715373993, -0.003132143057882786, 0.07700121402740479, 0.07588838785886765, 0.027455564588308334, -0.017002975568175316, 0.08292663842439651, -0.011003326624631882, -0.05043874680995941, -0.025578904896974564, 0.01578085869550705, -0.1265503168106079, 0.018622294068336487, -0.0143956383690238, -0.0645441859960556, -0.01726086623966694, 0.10471557080745697, -0.061369962990283966, 0.08237748593091965, 0.12020307034254074, -0.0015551985707134008, 0.0029953094199299812, 0.2234732210636139, 0.046608712524175644, -0.13554856181144714, -0.002747782738879323, 0.022103523835539818, -0.001277013448998332, -0.0036381336394697428, -0.057420942932367325, -0.0314054973423481, 0.006350432988256216, 0.060888759791851044, -0.12082494795322418, 0.01795792020857334, -0.041263531893491745, -0.011175621300935745, 0.021658683195710182, -0.1324823796749115, 0.03816068172454834, 0.0075428299605846405, -0.05903546139597893, -0.03337272256612778, 0.06496556848287582, -0.1177508756518364, -0.11177677661180496, 0.05309634283185005, -0.05322982743382454, -0.028854047879576683, -0.12139920890331268, -0.1293255090713501, -0.018364176154136658, -0.029651347547769547, 0.020035840570926666, -0.10388685017824173, -0.1018698588013649, -0.004193904809653759, 0.040231674909591675, -0.005786443594843149, -0.01682625152170658, -0.04784899204969406, 0.013229550793766975, 0.00013211253099143505, -0.017259905114769936, 0.015410968102514744, -0.04329676553606987, 0.09536819905042648, 0.10801906138658524, 0.04974446818232536, 0.01675175130367279, 0.029249224811792374, -0.07105995714664459, 0.07005859911441803, -0.06051439419388771, 0.057485681027173996, -0.025619594380259514, 0.06557784229516983, -0.09470677375793457, -0.0909518301486969, 
0.05114841088652611, 0.048581331968307495, 0.05484962463378906, 0.021406913176178932, -0.10129488259553909, 0.01552466582506895, 0.1445867270231247, -0.09700723737478256, -0.12322178483009338, 0.11455575376749039, -0.004384100437164307, 0.0052016847766935825, 0.0668654665350914, 0.12625104188919067, 0.1422625631093979, -0.10583268105983734, -0.04729080572724342, 0.08818832784891129, 0.07582712918519974, -0.03209557756781578, 0.09956444054841995, 0.013903552666306496, 0.020690428093075752, 0.02206583507359028, 0.04348280653357506, 0.0708162859082222, -0.00574689544737339, -0.03403320163488388, -0.027340762317180634, -0.07849562913179398, -0.013443714939057827, -0.014971629716455936, 0.02228129468858242, -0.0729541927576065, -0.08218539506196976, 0.025144528597593307, 0.18533071875572205, -0.09711203724145889, 0.020113974809646606, -0.09687671065330505, -0.04527773708105087, -0.08956052362918854, 0.010549107566475868, -0.10439544171094894, 0.005299750249832869, 0.04276881366968155, -0.05899250879883766, 0.06868431717157364, 0.06267062574625015, 0.0013329234207049012, 0.03520514816045761, -0.039708759635686874, -0.04189135506749153, -0.054961204528808594, -0.05678143352270126, -0.1220807433128357, -0.007741875480860472, -0.09507059305906296, -0.02644355036318302, -0.04057757556438446, -0.1712110936641693, 0.016681330278515816, -0.02030624821782112, 0.02295597642660141, 0.00355402915738523, -0.02186003141105175, 0.025545932352542877, 0.05222626402974129, -0.060780420899391174, -0.08876083046197891, 0.004222327843308449, 0.015540258958935738, -0.10818438977003098, -0.07155665010213852, -0.12196396291255951, -0.05536118522286415, 0.07619524747133255, 0.09699632227420807, -0.08702525496482849, 0.03149578347802162, -0.020814789459109306, -0.05251898244023323, -0.049421049654483795, -0.0758574977517128, 0.17152269184589386, 0.009534899145364761, 0.17116303741931915, -0.1292838305234909, -0.04574144259095192, -0.034933190792798996, -0.015800224617123604, 0.018688896670937538, 0.1490887999534607, 0.012517892755568027, -0.09112725406885147, 0.04693308100104332, -0.015458946116268635, -0.06616093218326569, 0.15343520045280457, -0.003515923162922263, -0.09689147770404816, 0.01944217085838318, 0.10263735055923462, -0.008537806570529938, 0.14335553348064423, -0.07451192289590836, -0.001339766662567854, -0.003059468464925885, 0.02659258246421814, 0.04048866033554077, -0.13002909719944, 0.026747452095150948, 0.0670931413769722, -0.05795653536915779, -0.07791363447904587, -0.04014056548476219, -0.037352852523326874, 0.034116875380277634, 0.004499081056565046, -0.00034492829581722617, -0.015467279590666294, -0.020658519119024277, -0.0914907157421112, 0.21300570666790009, -0.08743873238563538, -0.21608322858810425, -0.1716868281364441, 0.0036502876318991184, -0.044596415013074875, -0.010524113662540913, 0.05079789459705353, -0.12238790094852448, -0.06624102592468262, -0.08478885143995285, 0.15064577758312225, -0.13991804420948029, 0.006843735463917255, -0.009528566151857376, 0.031077375635504723, 0.037810444831848145, -0.17671462893486023, 0.03458499163389206, -0.0015340327518060803, -0.010738184675574303, 0.008862252347171307, -0.06987552344799042, 0.08419321477413177, 0.1287447214126587, -0.08367081731557846, 0.010373169556260109, -0.006814795546233654, 0.1764499843120575, -0.05337226018309593, 0.011903482489287853, 0.18724122643470764, 0.016265684738755226, 0.040559399873018265, 0.044702090322971344, 0.016238445416092873, -0.09702911972999573, 0.06470286846160889, 0.05075892060995102, -0.03486736863851547, 
-0.24364587664604187, -0.010730071924626827, -0.07360711693763733, 0.05556708946824074, 0.1302729696035385, 0.04967782646417618, -0.1537533849477768, 0.030942482873797417, -0.00759883364662528, 0.13912607729434967, -0.037687040865421295, 0.06762012839317322, 0.018139613792300224, 0.016556764021515846, 0.00991600938141346, -0.09436620026826859, 0.002105120802298188, 0.07508154958486557, 0.11600759625434875, 0.2117440104484558, -0.05511832609772682, 0.1773308366537094, 0.0071700457483530045, 0.07347248494625092, 0.02712826244533062, 0.10861077159643173, -0.12363063544034958, 0.0006772862398065627, 0.008808252401649952, -0.0046269516460597515, -0.06265709549188614, 0.05057824030518532, -0.02187674678862095, 0.06446833908557892, -0.06873532384634018, 0.030570823699235916, 0.011838809587061405, 0.15719099342823029, 0.0741783156991005, -0.1924755871295929, -0.12002348154783249, 0.02079070545732975, -0.11674908548593521, -0.11499430984258652, 0.06421589851379395, 0.18197283148765564, -0.05609957128763199, 0.03030303306877613, -0.01975470781326294, 0.1363111138343811, -0.06505779922008514, -0.024785220623016357, 0.019675204530358315, 0.06656207889318466, 0.003564538899809122, 0.12814338505268097, -0.24449576437473297, 0.08172330260276794, 0.013622931204736233, 0.09470074623823166, -0.012709156610071659, 0.07403135299682617, -0.036277707666158676, 0.00689019775018096, 0.07026319950819016, -0.0007834500283934176, -0.07249461859464645, -0.20777975022792816, -0.05200981721282005, 0.02826131507754326, 0.06156373396515846, -0.011295858770608902, 0.1026579961180687, -0.02114749327301979, 0.06515020877122879, -0.029686208814382553, -0.13871076703071594, -0.04495001211762428, -0.14227208495140076, -0.02239970676600933, -0.021580278873443604, -0.028965286910533905, -0.02805003896355629, 0.021765144541859627, 0.00907634012401104, 0.22295859456062317, -0.16099263727664948, -0.10966241359710693, -0.09593908488750458, 0.08169513940811157, 0.1293148249387741, -0.09744664281606674, 0.02579215168952942, 0.018269242718815804, 0.05187578126788139, -0.03805060312151909, -0.06515292078256607, 0.02312387153506279, -0.05810651555657387, -0.06746511161327362, -0.024683039635419846, 0.09011146426200867, -0.02394246868789196, 0.052495796233415604, 0.001322053256444633, -0.07950273156166077, -0.05910642445087433, -0.1267692595720291, -0.09114297479391098, -0.027396056801080704, 0.01852891780436039, 0.012069818563759327, -0.0883062407374382, 0.08251331746578217, -0.02204909920692444, -0.08272436261177063, 0.08388791978359222, 0.1993684023618698, -0.06130845844745636, 0.007167130243033171, 0.08519589900970459, -0.05371445417404175, -0.15602342784404755, -0.0561654232442379, 0.05470675230026245, 0.08519488573074341, -0.016146454960107803, -0.16003619134426117, 0.0769154280424118, 0.04092613607645035, 0.03157607465982437, 0.018849367275834084, -0.292802631855011, -0.1231534406542778, 0.04436154291033745, 0.0701519325375557, 0.047152064740657806, -0.12005952000617981, -0.033558692783117294, -0.05780075117945671, -0.0807553380727768, 0.028544874861836433, 0.05973677337169647, 0.13319654762744904, -0.03659650310873985, 0.06134061515331268, 0.03341836854815483, -0.02334703877568245, 0.10515936464071274, -0.004952550865709782, 0.0899081900715828, -0.0156322680413723, 0.024358505383133888, 0.04936089739203453, -0.06123792380094528, 0.16934962570667267, -0.1775042861700058, 0.08575581759214401, -0.23258040845394135, -0.05202759802341461, -0.008477848954498768, -0.00933151412755251, -0.03885737061500549, -0.05987308546900749, 
-0.10373669117689133, -0.001310948864556849, 0.054543524980545044, -0.02623426541686058, 0.07183070480823517, -0.022471539676189423, -0.04680871590971947, 0.04155980050563812, 0.07998330146074295, -0.017305810004472733, -0.15827709436416626, 0.023161916062235832, 0.03316383808851242, 0.0817992314696312, -0.20925962924957275, 0.014602902345359325, 0.12228049337863922, 0.023222843185067177, 0.10575397312641144, 0.017747538164258003, -0.071573406457901, 0.04744825139641762, 0.0749388262629509, -0.03198001906275749, -0.10369744151830673, -0.0036368335131555796, -0.01137377880513668, -0.10631800442934036, 0.026572920382022858, 0.08985523134469986, -0.04995335265994072, -0.02192831225693226, -0.011063500307500362, 0.005564650986343622, -0.07181482762098312, 0.1981414407491684, 0.01830545999109745, 0.08628968149423599, -0.06181376427412033, 0.06996951252222061, 0.10170987993478775, -0.10422395914793015, 0.0073422458954155445, 0.16967830061912537, -0.07624033093452454, -0.020999619737267494, 0.05280014127492905, 0.08401388674974442, -0.06575073301792145, -0.04879363253712654, -0.09013137966394424, -0.07090796530246735, 0.010759809985756874, 0.009519068524241447, 0.06986191123723984, 0.06757424771785736, -0.038906823843717575, 0.018541784957051277, -0.10601214319467545, 0.09503171592950821, 0.07676465809345245, 0.05580941587686539, -0.14776137471199036, 0.14544597268104553, 0.03429817035794258, 0.08820667862892151, 0.003235016716644168, 0.03242292255163193, -0.09269486367702484, 0.04565401002764702, -0.031688887625932693, 0.034994788467884064, -0.006201187614351511, 0.05124399811029434, -0.015866011381149292, 0.02543996460735798, -0.030742840841412544, 0.04756475239992142, -0.04744724929332733, -0.033672649413347244, -0.024973580613732338, 0.02872709557414055, -0.04848567023873329, -0.013432685285806656, 0.016188396140933037, -0.08655775338411331, 0.09413858503103256, -0.05927155166864395, -0.0012975650606676936, 0.011666228994727135, 0.0010421439073979855, 0.05457935854792595, 0.02324013225734234, 0.05366988107562065, -0.00707498611882329, -0.006724767852574587, 0.04270702973008156, 0.008862514048814774, -0.0010101081570610404, -0.0037051152903586626, 0.04817861691117287, -0.14668229222297668, -0.09029459208250046, -0.09858015924692154, -0.07022811472415924, -0.06575541198253632, 0.07939886301755905, 0.0701598972082138, 0.0717574805021286, 0.10442249476909637, -0.03139805793762207, 0.008108130656182766, -0.13229846954345703, -0.04148763045668602, 0.0442282110452652, -0.011965116485953331, -0.08029557764530182, -0.047703541815280914, 0.051036860793828964, -0.04501021280884743, 0.10861548036336899, -0.010345115326344967, 0.044594623148441315, -0.007731257472187281, -0.05147496610879898, 0.004139913246035576, -0.005646458361297846, 0.22099262475967407, -0.08813604712486267, 0.018217192962765694, 0.0004784050688613206, -0.007107358425855637, 0.04493357241153717, 0.1501917690038681, 0.09526999294757843, 0.13200950622558594, 0.03328729048371315, 0.0963236540555954, -0.055030252784490585, -0.035096511244773865, -0.1897149533033371, 0.02568112313747406, 0.0024131659884005785, 0.04288168624043465, -0.022220809012651443, 0.10361731797456741, 0.14433553814888, -0.12070689350366592, 0.0894317477941513, 0.019915008917450905, -0.09987438470125198, -0.04988236352801323, -0.03646641597151756, -0.04619807377457619, -0.09704454243183136, 0.023981060832738876, -0.10905534774065018, 0.02781495824456215, 0.09131279587745667, 0.04138769581913948, -0.01768120564520359, 0.15782961249351501, -0.011762731708586216, 
-0.06857557594776154, 0.005689884070307016, 0.02506469190120697, 0.037650201469659805, 0.10860484093427658, 0.011661075055599213, 0.06274209171533585, -0.06468906253576279, 0.08208748698234558, 0.017560215666890144, -0.0008272199775092304, 0.020572559908032417, -0.008090135641396046, -0.0035219553392380476, -0.046567559242248535, 0.004590686876326799, 0.07652029395103455, 0.17137126624584198, 0.037315819412469864, -0.050176821649074554, -0.0555962510406971, 0.18640638887882233, -0.05857955664396286, -0.07163093984127045, -0.12066618353128433, 0.19212976098060608, 0.052313972264528275, 0.028368541970849037, 0.010408525355160236, -0.08624089509248734, -0.051351990550756454, 0.21953439712524414, 0.018092310056090355, -0.011842374689877033, -0.03990798443555832, -0.014433587901294231, -0.006802514661103487, -0.03721968084573746, 0.14543510973453522, 0.01816081628203392, 0.22021305561065674, -0.0008813487365841866, -0.004675829783082008, -0.03768470883369446, -0.028775351122021675, -0.04608612880110741, 0.18732179701328278, -0.03071798011660576, 0.028835376724600792, -0.0943622812628746, -0.0023100359831005335, 0.04889683425426483, -0.10966029763221741, 0.10110092908143997, -0.09478161484003067, -0.07773929089307785, 0.029017910361289978, 0.0757938027381897, -0.030486714094877243, 0.04572898522019386, -0.022009175270795822, 0.04130713641643524, 0.025086402893066406, -0.025203406810760498, -0.10151759535074234, -0.13399703800678253, 0.06426504254341125, -0.024340320378541946, 0.15750977396965027, 0.02548537403345108, 0.07701346278190613, 0.09267637133598328, 0.012711421586573124, -0.0661347284913063, 0.10543597489595413, 0.038948193192481995, 0.01524818129837513, 0.0774020180106163, 0.13247384130954742, -0.04432936757802963, 0.14689911901950836, -0.0008167431224137545, -0.03384842351078987, -0.03786170482635498, -0.013980258256196976, 0.0049228849820792675, -0.1417495608329773, -0.006304526701569557, -0.0647227019071579, 0.12559184432029724, 0.2028796523809433, -0.050176504999399185, -0.023034431040287018, -0.040178705006837845, 0.07178062945604324, -0.008216203190386295, 0.0944717600941658, -0.00955947581678629, -0.17117220163345337, 0.007762908935546875, -0.04039609432220459, 0.008328399620950222, -0.18960867822170258, -0.04591693356633186, -0.03660750016570091, -0.03121579997241497, -0.08825302869081497, 0.15033447742462158, 0.08085107058286667, 0.01793290674686432, -0.04551369324326515, -0.19480310380458832, -0.02917182259261608, 0.050105415284633636, -0.14054183661937714, -0.12081802636384964 ]
null
null
transformers
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for python code snippets.

## Intended uses & limitations

The model could be used to generate a description for a python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_python_transfer_learning_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/python/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Training procedure

### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training (a minimal sketch of this schedule is given below).

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), and using only the dataset containing python code.
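The inverse square root schedule referenced in the pre-training section can be sketched in a few lines for readers who want to continue training. This is a minimal illustration only: the peak learning rate and warmup length below are assumptions, not hyperparameters reported for this model.

```python
# Minimal sketch of AdaFactor with an inverse square root learning rate
# schedule (linear warmup, then lr proportional to 1/sqrt(step)). The peak
# learning rate and warmup length are assumed values for illustration.
import torch
from transformers.optimization import Adafactor

model = torch.nn.Linear(512, 512)  # stand-in for the T5 encoder-decoder

optimizer = Adafactor(
    model.parameters(),
    lr=1e-3,               # assumed peak learning rate
    scale_parameter=False,
    relative_step=False,   # the external scheduler below drives the rate
)

warmup_steps = 10_000      # assumed warmup length

def inverse_sqrt(step: int) -> float:
    # Multiplier on the peak lr: linear warmup, then 1/sqrt(step) decay.
    step = max(step, 1)
    return min(step / warmup_steps, (warmup_steps / step) ** 0.5)

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=inverse_sqrt)

for step in range(3):      # skeleton of a training loop
    loss = model(torch.randn(8, 512)).pow(2).mean()
    loss.backward()
    optimizer.step()
    scheduler.step()
    optimizer.zero_grad()
```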
## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_python_transfer_learning_finetune
[ "transformers", "pytorch", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization python
====================================================

Pretrained model on programming language python using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for python code snippets.

Intended uses & limitations
---------------------------

The model could be used to generate a description for a python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded from Link

Training procedure
------------------

### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), and using only the dataset containing python code.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 43, 61, 87, 111 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.10556819289922714, 0.06999924033880234, -0.0012711700983345509, 0.10930309444665909, 0.051390793174505234, 0.028731709346175194, 0.036255817860364914, 0.10493028908967972, -0.055347975343465805, 0.06139944866299629, 0.04775906726717949, -0.06413298845291138, 0.07173033803701401, 0.19983811676502228, 0.023317795246839523, -0.16045185923576355, -0.03751278296113014, 0.04198317974805832, -0.0874851644039154, 0.11508326977491379, 0.07971552759408951, -0.07633429765701294, 0.0787874162197113, -0.0440300852060318, -0.1247667521238327, 0.05061621591448784, -0.018942754715681076, -0.029326828196644783, 0.09386610984802246, 0.06640156358480453, 0.12998706102371216, -0.035836443305015564, 0.06695479154586792, -0.18194331228733063, 0.0007373932749032974, 0.03642144426703453, 0.05743269994854927, 0.03638570383191109, 0.05834652855992317, 0.07555577903985977, 0.14153778553009033, -0.04614827409386635, 0.051919594407081604, 0.05167960375547409, -0.0644407868385315, -0.05788072198629379, -0.05383806303143501, 0.07405060529708862, 0.06956209987401962, 0.10010911524295807, -0.008594228886067867, 0.05831587314605713, -0.08236659318208694, 0.08904555439949036, 0.0798901915550232, -0.2203337699174881, -0.02581891603767872, 0.09608975052833557, 0.06415480375289917, 0.04058391973376274, -0.0696820616722107, -0.04469737783074379, 0.09900081902742386, 0.04573334753513336, 0.05479512736201286, -0.08195064216852188, -0.029091348871588707, -0.016295187175273895, -0.06316547840833664, -0.05830829218029976, 0.1788926124572754, 0.026942666620016098, -0.05926987901329994, -0.09530093520879745, -0.0574316680431366, -0.17945171892642975, 0.04403657838702202, 0.024148665368556976, 0.0016606387216597795, -0.004614940378814936, -0.009627770632505417, 0.0015953956171870232, -0.0882883295416832, -0.10145669430494308, 0.016665656119585037, 0.02718263491988182, 0.0659993514418602, 0.03648541867733002, -0.042532697319984436, 0.09557836502790451, 0.021559035405516624, -0.04231831431388855, -0.0025120284408330917, 0.007306092884391546, -0.11432787030935287, 0.0006346915033645928, -0.006693199276924133, -0.09086773544549942, -0.015364761464297771, 0.07450658828020096, -0.0862656980752945, 0.07412503659725189, 0.09979645162820816, 0.006887268275022507, 0.020670369267463684, 0.2041856199502945, 0.03973869979381561, -0.14499670267105103, 0.03385581821203232, 0.026936713606119156, -0.0007124049589037895, 0.015479939989745617, -0.0462089441716671, -0.045009028166532516, 0.028961878269910812, 0.06403390318155289, -0.11674778163433075, 0.03862566500902176, -0.05210642144083977, -0.017882272601127625, 0.01675780862569809, -0.13028381764888763, 0.029922736808657646, 0.010734028182923794, -0.06515124440193176, -0.016130642965435982, 0.09300007671117783, -0.1466270238161087, -0.1283131241798401, 0.03787444904446602, -0.05517593026161194, -0.041022103279829025, -0.12125712633132935, -0.12622283399105072, -0.006983387749642134, -0.007439690176397562, 0.0026995455846190453, -0.1107492595911026, -0.10775813460350037, -0.0030787521973252296, 0.04464760050177574, -0.0038767193909734488, -0.033886831253767014, -0.039805833250284195, 0.007273737341165543, -0.005208454094827175, -0.014771209098398685, 0.0010830038227140903, -0.03612853214144707, 0.10371895879507065, 0.08100645989179611, 0.04211495444178581, -0.01602267473936081, 0.026129774749279022, -0.0940249115228653, 0.0774073675274849, -0.08932008594274521, 0.06365299969911575, -0.0074785626493394375, 0.05725037306547165, -0.09411081671714783, -0.08842215687036514, 0.003236806020140648, 
0.0432826392352581, 0.07534044981002808, 0.04270682856440544, -0.14120450615882874, 0.024822786450386047, 0.1567644476890564, -0.11490689963102341, -0.1233072429895401, 0.11421656608581543, -0.012520648539066315, 0.03826286271214485, 0.08082587271928787, 0.11745624989271164, 0.16474054753780365, -0.09185714274644852, -0.03465376794338226, 0.08929309993982315, 0.048094142228364944, -0.04842491075396538, 0.059948112815618515, 0.017323948442935944, -0.03420256823301315, 0.019653573632240295, 0.06505797803401947, 0.07881017029285431, -0.007341155782341957, -0.03619645908474922, -0.041495658457279205, -0.09119925647974014, -0.03660105913877487, -0.008962971158325672, 0.019984418526291847, -0.0584145225584507, -0.07419782876968384, 0.02137400209903717, 0.17364175617694855, -0.09495289623737335, 0.01122999656945467, -0.08293535560369492, -0.017554260790348053, -0.0635010302066803, 0.017882972955703735, -0.11417544633150101, -0.00016425628564320505, 0.05088412016630173, -0.04237206652760506, 0.06806045770645142, 0.079967200756073, 0.004708362743258476, 0.031574495136737823, -0.04646658897399902, -0.03383409231901169, -0.04153550788760185, -0.06528329849243164, -0.1051594465970993, -0.013549531809985638, -0.08967261016368866, -0.015833627432584763, -0.02343255840241909, -0.17243893444538116, 0.002351253293454647, 0.0018262868979945779, 0.03095949999988079, 0.0166891161352396, -0.031830914318561554, 0.03077235259115696, 0.04917008429765701, -0.05477192625403404, -0.08321093767881393, 0.01130041666328907, 0.03653240576386452, -0.09006986021995544, -0.060640107840299606, -0.10549134016036987, -0.07019923627376556, 0.08972431719303131, 0.10018831491470337, -0.12482423335313797, -0.0028145015239715576, -0.02025003358721733, -0.04176194965839386, -0.05218532308936119, -0.0692114531993866, 0.1659647673368454, 0.0057948981411755085, 0.17006516456604004, -0.1374647170305252, -0.05032559484243393, -0.03685547783970833, 0.008791976608335972, 0.039063189178705215, 0.13499748706817627, 0.030920110642910004, -0.11107940226793289, 0.04253512993454933, -0.04767199233174324, -0.05295984819531441, 0.16659781336784363, -0.02014978416264057, -0.07498487830162048, 0.014541004784405231, 0.11495474725961685, 0.0008721645572222769, 0.18117721378803253, -0.05205513909459114, 0.014948705211281776, -0.00887194275856018, 0.010252696461975574, 0.04262928664684296, -0.12884147465229034, 0.031167130917310715, 0.05177503824234009, -0.054538752883672714, -0.058018673211336136, -0.035643767565488815, -0.02090868353843689, 0.04621510207653046, 0.01490684412419796, 0.030205892398953438, -0.017829889431595802, -0.025042345747351646, -0.10379527509212494, 0.19054079055786133, -0.07675326615571976, -0.22920241951942444, -0.16540005803108215, 0.08789230138063431, -0.01443908829241991, -0.019059788435697556, 0.024409379810094833, -0.0889134556055069, -0.06246180459856987, -0.09324564039707184, 0.15450680255889893, -0.11798939108848572, 0.006131164729595184, -0.03981504216790199, 0.054426614195108414, 0.06317656487226486, -0.17152346670627594, 0.03887387365102768, -0.006535507272928953, 0.0028070881962776184, -0.007232160773128271, -0.07174286246299744, 0.07246138155460358, 0.12045162916183472, -0.08340799063444138, 0.010910041630268097, -0.013050050474703312, 0.1903383582830429, -0.05339059606194496, 0.04319114610552788, 0.15409457683563232, 0.012015380896627903, 0.03196337819099426, 0.05019993707537651, 0.007490256801247597, -0.09588064253330231, 0.06941904127597809, 0.03889988735318184, -0.036949656903743744, -0.20813611149787903, 
-0.03702232241630554, -0.08217310905456543, 0.06814650446176529, 0.1347520798444748, 0.04277283698320389, -0.1573771983385086, 0.03478521108627319, -0.01083320565521717, 0.16282084584236145, -0.02801317535340786, 0.06043216958642006, 0.016761327162384987, 0.017932837828993797, -0.00449113454669714, -0.09742718935012817, -0.00009030633373185992, 0.08072355389595032, 0.09986040741205215, 0.19763611257076263, -0.10094048082828522, 0.1588551104068756, -0.006957564502954483, 0.10728473216295242, 0.04376919940114021, 0.12209494411945343, -0.12528403103351593, 0.023137088865041733, 0.01111140102148056, -0.013796205632388592, -0.07827148586511612, 0.03942913934588432, -0.02284233272075653, 0.07954073697328568, -0.0780450701713562, 0.021357867866754532, 0.010904492810368538, 0.17395925521850586, 0.08307451754808426, -0.18376749753952026, -0.12673765420913696, 0.004251767415553331, -0.0973190888762474, -0.10802555084228516, 0.07040799409151077, 0.21242071688175201, -0.06323034316301346, 0.02692033164203167, -0.02414754033088684, 0.12896215915679932, -0.08091724663972855, -0.027662070468068123, 0.030993305146694183, 0.0710737556219101, 0.0010040989145636559, 0.11642316728830338, -0.2350943237543106, 0.07695310562849045, 0.01369166374206543, 0.08702034503221512, -0.01996101252734661, 0.06154182180762291, -0.04463254287838936, 0.0003351125924382359, 0.06471852958202362, 0.010588264092803001, -0.03863732889294624, -0.1968979835510254, -0.05354022979736328, 0.025567729026079178, 0.047599151730537415, 0.01412525586783886, 0.09456592798233032, -0.0328831747174263, 0.048450712114572525, -0.02553839422762394, -0.13478238880634308, -0.05608382448554039, -0.14213813841342926, -0.04496452584862709, -0.020233584567904472, -0.04746395722031593, -0.03530048951506615, 0.042321838438510895, 0.06373234838247299, 0.21616549789905548, -0.14002928137779236, -0.08947712182998657, -0.0942734107375145, 0.08662448078393936, 0.1387576162815094, -0.08068761229515076, 0.040801357477903366, 0.012639536522328854, 0.050735510885715485, -0.033282291144132614, -0.08487448841333389, 0.03055458888411522, -0.051521189510822296, -0.06789463013410568, -0.026371344923973083, 0.11586660891771317, -0.01626032590866089, 0.05424361303448677, 0.014944611117243767, -0.08235888928174973, -0.038107603788375854, -0.11745656281709671, -0.10135094821453094, -0.03920725733041763, 0.01810196042060852, 0.021874958649277687, -0.12293261289596558, 0.05225684493780136, -0.007649358827620745, -0.06732422113418579, 0.09335648268461227, 0.14796161651611328, -0.066892609000206, 0.01535768248140812, 0.05194655433297157, -0.049275387078523636, -0.183834508061409, -0.027541743591427803, 0.04531502723693848, 0.081404909491539, -0.03019160032272339, -0.15420663356781006, 0.06573734432458878, 0.007117156870663166, 0.03713759779930115, 0.026567349210381508, -0.2786252200603485, -0.12277442216873169, 0.015030003152787685, 0.06773940473794937, 0.02538522332906723, -0.10398826003074646, -0.03841657564043999, -0.0533297061920166, -0.028345433995127678, 0.0763179361820221, 0.053492795675992966, 0.1166130006313324, -0.030652686953544617, 0.05264977365732193, 0.05106296390295029, -0.024791672825813293, 0.058095693588256836, -0.026250159367918968, 0.09050752222537994, -0.02244305983185768, 0.02302917279303074, 0.04597633332014084, -0.05719317868351936, 0.18460413813591003, -0.1741420179605484, 0.10301587730646133, -0.20074841380119324, -0.042620375752449036, -0.030168693512678146, 0.007729603908956051, -0.03198099881410599, -0.050008080899715424, -0.12234416604042053, 
0.03171183913946152, 0.06042344868183136, -0.027225026860833168, 0.012053639627993107, -0.007483324501663446, -0.0484980084002018, 0.05278776213526726, 0.08902553468942642, 0.011354194022715092, -0.1507738083600998, 0.04466652870178223, 0.031076189130544662, 0.09438206255435944, -0.18290428817272186, 0.017473729327321053, 0.11154653877019882, 0.02892276458442211, 0.09129015356302261, 0.01542828418314457, -0.08427351713180542, 0.01962597668170929, 0.07415834814310074, -0.06648197770118713, -0.08506093174219131, -0.016412029042840004, -0.021048584952950478, -0.09818018227815628, 0.00785789079964161, 0.0827033668756485, -0.06589468568563461, -0.007657793816179037, -0.008195294067263603, 0.003526519052684307, -0.0795561745762825, 0.18923737108707428, 0.02069278061389923, 0.07456701993942261, -0.05674483999609947, 0.07803058624267578, 0.09390085935592651, -0.10346267372369766, 0.02302459068596363, 0.16894567012786865, -0.08979244530200958, -0.02074279636144638, 0.09287384152412415, 0.09818663448095322, -0.03170099854469299, -0.04307449981570244, -0.08019904047250748, -0.08028154820203781, 0.010486959479749203, 0.06403356790542603, 0.0634964182972908, 0.08435355871915817, -0.03841378539800644, 0.0035980555694550276, -0.12819527089595795, 0.09434618800878525, 0.0905744880437851, 0.04703220725059509, -0.13213880360126495, 0.17071953415870667, 0.034652844071388245, 0.06918635219335556, -0.0017740761395543814, 0.03240690752863884, -0.11425747722387314, 0.03695712611079216, -0.029665658250451088, 0.04174765199422836, 0.003515261923894286, 0.04346610978245735, -0.02904149703681469, 0.04312508553266525, -0.026732539758086205, 0.038685448467731476, -0.03645173832774162, -0.028855616226792336, -0.03163718804717064, 0.013402678072452545, -0.04683217033743858, -0.008421454578638077, 0.015530622564256191, -0.0930972546339035, 0.08777282387018204, -0.05432884767651558, -0.012434807606041431, 0.012582652270793915, 0.010572454892098904, 0.027829106897115707, 0.012468025088310242, 0.057409755885601044, 0.0009957818547263741, 0.0004952106392011046, 0.04116259887814522, 0.016847146674990654, -0.003384724725037813, -0.0016924060182645917, 0.08821975439786911, -0.13864947855472565, -0.08950839191675186, -0.07526727765798569, -0.07011940330266953, -0.05983631685376167, 0.07232985645532608, 0.06659892946481705, 0.06419318914413452, 0.10797768831253052, -0.04421701654791832, 0.011326737701892853, -0.16419795155525208, -0.04762883111834526, 0.042503245174884796, -0.0010002898052334785, -0.09851479530334473, -0.03494606539607048, 0.06441560387611389, -0.028574718162417412, 0.11000777781009674, -0.0026747756637632847, 0.020066775381565094, -0.006373981013894081, -0.04571181908249855, -0.025696860626339912, 0.003950479440391064, 0.2100163847208023, -0.09544353932142258, 0.007919587194919586, -0.01665211096405983, -0.005262251943349838, 0.03388475626707077, 0.14722256362438202, 0.12116910517215729, 0.1338760256767273, 0.0032418430782854557, 0.07553493231534958, -0.043192267417907715, -0.03164966031908989, -0.12206453830003738, 0.03993247076869011, -0.023641614243388176, 0.05109909176826477, -0.03498906269669533, 0.1438257098197937, 0.07456646114587784, -0.12713374197483063, 0.09935984760522842, 0.002426325110718608, -0.1040382981300354, -0.03820585831999779, -0.06569390743970871, -0.03889970853924751, -0.10004837810993195, 0.006234690546989441, -0.10416585206985474, -0.0014797707553952932, 0.06722116470336914, 0.033766962587833405, -0.03362255170941353, 0.17862476408481598, -0.0433095246553421, -0.07031261175870895, 
0.02799404039978981, 0.05304985120892525, 0.023707903921604156, 0.08445435017347336, 0.016909055411815643, 0.03935236483812332, -0.05761902406811714, 0.07189421355724335, 0.02693476527929306, -0.003159980522468686, 0.02468925155699253, 0.022398319095373154, 0.0002941230486612767, -0.04309738799929619, -0.013005943037569523, 0.07343793660402298, 0.15055476129055023, 0.03560184687376022, -0.04419801011681557, -0.06145282834768295, 0.1821308135986328, -0.05423668026924133, -0.07176593691110611, -0.13062314689159393, 0.18391114473342896, 0.023619860410690308, 0.014805793762207031, 0.02379816770553589, -0.08614109456539154, -0.026602035388350487, 0.23994919657707214, 0.0680244192481041, -0.04320194572210312, -0.02991727739572525, -0.020081907510757446, -0.009384170174598694, -0.04492888227105141, 0.16282247006893158, 0.016170920804142952, 0.2330392599105835, 0.01162331085652113, 0.0030968361534178257, -0.04367843642830849, -0.036805663257837296, -0.020024115219712257, 0.18951323628425598, -0.02719329483807087, 0.027419162914156914, -0.10107637941837311, -0.002262633526697755, 0.02303173393011093, -0.12995293736457825, 0.11187490820884705, -0.12572479248046875, -0.07185715436935425, 0.004183629062026739, 0.04157492145895958, -0.04074843227863312, 0.04645314812660217, -0.03016439825296402, 0.06326992809772491, 0.028449319303035736, -0.023120148107409477, -0.11807899922132492, -0.1524045616388321, 0.06685686856508255, -0.01505733747035265, 0.14030611515045166, 0.014310799539089203, 0.0804193913936615, 0.08753640949726105, 0.0072056627832353115, -0.07118069380521774, 0.07857080549001694, 0.04013814404606819, 0.007626485079526901, 0.04854139685630798, 0.11800461262464523, -0.04953416436910629, 0.1803724318742752, -0.001943505136296153, -0.02999199368059635, -0.03569546341896057, -0.018299465999007225, 0.007321110460907221, -0.15765127539634705, -0.004747139289975166, -0.06685668975114822, 0.13696838915348053, 0.21379588544368744, -0.05209532380104065, -0.01112061645835638, -0.0602128691971302, 0.08143080025911331, -0.00100806076079607, 0.07542729377746582, 0.001088811899535358, -0.16791677474975586, -0.009674358181655407, -0.05681414529681206, 0.007891478948295116, -0.20861847698688507, -0.04607832059264183, -0.04840533062815666, -0.029297297820448875, -0.10396632552146912, 0.16106043756008148, 0.07054149359464645, 0.03403032198548317, -0.044071879237890244, -0.11289381980895996, -0.0251934677362442, 0.044554002583026886, -0.12567001581192017, -0.11590936034917831 ]
null
null
transformers
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization sql dataset.

## Intended uses & limitations

The model could be used to generate descriptions for sql functions or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better (a rough tokenization helper is sketched at the end of this card).

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql", skip_special_tokens=True),
    device=0
)

tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/sql/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
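As noted in the intended-uses section above, tokenized input tends to score better. The exact tokenizer used to prepare the training data is not documented in this card, so the helper below is only an approximation of the whitespace-tokenized style seen in the examples (spaces around parentheses and commas):

```python
# Rough approximation of the whitespace-tokenized style used in the examples
# above; the tokenizer actually used to build the training data may differ.
import re

def tokenize_sql(query: str) -> str:
    # Add spaces around common punctuation, then collapse repeated whitespace.
    spaced = re.sub(r"([(),;])", r" \1 ", query)
    return " ".join(spaced.split())

print(tokenize_sql("select time(col0) from tab0"))
# -> select time ( col0 ) from tab0
```

The resulting string can then be passed to the pipeline exactly as in the usage example above.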
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_sql
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization sql
=================================================

Pretrained model on programming language sql using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization sql dataset.

Intended uses & limitations
---------------------------

The model could be used to generate descriptions for sql functions or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded from Link

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 115 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.11829722672700882, 0.03842651844024658, -0.00021651657880283892, 0.046913787722587585, 0.16030675172805786, 0.015879150480031967, 0.08486726880073547, 0.03700187802314758, 0.00416232505813241, -0.04853709414601326, 0.08114942163228989, 0.13262304663658142, 0.02861112914979458, 0.15516117215156555, -0.03163391351699829, -0.1800098568201065, -0.0020905560813844204, 0.04391103982925415, -0.18841134011745453, 0.12594591081142426, 0.11647581309080124, -0.05482039600610733, 0.09630665183067322, 0.004475155845284462, -0.20419366657733917, 0.07166168838739395, 0.010242474265396595, -0.06805887073278427, 0.12498646229505539, 0.10805126279592514, 0.1092548593878746, 0.03596363961696625, 0.0300209429115057, -0.22787117958068848, 0.0390714593231678, -0.048660989850759506, -0.010741882026195526, 0.043930601328611374, 0.032265111804008484, -0.058232296258211136, 0.14255808293819427, -0.026857133954763412, 0.007099861279129982, 0.06671138107776642, -0.10708115249872208, -0.05727280676364899, -0.03303803130984306, 0.01070262212306261, 0.07357186824083328, 0.08106829226016998, 0.016692042350769043, 0.1195385530591011, -0.14274609088897705, 0.11651401966810226, 0.09003540873527527, -0.19755370914936066, -0.014942756853997707, 0.1281273514032364, 0.062028802931308746, -0.08303071558475494, -0.04293094575405121, 0.022469354793429375, 0.05164765566587448, 0.00024002099235076457, 0.02062946744263172, -0.12695597112178802, -0.1391305774450302, 0.0868520587682724, -0.07604383677244186, -0.06964045763015747, 0.28158295154571533, 0.009662375785410404, -0.045655667781829834, -0.03306029736995697, -0.03598611429333687, 0.0433080680668354, -0.01402506697922945, 0.008759449236094952, 0.011450008489191532, 0.00011384839308448136, -0.05178261920809746, -0.02834155037999153, -0.1116379052400589, -0.12139821797609329, -0.029398461803793907, 0.08647240698337555, -0.011114743538200855, 0.03691980615258217, -0.13870638608932495, 0.0928826555609703, 0.0631256103515625, -0.05788680538535118, 0.01190720684826374, -0.07038760185241699, -0.0897190123796463, -0.01578647643327713, -0.08128087222576141, -0.16198815405368805, 0.09967046976089478, 0.10545267164707184, -0.0370703786611557, 0.05914614722132683, 0.06216214969754219, 0.053765516728162766, 0.05505279079079628, 0.21443305909633636, -0.006383453030139208, -0.07330576330423355, 0.06461044400930405, -0.043276574462652206, -0.051872726529836655, 0.01746547780930996, -0.08823616057634354, -0.038788482546806335, -0.010805505327880383, 0.12158598750829697, -0.09960625320672989, 0.09076502919197083, -0.07458839565515518, -0.03915642946958542, -0.004911077208817005, -0.13379327952861786, -0.023959225043654442, 0.02549041621387005, -0.05772070586681366, -0.047919198870658875, 0.10989189147949219, -0.05855097249150276, -0.10052715986967087, -0.04376000910997391, -0.08082912117242813, -0.015506729483604431, -0.09676385670900345, -0.08216874301433563, 0.008927157148718834, 0.05709436535835266, 0.07025090605020523, -0.11678779125213623, -0.19407954812049866, -0.0033237291499972343, 0.07065817713737488, 0.014159642159938812, 0.02603180892765522, -0.08748308569192886, 0.0015902919694781303, -0.02430981956422329, -0.023820383474230766, 0.01256253756582737, -0.078548863530159, 0.06726106256246567, 0.06973282992839813, 0.02950628660619259, -0.10436010360717773, 0.042290907353162766, -0.12699422240257263, 0.06225541979074478, -0.14145302772521973, 0.12192564457654953, -0.06007522717118263, 0.12610898911952972, -0.1119009405374527, -0.037757303565740585, 0.03670615702867508, 
0.06251297146081924, 0.061274006962776184, 0.15017090737819672, -0.14064696431159973, -0.06158652901649475, 0.14910577237606049, -0.10843785852193832, -0.20418643951416016, 0.07068506628274918, -0.07403788715600967, 0.1715259999036789, 0.07870911806821823, 0.16834425926208496, 0.1331547051668167, -0.0525188148021698, 0.0352608859539032, 0.10095491260290146, -0.04958346486091614, -0.07888798415660858, 0.07661689072847366, 0.05927696079015732, -0.13738979399204254, 0.05908111855387688, -0.004625466652214527, 0.11700677871704102, -0.040296416729688644, -0.05769098922610283, -0.013221990317106247, -0.06714525818824768, 0.023893633857369423, -0.010061545297503471, 0.08888985961675644, 0.018256481736898422, -0.0007831378607079387, 0.09293082356452942, 0.09762439131736755, -0.10373040288686752, -0.0040612053126096725, -0.11413132399320602, 0.0373632088303566, -0.08887128531932831, 0.03351861983537674, -0.21028250455856323, -0.05860935524106026, -0.007217642851173878, 0.004960246849805117, 0.07235024124383926, 0.022304009646177292, 0.005745750851929188, -0.008764287456870079, 0.00522166071459651, 0.005884324200451374, 0.02494386024773121, -0.016860704869031906, -0.044165197759866714, -0.09825918078422546, -0.05519796535372734, -0.04478118196129799, 0.05028210207819939, -0.1712959110736847, 0.004881078843027353, 0.04523194953799248, 0.06315930932760239, 0.001990579767152667, 0.028070155531167984, 0.042894259095191956, 0.05585221201181412, -0.04208378493785858, -0.005775331519544125, 0.059461791068315506, 0.025692719966173172, -0.16555200517177582, 0.04908646270632744, -0.03706963732838631, 0.059625230729579926, 0.11135679483413696, -0.11885527521371841, -0.05066312104463577, -0.10078447312116623, -0.029182353988289833, -0.018400244414806366, 0.024671606719493866, -0.029101543128490448, 0.18331430852413177, 0.007642396260052919, 0.16065789759159088, -0.09571867436170578, -0.03485431522130966, -0.0443446971476078, -0.010997399687767029, 0.028961794450879097, 0.14281955361366272, 0.08991831541061401, -0.2558926045894623, 0.07096008211374283, 0.06767655909061432, -0.005145180504769087, 0.17248159646987915, -0.06186862662434578, -0.04048556834459305, -0.027384206652641296, 0.03910141438245773, -0.032439108937978745, 0.15845637023448944, -0.18130366504192352, -0.029263406991958618, 0.01528235524892807, -0.020150477066636086, 0.09936005622148514, -0.11418841034173965, -0.002603216329589486, 0.018912093713879585, -0.04195120930671692, -0.1668998897075653, 0.03700386732816696, 0.01358980406075716, 0.021651044487953186, 0.00541476346552372, 0.019831426441669464, 0.04492863640189171, -0.04426112398505211, -0.13602522015571594, 0.24678131937980652, -0.08532845973968506, -0.25677403807640076, -0.18784405291080475, 0.04229454696178436, -0.021627675741910934, 0.0006328414310701191, 0.053784556686878204, -0.05367497354745865, -0.05528825521469116, -0.01798824593424797, 0.13712427020072937, -0.025857627391815186, -0.024568526074290276, -0.03710633143782616, 0.06374885141849518, 0.009723182767629623, -0.17664623260498047, -0.0005093719810247421, -0.007588368374854326, 0.03944087401032448, -0.0012668616836890578, -0.14840222895145416, 0.12292996048927307, 0.0813545510172844, -0.03637685999274254, 0.037973519414663315, -0.0498347133398056, 0.26361244916915894, -0.056501179933547974, -0.0906873270869255, 0.13350465893745422, -0.10293067991733551, 0.008398809470236301, 0.048481207340955734, 0.016088571399450302, -0.10712913423776627, 0.03460795432329178, -0.03859930485486984, -0.06389474868774414, -0.26799315214157104, 
-0.10260865092277527, -0.09058967232704163, 0.06498217582702637, 0.013521556742489338, 0.02563381753861904, -0.10557302832603455, 0.06585878133773804, 0.0661914199590683, 0.10014764219522476, -0.017905419692397118, 0.05575432628393173, 0.11139120906591415, 0.016640931367874146, 0.008240618743002415, -0.10981829464435577, -0.03593868389725685, 0.04829929769039154, 0.08733956515789032, 0.1769423633813858, 0.019545182585716248, 0.1862316131591797, 0.06018378958106041, 0.04163108766078949, 0.040352191776037216, 0.15650039911270142, -0.07721835374832153, 0.017279010266065598, -0.007932126522064209, -0.01734810508787632, -0.1545836180448532, 0.043805789202451706, -0.015977805480360985, 0.02976190112531185, -0.13426180183887482, -0.031639520078897476, 0.06895886361598969, 0.12831230461597443, -0.007278642617166042, -0.24467375874519348, -0.1374540776014328, 0.03148767724633217, -0.08499220758676529, -0.04011142998933792, 0.04047727584838867, 0.09326048195362091, -0.13410557806491852, 0.019033227115869522, -0.03508099168539047, 0.16820840537548065, -0.052276454865932465, 0.02989039570093155, -0.049684297293424606, -0.03280544653534889, 0.012405088171362877, 0.15128400921821594, -0.20166286826133728, 0.2363375872373581, 0.017349576577544212, 0.007225213572382927, -0.048102788627147675, 0.028808610513806343, 0.011469417251646519, 0.0752493366599083, 0.11167161911725998, -0.021200228482484818, -0.0971967875957489, -0.16326940059661865, 0.008456860668957233, 0.0931185781955719, 0.05209751054644585, -0.029474740847945213, 0.07572066783905029, -0.03103615716099739, 0.02475815638899803, -0.005703955888748169, -0.03999249264597893, -0.08173160254955292, -0.12588509917259216, -0.023661533370614052, -0.05359738692641258, 0.06661761552095413, -0.026757637038826942, 0.0203532837331295, 0.05938243865966797, 0.18204446136951447, -0.08308285474777222, -0.06319709867238998, -0.11963896453380585, 0.08218274265527725, 0.10937228053808212, -0.09393710643053055, 0.03491795435547829, -0.00646540243178606, 0.02277163788676262, 0.017819253727793694, -0.15734711289405823, 0.07207200676202774, -0.06885375082492828, 0.013265975750982761, -0.03111337684094906, 0.10335204750299454, -0.011555634438991547, 0.010729532688856125, 0.06061539426445961, -0.06042034924030304, -0.05380954220890999, -0.13806171715259552, -0.09283149242401123, -0.05604724958539009, 0.030908072367310524, 0.0952671617269516, -0.14124688506126404, -0.01450105756521225, -0.0016093408921733499, -0.0233540590852499, 0.241264209151268, 0.10942473262548447, -0.04355229437351227, 0.0326835997402668, 0.1142830103635788, -0.09814591705799103, -0.2647928297519684, -0.004669151268899441, -0.0217368695884943, 0.08416146039962769, 0.027059555053710938, -0.15845398604869843, 0.10816770792007446, -0.012596908025443554, 0.03803027421236038, -0.026298675686120987, -0.243617981672287, -0.10528511554002762, 0.11448340862989426, 0.10245974361896515, 0.09624851495027542, -0.12103114277124405, -0.06682049483060837, -0.09636674076318741, -0.1834440678358078, 0.15742160379886627, -0.09683360159397125, 0.09762654453516006, 0.003862344194203615, 0.05575565621256828, 0.02166249230504036, -0.047623321413993835, 0.10140487551689148, 0.029182350262999535, 0.08329612016677856, -0.007293634582310915, -0.08912104368209839, 0.12947094440460205, -0.01913020946085453, 0.12466736882925034, -0.0797765925526619, 0.0933506116271019, -0.2038925439119339, -0.04152747243642807, -0.023129165172576904, 0.04053875803947449, -0.005440902896225452, -0.04599563404917717, -0.060036879032850266, 
0.01787281036376953, 0.040071386843919754, 0.011329312808811665, 0.10598480701446533, -0.052917756140232086, 0.007695822510868311, 0.11758091300725937, 0.15713311731815338, -0.010999813675880432, -0.06649681180715561, 0.04458734765648842, 0.01359573844820261, 0.08949335664510727, -0.21246963739395142, 0.09044252336025238, 0.12322520464658737, 0.032091379165649414, 0.10575101524591446, 0.07862760126590729, -0.015333008021116257, 0.05188211798667908, 0.08504267036914825, -0.11767743527889252, -0.10047411918640137, -0.05510722100734711, -0.07915439456701279, -0.023117223754525185, 0.080081045627594, 0.1422179639339447, -0.025845186784863472, -0.014389541000127792, -0.0023590426426380873, -0.03780951723456383, -0.1215401366353035, 0.14300322532653809, 0.03153657168149948, 0.06563234329223633, -0.09830976277589798, 0.07852987945079803, 0.04040587693452835, -0.12193536758422852, -0.0310130026191473, 0.10443645715713501, -0.14719997346401215, -0.07452931255102158, -0.02997920848429203, 0.23143692314624786, -0.14913950860500336, -0.07667907327413559, -0.14651228487491608, -0.078758105635643, 0.010807694867253304, 0.236646831035614, 0.10718165338039398, 0.0889652669429779, -0.03940873220562935, -0.007296252995729446, -0.05879809707403183, 0.05839601159095764, 0.10163634270429611, -0.0041090878657996655, -0.0910852923989296, 0.0577453076839447, -0.014358656480908394, 0.1493002027273178, -0.05744972825050354, -0.03296114504337311, -0.1912446916103363, 0.07773200422525406, -0.16558730602264404, 0.06792309135198593, -0.07092484086751938, 0.030273333191871643, 0.007754888851195574, -0.011420176364481449, -0.040883880108594894, 0.04616689309477806, -0.0915130078792572, 0.0144486790522933, 0.0033022852148860693, 0.07649587839841843, -0.07889878004789352, -0.0006349478499032557, 0.08054223656654358, -0.04668800160288811, 0.11037776619195938, 0.06070994585752487, -0.049468375742435455, 0.11947134882211685, -0.1879909336566925, -0.03328140452504158, 0.03891817480325699, 0.03376692160964012, 0.048136256635189056, -0.03508726507425308, 0.0414070263504982, 0.036247942596673965, 0.03242132440209389, -0.005385610740631819, 0.1043878123164177, -0.11118410527706146, -0.10123083740472794, -0.033323559910058975, -0.08760129660367966, -0.04193156212568283, 0.00880185142159462, 0.05048687011003494, 0.09090224653482437, 0.11201085150241852, -0.03220589458942413, 0.024006307125091553, -0.09183540940284729, -0.01221628487110138, 0.034645963460206985, -0.07442989945411682, -0.11590355634689331, -0.10588983446359634, 0.03827674686908722, -0.04435478150844574, 0.22904284298419952, -0.02053580991923809, 0.14640624821186066, 0.005178721621632576, 0.0019967106636613607, 0.06302136182785034, 0.05660305544734001, 0.24719369411468506, -0.002132234163582325, 0.04887491837143898, -0.026861349120736122, 0.07581521570682526, 0.006311976350843906, 0.05798853561282158, 0.10226031392812729, 0.10030987858772278, -0.011891862377524376, 0.12008114159107208, 0.02273138426244259, 0.050589751452207565, -0.04299650341272354, -0.0749303475022316, 0.043160781264305115, 0.03984306752681732, -0.045757878571748734, 0.09421813488006592, 0.10284147411584854, -0.1030687615275383, 0.09776884317398071, -0.002918679267168045, -0.09482073783874512, -0.06906962394714355, -0.010190465487539768, -0.05991531163454056, -0.13839015364646912, 0.0065883370116353035, -0.12078497558832169, -0.06394947320222855, 0.07546941936016083, 0.021950291469693184, -0.04450714960694313, 0.18200364708900452, 0.0015386578161269426, -0.06982533633708954, 0.028146304190158844, 
-0.015272257849574089, 0.007751172874122858, -0.0003817662945948541, 0.06689900159835815, 0.010579348541796207, -0.019061345607042313, -0.003154247533529997, 0.0455450564622879, -0.02648351341485977, 0.013747666031122208, -0.0745384693145752, -0.023078184574842453, -0.05481547862291336, 0.07019662857055664, -0.013170265592634678, 0.026285067200660706, 0.023473674431443214, -0.030349858105182648, -0.005687909200787544, 0.2150130569934845, -0.04589614272117615, -0.07440610229969025, -0.1544029712677002, 0.21345238387584686, 0.02844688855111599, 0.060328368097543716, 0.011568128131330013, -0.07491638511419296, -0.011396370828151703, 0.2748085856437683, 0.2129025012254715, -0.08098213374614716, 0.0005707279196940362, 0.003831156063824892, 0.01715988479554653, 0.01242441963404417, 0.13482776284217834, 0.027350395917892456, 0.21652741730213165, -0.03578706830739975, -0.0789879783987999, -0.03710184246301651, -0.061368755996227264, 0.036028169095516205, 0.13490617275238037, 0.01975533366203308, -0.04109237715601921, -0.046221669763326645, 0.10230779647827148, -0.18157435953617096, -0.12751029431819916, 0.020603686571121216, -0.1403474509716034, -0.07644367218017578, -0.072187639772892, 0.01705292984843254, 0.008901124820113182, 0.043694064021110535, -0.04632682353258133, -0.008730996400117874, 0.04664861783385277, 0.02400340512394905, -0.16809752583503723, -0.09858894348144531, 0.0889122411608696, -0.02541540004312992, 0.12597210705280304, -0.027750760316848755, 0.11111492663621902, 0.09947217255830765, 0.0505172498524189, -0.03235141932964325, 0.013664277270436287, 0.062261685729026794, -0.009547875262796879, 0.06400828808546066, 0.028633037582039833, -0.030017700046300888, 0.1701994240283966, -0.04544125869870186, -0.10791914165019989, 0.050646908581256866, 0.001437083468772471, -0.02348765730857849, -0.1089029386639595, -0.019147545099258423, -0.10271992534399033, 0.09451586753129959, 0.16427990794181824, -0.04486991837620735, 0.032613612711429596, -0.07086348533630371, 0.12433641403913498, 0.00684031518176198, -0.01712827943265438, -0.07416309416294098, -0.14459937810897827, -0.005631803534924984, 0.018140306696295738, -0.03338248282670975, -0.2065586894750595, -0.017709020525217056, -0.0548117496073246, 0.011965930461883545, -0.023706629872322083, 0.1385994553565979, 0.13239595293998718, 0.037758421152830124, -0.027836669236421585, -0.16798394918441772, -0.01164190098643303, 0.05924508720636368, -0.11205017566680908, -0.15086260437965393 ]
null
null
transformers
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

## Intended uses & limitations

The model could be used to generate descriptions for sql functions or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask", skip_special_tokens=True),
    device=0
)

tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/sql/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score; a short scoring sketch is given at the end of this card):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
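The table above reports corpus-level BLEU. For readers reproducing such numbers, one plausible way to compute a corpus-level BLEU score is sketched below with sacrebleu; the hypothesis and reference strings are placeholders, and the original evaluation setup (tokenization, smoothing) may differ.

```python
# Hedged sketch: corpus-level BLEU between generated and reference summaries.
# The strings below are placeholders, not data from the actual test set.
import sacrebleu

hypotheses = ["select the time from table tab0"]          # model outputs
references = [["select the time value from table tab0"]]  # one list per reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```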
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization sql
=================================================

Pretrained model on programming language sql using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.

Intended uses & limitations
---------------------------

The model could be used to generate descriptions for sql functions or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded from Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 61, 146 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 460,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.1416095644235611, 0.01578744873404503, -0.0009534012642689049, 0.1351865977048874, 0.11623293906450272, 0.01673288270831108, 0.06410238146781921, 0.047315165400505066, -0.031192898750305176, 0.02301155962049961, 0.04591925069689751, 0.012052631936967373, 0.04604155942797661, 0.1984560340642929, 0.014970829710364342, -0.15111735463142395, -0.018296340480446815, 0.0205873753875494, -0.08397610485553741, 0.12260916084051132, 0.09649710357189178, -0.07589421421289444, 0.050697192549705505, -0.04267461225390434, -0.20710141956806183, 0.061664190143346786, 0.003220835467800498, -0.0589265376329422, 0.10549093782901764, 0.06698871403932571, 0.12828056514263153, -0.009695447050035, 0.043356865644454956, -0.13047054409980774, 0.008951318450272083, 0.0199193824082613, 0.0373615063726902, 0.023800663650035858, 0.06671030819416046, 0.03959579020738602, 0.13369275629520416, -0.008271410129964352, 0.03272595629096031, 0.064652219414711, -0.06746361404657364, -0.09773892909288406, -0.02823508158326149, 0.029660191386938095, 0.05346162989735603, 0.0965980663895607, -0.010792833752930164, 0.09823957830667496, -0.1519259512424469, 0.12038572132587433, 0.08326059579849243, -0.2512407600879669, -0.01138558890670538, 0.09318610280752182, 0.053566381335258484, 0.060254957526922226, -0.04493037238717079, -0.050222352147102356, 0.08069325983524323, 0.053069375455379486, 0.039143286645412445, -0.08282226324081421, -0.07746156305074692, 0.0341980904340744, -0.09490171819925308, -0.07275433838367462, 0.21789617836475372, 0.012220482341945171, -0.07734765112400055, -0.05175790935754776, -0.04223724082112312, -0.11482925713062286, 0.035519689321517944, 0.038284532725811005, 0.004660700913518667, -0.019187193363904953, 0.0034153657034039497, 0.026348143815994263, -0.09638447314500809, -0.140548974275589, 0.006699387449771166, 0.07543183863162994, 0.04773663729429245, 0.0383685901761055, -0.06671977043151855, 0.10924416780471802, 0.01796889677643776, -0.04046366363763809, -0.014136486686766148, -0.028264394029974937, -0.1300818920135498, 0.040132228285074234, -0.06375303864479065, -0.1841791570186615, -0.0008599035209044814, 0.010590003803372383, -0.03386075049638748, 0.058781519532203674, 0.05254792049527168, 0.02623220905661583, 0.032083719968795776, 0.19727803766727448, 0.023163961246609688, -0.11317722499370575, 0.05494716018438339, 0.04061324521899223, -0.04100921377539635, -0.01224066223949194, -0.06082043796777725, -0.07968436926603317, 0.057896893471479416, 0.09238988906145096, -0.13596804440021515, 0.048183176666498184, -0.06917694956064224, -0.044051527976989746, -0.015021461993455887, -0.16730454564094543, -0.002557234838604927, 0.024405185133218765, -0.06701038777828217, -0.0531391017138958, 0.09262089431285858, -0.15548555552959442, -0.15104663372039795, -0.031121132895350456, -0.08206892758607864, -0.04744075611233711, -0.15862753987312317, -0.14119583368301392, -0.018734661862254143, -0.03621380403637886, 0.005083909258246422, -0.06950832903385162, -0.17294113337993622, -0.01815049536526203, 0.02662990614771843, 0.004347442649304867, -0.0032329228706657887, -0.07386665046215057, 0.002256730804219842, -0.015814300626516342, -0.03816451504826546, 0.008590180426836014, -0.04733206704258919, 0.11117682605981827, 0.09891735762357712, 0.030205123126506805, -0.034583576023578644, 0.04919128492474556, -0.07749009877443314, 0.06672289967536926, -0.10891354084014893, 0.11743447184562683, -0.07023216038942337, 0.0730208307504654, -0.04304078593850136, -0.0977514386177063, 0.04347102344036102, 
0.05749523639678955, 0.07106390595436096, 0.05963693559169769, -0.16601108014583588, -0.026981906965374947, 0.18632353842258453, -0.12431247532367706, -0.10626418143510818, 0.10963903367519379, -0.04551483690738678, 0.05107741057872772, 0.08933450281620026, 0.13590484857559204, 0.1426229625940323, -0.023268332704901695, 0.003924454562366009, 0.06360195577144623, 0.03812135010957718, -0.11287548393011093, 0.08317837864160538, 0.0459473542869091, -0.09365826100111008, 0.06384360045194626, -0.006446057930588722, 0.09686076641082764, -0.013759729452431202, -0.0365632027387619, -0.050171758979558945, -0.07401902973651886, -0.00298718991689384, -0.0004895018646493554, 0.06462186574935913, -0.06535079330205917, -0.06447239220142365, 0.07399953901767731, 0.15636777877807617, -0.11416564136743546, -0.010145410895347595, -0.08977052569389343, 0.031199095770716667, -0.06013869494199753, 0.01787853054702282, -0.17134715616703033, 0.015481787733733654, 0.062304072082042694, -0.030657755210995674, 0.07258254289627075, 0.12229368835687637, 0.014596163295209408, 0.05509467050433159, 0.004507670644670725, -0.0060345870442688465, -0.09242494404315948, -0.05865246430039406, -0.07155093550682068, -0.060381993651390076, -0.09668585658073425, -0.0447542630136013, -0.0020034501794725657, -0.18611429631710052, 0.013034549541771412, 0.013600174337625504, 0.019977973774075508, 0.0031508924439549446, -0.01591033861041069, 0.023060057312250137, 0.06559740751981735, -0.05379657819867134, -0.03366522863507271, 0.026221271604299545, 0.02362385392189026, -0.07489357888698578, -0.05169038102030754, -0.08601098507642746, 0.005381804425269365, 0.10776256769895554, 0.054146185517311096, -0.07382307946681976, -0.0023006973788142204, -0.01901198923587799, -0.03858848661184311, 0.018171483650803566, -0.06831139326095581, 0.1669297069311142, 0.0024658360052853823, 0.1955917328596115, -0.14950887858867645, -0.025441985577344894, -0.02725517377257347, 0.024272454902529716, 0.05382249131798744, 0.1349247843027115, 0.013145757839083672, -0.13418595492839813, 0.06392479687929153, 0.004844143986701965, -0.06729793548583984, 0.21301183104515076, -0.04981193318963051, -0.09111528843641281, 0.012928999960422516, 0.09573271125555038, -0.02420288696885109, 0.1621527075767517, -0.1662953943014145, -0.020581809803843498, 0.012449922040104866, 0.007459025364369154, 0.06582699716091156, -0.13446098566055298, 0.009519638493657112, 0.018149588257074356, -0.0682099312543869, -0.12039384245872498, -0.0293784961104393, -0.0033087246119976044, 0.03573764115571976, 0.00045049499021843076, 0.0012505357153713703, 0.017201626673340797, -0.044633857905864716, -0.10610037297010422, 0.22331121563911438, -0.10891292244195938, -0.23491142690181732, -0.20529304444789886, 0.11755267530679703, -0.053682323545217514, 0.0006019412539899349, 0.02256276272237301, -0.08882679790258408, -0.06546539813280106, -0.0554492250084877, 0.1759786307811737, -0.07594002783298492, -0.008217338472604752, -0.021392805501818657, 0.06357652693986893, 0.024982238188385963, -0.2046273648738861, 0.03751325234770775, -0.016732966527342796, -0.016051653772592545, -0.00824003666639328, -0.08769625425338745, 0.0856621190905571, 0.14468616247177124, -0.06055162847042084, 0.019688989967107773, -0.004991533700376749, 0.21015499532222748, -0.028778742998838425, -0.06060325354337692, 0.13008244335651398, -0.009063318371772766, 0.010799027979373932, 0.02647000178694725, 0.0037195461336523294, -0.089031882584095, 0.05227605625987053, 0.005628400482237339, -0.028188910335302353, 
-0.27189892530441284, -0.025082524865865707, -0.08625137805938721, 0.04448821023106575, 0.034634705632925034, 0.04446481913328171, -0.10969226062297821, 0.03155413642525673, 0.043896034359931946, 0.1268765777349472, -0.018576646223664284, 0.03263280540704727, 0.07664521038532257, -0.004447875544428825, 0.01697448454797268, -0.09592676907777786, 0.0018720764201134443, 0.08615075051784515, 0.10040193796157837, 0.26896241307258606, -0.09797462821006775, 0.20748811960220337, 0.035360947251319885, 0.06747834384441376, 0.04630514979362488, 0.152227520942688, -0.10721173137426376, 0.031661104410886765, -0.0017614858224987984, -0.00875948742032051, -0.13589821755886078, 0.028483066707849503, -0.032259732484817505, 0.0688667967915535, -0.10956212133169174, -0.025183936581015587, 0.010559937916696072, 0.17931798100471497, 0.03130510076880455, -0.21930311620235443, -0.1286061853170395, 0.020088044926524162, -0.1010320782661438, -0.09499631822109222, 0.05759848281741142, 0.22273217141628265, -0.07662196457386017, -0.01914655789732933, -0.013548646122217178, 0.13063012063503265, -0.03769074007868767, -0.021381089463829994, -0.04229963943362236, 0.07277683168649673, 0.017541058361530304, 0.13619449734687805, -0.24494127929210663, 0.14720186591148376, 0.001479458180256188, 0.060721322894096375, -0.03598867729306221, 0.06598037481307983, -0.03326548635959625, 0.04552502930164337, 0.048262372612953186, -0.007701301947236061, -0.0378003753721714, -0.17955282330513, -0.005578257143497467, 0.035786621272563934, 0.042705558240413666, 0.044148724526166916, 0.07181628793478012, -0.00617183418944478, 0.04556872695684433, -0.007881615310907364, -0.12081412225961685, -0.0708550289273262, -0.10737522691488266, -0.01564001478254795, -0.04583973065018654, -0.02529749646782875, -0.055579543113708496, -0.00841977447271347, 0.07623343914747238, 0.1885041892528534, -0.11329490691423416, -0.08650137484073639, -0.0892859473824501, 0.07233298569917679, 0.13200877606868744, -0.07801473140716553, 0.05728525295853615, -0.0044965362176299095, 0.03411570191383362, 0.006843780167400837, -0.09234736859798431, 0.05707564949989319, -0.03772884979844093, -0.05840998515486717, -0.02630932256579399, 0.09671040624380112, 0.0054666418582201, 0.027246279641985893, 0.0010406350484117866, -0.07930193096399307, -0.04377422481775284, -0.12501907348632812, -0.10869355499744415, -0.043763600289821625, 0.004196111112833023, 0.06777684390544891, -0.13730542361736298, -0.050557464361190796, -0.0009696318302303553, -0.03074021451175213, 0.14049267768859863, 0.14443005621433258, -0.0686517134308815, 0.03893832117319107, 0.12276677787303925, -0.05596287548542023, -0.18620681762695312, 0.012073430232703686, 0.06626921147108078, 0.11893711239099503, -0.03230800852179527, -0.1907416433095932, 0.05943656712770462, 0.026785511523485184, 0.037616532295942307, 0.040849462151527405, -0.3064509630203247, -0.12215588241815567, 0.06455099582672119, 0.12529923021793365, 0.07346075028181076, -0.10847620666027069, -0.044653262943029404, -0.05629489943385124, -0.11264292895793915, 0.09819440543651581, -0.015585771761834621, 0.13532574474811554, -0.039021506905555725, 0.04497949033975601, 0.03424727916717529, -0.04749371483922005, 0.07107838988304138, 0.02364748902618885, 0.1024821326136589, -0.031146632507443428, 0.025064649060368538, 0.1288527250289917, -0.026944797486066818, 0.16757047176361084, -0.13326367735862732, 0.10859193652868271, -0.2207862138748169, -0.06767398118972778, -0.06719259917736053, 0.010850749909877777, -0.032776787877082825, 
-0.038356613367795944, -0.06830272823572159, 0.03026290237903595, -0.0008757173200137913, -0.011026741936802864, 0.017747893929481506, -0.033470187336206436, -0.026983175426721573, 0.10908273607492447, 0.11022692173719406, -0.006286613177508116, -0.0938061997294426, 0.04484017938375473, 0.046513963490724564, 0.09106167405843735, -0.19216255843639374, 0.030705448240041733, 0.11518359929323196, 0.024342387914657593, 0.11259638518095016, 0.04945867881178856, -0.09644760936498642, 0.04588617384433746, 0.08824335038661957, -0.08122102171182632, -0.10157003998756409, -0.020500676706433296, -0.056481096893548965, -0.07579530775547028, 0.055993057787418365, 0.09382472932338715, -0.03588701784610748, -0.019329657778143883, -0.023421600461006165, -0.029010366648435593, -0.10492321848869324, 0.20637552440166473, 0.051481008529663086, 0.08388590812683105, -0.07079816609621048, 0.07332424819469452, 0.07879498600959778, -0.0850939154624939, 0.00805873703211546, 0.18576976656913757, -0.1123913899064064, -0.04427492618560791, 0.025754414498806, 0.16166625916957855, -0.04075559973716736, -0.05091365426778793, -0.12876580655574799, -0.08163151890039444, 0.038082048296928406, 0.157569020986557, 0.08355648070573807, 0.10265208780765533, -0.05058206990361214, 0.003487127833068371, -0.07262341678142548, 0.08093654364347458, 0.08228655159473419, 0.0302325077354908, -0.11645124852657318, 0.13836632668972015, 0.03390069678425789, 0.110423244535923, -0.029076801612973213, -0.009345068596303463, -0.11417757719755173, 0.056521009653806686, -0.10198026150465012, 0.03371264785528183, -0.013764492236077785, 0.048445653170347214, -0.02217404730618, -0.005843575112521648, -0.028386885300278664, 0.06639361381530762, -0.0860750824213028, -0.004230152815580368, -0.004139988217502832, 0.041650280356407166, -0.05024978145956993, -0.010273423977196217, 0.03554541990160942, -0.08372564613819122, 0.12461567670106888, -0.007375597488135099, -0.02848726697266102, 0.08283386379480362, -0.04561226814985275, 0.03761778399348259, 0.008066639304161072, 0.0596826896071434, 0.009613015688955784, 0.03478848189115524, 0.07742424309253693, 0.032738909125328064, 0.05345824360847473, 0.00812198780477047, 0.10570477694272995, -0.1332951933145523, -0.10184960812330246, -0.03155431151390076, -0.09724429249763489, -0.06327202916145325, 0.08904072642326355, 0.06387227028608322, 0.09855373203754425, 0.09810435026884079, -0.03034554235637188, 0.010996617376804352, -0.14220698177814484, -0.05421934276819229, 0.02694380283355713, -0.030818412080407143, -0.0970453992486, -0.0643112063407898, 0.04862375929951668, -0.026463977992534637, 0.14227700233459473, 0.001232560258358717, 0.049759507179260254, -0.020867332816123962, -0.04657323285937309, 0.027837464585900307, 0.028742507100105286, 0.23043528199195862, -0.06474251300096512, 0.03553656116127968, 0.004752428270876408, 0.011990496888756752, -0.007359766401350498, 0.11802761256694794, 0.11870498210191727, 0.14254999160766602, -0.015448424965143204, 0.09885548055171967, 0.021148644387722015, -0.010042006149888039, -0.10149354487657547, 0.022471390664577484, 0.0016037921886891127, 0.0659376010298729, -0.06747899204492569, 0.17092375457286835, 0.06630884855985641, -0.10858872532844543, 0.11129733175039291, 0.016489069908857346, -0.12722031772136688, -0.04393080994486809, -0.00016919930931180716, -0.02967582456767559, -0.13381752371788025, 0.02309727482497692, -0.12452984601259232, -0.013814393430948257, 0.07653417438268661, 0.04724897816777229, -0.06614971160888672, 0.1710641086101532, 
0.026218246668577194, -0.05917198210954666, 0.043950680643320084, 0.005358473397791386, 0.030266856774687767, 0.0378718301653862, 0.01887906901538372, 0.04319852963089943, -0.0246433075517416, 0.03844675049185753, 0.018822627142071724, -0.021306348964571953, -0.005243459716439247, -0.010156013071537018, 0.0066705867648124695, -0.03129353001713753, 0.02602429874241352, 0.0463021844625473, 0.15195104479789734, 0.028581494465470314, -0.07571112364530563, -0.0350450724363327, 0.17485150694847107, -0.045691292732954025, -0.0779571384191513, -0.13379769027233124, 0.1772179752588272, 0.03376610204577446, 0.02255050465464592, 0.021528908982872963, -0.09485477954149246, -0.04035944491624832, 0.19920594990253448, 0.08307360857725143, -0.039236586540937424, -0.029012782499194145, -0.004334471188485622, -0.0053088124841451645, -0.04464418441057205, 0.19962741434574127, 0.023144744336605072, 0.24929165840148926, 0.005307527258992195, -0.0077104344964027405, -0.060335446149110794, -0.03517969325184822, -0.007835007272660732, 0.1482950896024704, -0.045398980379104614, -0.01913439854979515, -0.0830201506614685, 0.006559460423886776, -0.0020163189619779587, -0.1053164005279541, 0.07400383800268173, -0.12582166492938995, -0.0985724925994873, -0.03641489893198013, 0.037580668926239014, -0.033717066049575806, 0.023485079407691956, -0.036226578056812286, 0.04866679385304451, 0.0650351494550705, -0.025120286270976067, -0.11780659109354019, -0.15542496740818024, 0.09945621341466904, -0.04715878516435623, 0.1466168314218521, -0.01113414391875267, 0.12390884011983871, 0.09361498057842255, 0.041783858090639114, -0.05772966146469116, 0.09456069767475128, 0.035249266773462296, 0.03274218738079071, 0.04598246514797211, 0.10941273719072342, -0.04538331925868988, 0.16827231645584106, -0.05608405917882919, -0.03718896955251694, -0.00939553789794445, -0.05612265318632126, -0.01597154699265957, -0.15656225383281708, -0.0036388132721185684, -0.10518910735845566, 0.10334645211696625, 0.19366343319416046, -0.04189149662852287, -0.02094712108373642, -0.09735273569822311, 0.09954144805669785, -0.02159303054213524, 0.05829824134707451, -0.035241346806287766, -0.18391960859298706, 0.004948424641042948, 0.008201594464480877, 0.009854146279394627, -0.25475940108299255, -0.017577126622200012, -0.03792446851730347, -0.019128713756799698, -0.0643555223941803, 0.15860667824745178, 0.09191709011793137, 0.05139285698533058, -0.034659478813409805, -0.14819280803203583, -0.03453236445784569, 0.05498791113495827, -0.12579821050167084, -0.13019829988479614 ]
null
null
transformers
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the source code summarization task for sql code snippets.

## Intended uses & limitations

The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/sql/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
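For readers who want to continue fine-tuning this checkpoint on their own sql data, a minimal sketch with the Hugging Face `Seq2SeqTrainer` might look like the following. The toy code/description pair, batch size, and step count are illustrative assumptions, not the original TPU recipe described above.

```python
# Hedged sketch of further fine-tuning; the toy pair and hyperparameters
# are illustrative assumptions, not the original TPU training recipe.
import torch
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

name = "SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

# A single made-up training pair; replace with the linked fine-tuning dataset.
pairs = [("select time ( col0 ) from tab0", "selects the time column of tab0")]

class SqlSummaryDataset(torch.utils.data.Dataset):
    def __len__(self):
        return len(pairs)

    def __getitem__(self, idx):
        code, description = pairs[idx]
        features = tokenizer(code, truncation=True, max_length=512)
        features["labels"] = tokenizer(description, truncation=True,
                                       max_length=512)["input_ids"]
        return features

args = Seq2SeqTrainingArguments(output_dir="sql_summarization",
                                per_device_train_batch_size=8, max_steps=100)
trainer = Seq2SeqTrainer(model=model, args=args,
                         train_dataset=SqlSummaryDataset(),
                         data_collator=DataCollatorForSeq2Seq(tokenizer, model=model))
trainer.train()
```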
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_sql_multitask_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization sql
==================================================

Pretrained model on programming language sql using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the source code summarization task for sql code snippets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Multi-task Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 61, 88, 111 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.09154251217842102, 0.0828026831150055, -0.0014620644506067038, 0.11068102717399597, 0.05059200897812843, 0.024473736062645912, 0.0523345060646534, 0.09279120713472366, -0.04811879247426987, 0.06460406631231308, 0.06868276000022888, -0.04740322008728981, 0.062265582382678986, 0.1754084825515747, 0.02661489136517048, -0.1804913729429245, -0.01882498525083065, 0.025438744574785233, -0.04940950870513916, 0.10426691919565201, 0.09787662327289581, -0.07921843230724335, 0.06401585042476654, -0.03378749266266823, -0.11051864176988602, 0.05872315168380737, -0.047926317900419235, -0.03923629969358444, 0.092373788356781, 0.07003936916589737, 0.11531650274991989, -0.026808543130755424, 0.0741913691163063, -0.2129015177488327, 0.0026237284764647484, 0.012213164009153843, 0.05195022001862526, 0.02297324873507023, 0.06598232686519623, 0.06442122906446457, 0.11976055055856705, -0.031491342931985855, 0.03651510924100876, 0.05126054212450981, -0.060723040252923965, -0.06163937225937843, -0.05499081686139107, 0.07754140347242355, 0.10819818079471588, 0.08875694125890732, -0.011839408427476883, 0.012881409376859665, -0.08675305545330048, 0.08578305691480637, 0.12028941512107849, -0.228554829955101, -0.026354709640145302, 0.0999947190284729, 0.08565051853656769, 0.05886058136820793, -0.08072154223918915, -0.03657947853207588, 0.09793294966220856, 0.03817827254533768, 0.037999723106622696, -0.08198018372058868, -0.06464490294456482, 0.009422016330063343, -0.05740189179778099, -0.052789971232414246, 0.13755322992801666, 0.05112593621015549, -0.050520963966846466, -0.0888529047369957, -0.04452141374349594, -0.18480873107910156, 0.04510686174035072, 0.023298319429159164, 0.017556721344590187, -0.011811232194304466, 0.019025513902306557, -0.008939818479120731, -0.10014919191598892, -0.10263102501630783, -0.012506959959864616, 0.06454690545797348, 0.07127974927425385, 0.026013854891061783, -0.011648344807326794, 0.0747019425034523, -0.016980990767478943, -0.05058177560567856, -0.03468335419893265, 0.01124265231192112, -0.13703908026218414, 0.022848548367619514, -0.0294504314661026, -0.05969442054629326, -0.006713785231113434, 0.10212480276823044, -0.07615219801664352, 0.0816771388053894, 0.12751887738704681, 0.0024711336009204388, -0.002546326955780387, 0.23114129900932312, 0.03954363614320755, -0.146917924284935, -0.0014583287993445992, 0.026827778667211533, -0.0045005688443779945, 0.0007594591588713229, -0.06560366600751877, -0.03941318020224571, 0.0017458123620599508, 0.056497666984796524, -0.13747155666351318, 0.016762010753154755, -0.04066869243979454, -0.008906718343496323, 0.051302775740623474, -0.13621339201927185, 0.031086603179574013, 0.011130519211292267, -0.05922381952404976, -0.063523069024086, 0.06357527524232864, -0.11314741522073746, -0.1130264475941658, 0.022007213905453682, -0.050306010991334915, -0.02994462475180626, -0.11964374035596848, -0.11615248769521713, -0.014772774651646614, -0.04787307605147362, 0.016904523596167564, -0.09868445992469788, -0.10966309905052185, -0.01196768693625927, 0.03379642218351364, -0.008463929407298565, -0.020146925002336502, -0.049958765506744385, 0.01802743226289749, -0.004029790870845318, -0.030654562637209892, 0.011274873279035091, -0.04286358132958412, 0.0852767750620842, 0.09923214465379715, 0.04836646094918251, 0.004717998206615448, 0.02898680791258812, -0.06897694617509842, 0.06733842939138412, -0.07062336802482605, 0.07623252272605896, -0.026492564007639885, 0.054289910942316055, -0.09443721920251846, -0.0730743482708931, 0.03942638263106346, 
0.05795317888259888, 0.05867188423871994, 0.021314967423677444, -0.12123715877532959, 0.021759748458862305, 0.1521325558423996, -0.09559576213359833, -0.13020233809947968, 0.11518416553735733, -0.007309882901608944, -0.004143008030951023, 0.06725157797336578, 0.12673167884349823, 0.14119784533977509, -0.08470258116722107, -0.0452808178961277, 0.0897301733493805, 0.07290460914373398, -0.05164172500371933, 0.10514117777347565, 0.013328970409929752, 0.04170208424329758, 0.02297963947057724, 0.04312099143862724, 0.06420768052339554, -0.00337833515368402, -0.0416143573820591, -0.022549746558070183, -0.08431629091501236, -0.03471074625849724, -0.015755172818899155, 0.02534734643995762, -0.06343104690313339, -0.06923995912075043, 0.03176937252283096, 0.16954518854618073, -0.09278815239667892, 0.027933644130825996, -0.09855981916189194, -0.0649634525179863, -0.083927683532238, 0.011557617224752903, -0.1167304515838623, 0.0038443845696747303, 0.038374945521354675, -0.06320998072624207, 0.07484380155801773, 0.0796283632516861, 0.002756549511104822, 0.03185240179300308, -0.04573066532611847, -0.03734351694583893, -0.04388744756579399, -0.053987231105566025, -0.126591756939888, -0.017749615013599396, -0.09023557603359222, -0.028622472658753395, -0.0493641197681427, -0.16865980625152588, 0.01725780963897705, -0.025607379153370857, 0.024066133424639702, -0.005285249091684818, -0.01699184626340866, 0.028825031593441963, 0.051877524703741074, -0.05376583710312843, -0.08405236154794693, 0.008799239061772823, 0.027629883959889412, -0.12578372657299042, -0.047872938215732574, -0.10924039781093597, -0.05246453359723091, 0.07042653113603592, 0.10485391318798065, -0.07947599142789841, -0.006004314869642258, -0.024514704942703247, -0.056578636169433594, -0.04292604699730873, -0.07395627349615097, 0.17099551856517792, 0.009408117271959782, 0.16416415572166443, -0.1346484273672104, -0.047641754150390625, -0.03963518887758255, -0.013365764170885086, 0.01682554930448532, 0.15622581541538239, 0.007402618415653706, -0.1054554209113121, 0.049094099551439285, -0.017499348148703575, -0.06512553989887238, 0.15025441348552704, -0.009199791587889194, -0.09094174951314926, 0.012389675714075565, 0.0929826945066452, -0.01651911623775959, 0.14683207869529724, -0.0695614144206047, -0.0054568639025092125, -0.002457637106999755, 0.0274493545293808, 0.04382063075900078, -0.13204389810562134, 0.025774989277124405, 0.059609562158584595, -0.0672031044960022, -0.06405580043792725, -0.04307655617594719, -0.03918572887778282, 0.03148408979177475, 0.007407671306282282, 0.004946074914187193, -0.008931816555559635, -0.025728896260261536, -0.09302596002817154, 0.2107633799314499, -0.09012730419635773, -0.20742470026016235, -0.17249709367752075, 0.026128675788640976, -0.053223855793476105, -0.002573499921709299, 0.054857563227415085, -0.11937012523412704, -0.06589929759502411, -0.0902254730463028, 0.12955430150032043, -0.11229322850704193, 0.007114054169505835, -0.005278246942907572, 0.029634268954396248, 0.03656790405511856, -0.17544963955879211, 0.033484432846307755, -0.007895578630268574, 0.0020277095027267933, -0.0018356898799538612, -0.06585495173931122, 0.0939486026763916, 0.12469656020402908, -0.07650723308324814, 0.015033864416182041, -0.01095624640583992, 0.17233800888061523, -0.048087794333696365, 0.01928561180830002, 0.1888997107744217, 0.011212586425244808, 0.03875650838017464, 0.05716986581683159, 0.016253195703029633, -0.0887388363480568, 0.06008635461330414, 0.05581040307879448, -0.03432440012693405, -0.2565089762210846, 
-0.0012735213385894895, -0.07158835232257843, 0.03817528486251831, 0.10770319402217865, 0.05270740017294884, -0.14583081007003784, 0.031193621456623077, -0.00922648049890995, 0.13300195336341858, -0.03708644211292267, 0.0515451580286026, 0.02336130291223526, 0.021835172548890114, 0.01197417639195919, -0.0970577746629715, 0.010800149291753769, 0.07690544426441193, 0.12091612815856934, 0.21052898466587067, -0.05611661076545715, 0.19327130913734436, 0.018363259732723236, 0.06430202722549438, 0.030493978410959244, 0.10016731917858124, -0.1185557022690773, 0.0009174555889330804, 0.004300450440496206, -0.005802596919238567, -0.07438589632511139, 0.05851764231920242, -0.027090970426797867, 0.06512299180030823, -0.05600609630346298, 0.04779187589883804, 0.01803089678287506, 0.16874119639396667, 0.055007193237543106, -0.18468190729618073, -0.12286395579576492, 0.02019309625029564, -0.10916463285684586, -0.10817443579435349, 0.06445091217756271, 0.19317901134490967, -0.04365788400173187, 0.020321892574429512, -0.004952484741806984, 0.13894672691822052, -0.061784546822309494, -0.019586848095059395, 0.024160020053386688, 0.07218793779611588, 0.009122343733906746, 0.1296764612197876, -0.2523112893104553, 0.09324072301387787, 0.0169187281280756, 0.0906737819314003, -0.010289614088833332, 0.07471197098493576, -0.027645396068692207, 0.0030364422127604485, 0.07292154431343079, -0.00024535556440241635, -0.0929822027683258, -0.1987910121679306, -0.052412014454603195, 0.03125849738717079, 0.05490906164050102, -0.007527131587266922, 0.0972246378660202, -0.01047189999371767, 0.06587427109479904, -0.026146449148654938, -0.11665762960910797, -0.05629920959472656, -0.14255118370056152, -0.027539847418665886, -0.010116194374859333, -0.020104210823774338, -0.02941936068236828, 0.02189636416733265, -0.01547306589782238, 0.20839302241802216, -0.15638738870620728, -0.10526423901319504, -0.0931033343076706, 0.0846334844827652, 0.1293211281299591, -0.10614297538995743, 0.019887128844857216, 0.015299730934202671, 0.05298353731632233, -0.03360813483595848, -0.05568644776940346, 0.02416801080107689, -0.05502762645483017, -0.0691828653216362, -0.026658622547984123, 0.08589780330657959, -0.01418857928365469, 0.0515134371817112, 0.0034700718242675066, -0.08518233895301819, -0.05157369002699852, -0.13264964520931244, -0.08427312970161438, -0.023971129208803177, 0.02259492687880993, 0.00827812496572733, -0.09456741809844971, 0.06977053731679916, -0.011132695712149143, -0.09246563911437988, 0.08259573578834534, 0.1919819414615631, -0.0641627162694931, 0.00928359106183052, 0.11063366383314133, -0.053631775081157684, -0.16061323881149292, -0.06289476901292801, 0.056729916483163834, 0.08943445235490799, -0.014083446934819221, -0.1479319930076599, 0.08230207860469818, 0.03929617255926132, 0.02796570397913456, -0.011300330050289631, -0.2694115936756134, -0.12483196705579758, 0.0506051667034626, 0.0646161139011383, 0.03820977360010147, -0.12502482533454895, -0.046145737171173096, -0.06110933795571327, -0.08983268588781357, 0.031154992058873177, 0.06002664566040039, 0.13138949871063232, -0.03206096217036247, 0.03240109980106354, 0.025676831603050232, -0.024937493726611137, 0.10258985310792923, 0.009861449711024761, 0.09061947464942932, -0.01528171170502901, 0.026423033326864243, 0.05780934542417526, -0.06511281430721283, 0.16785046458244324, -0.1656239926815033, 0.08308491110801697, -0.22754958271980286, -0.05221930891275406, -0.006195236463099718, -0.015583507716655731, -0.037939392030239105, -0.05187785625457764, -0.09784722328186035, 
0.010730915702879429, 0.05464022606611252, -0.024614375084638596, 0.07604324072599411, -0.03214995563030243, -0.0515095591545105, 0.056011226028203964, 0.08918942511081696, -0.026651475578546524, -0.1426633596420288, 0.01794923096895218, 0.031668491661548615, 0.07706612348556519, -0.20523053407669067, 0.021414801478385925, 0.12155254185199738, 0.01072645839303732, 0.10906907916069031, 0.016240112483501434, -0.0635855570435524, 0.05012543126940727, 0.07149818539619446, -0.02818853221833706, -0.09932327270507812, -0.007129085715860128, -0.027492467314004898, -0.10220543295145035, 0.036951497197151184, 0.08408832550048828, -0.04251321032643318, -0.019013848155736923, -0.0102649861946702, 0.0016639006789773703, -0.06595095992088318, 0.19772867858409882, 0.023680390790104866, 0.08933234214782715, -0.058670539408922195, 0.07737763226032257, 0.1020311564207077, -0.10856357216835022, 0.010806710459291935, 0.16629721224308014, -0.0815102681517601, -0.021972928196191788, 0.05242671072483063, 0.0927216112613678, -0.06941471993923187, -0.06358203291893005, -0.1042497456073761, -0.0765463188290596, 0.020099937915802002, 0.03324529901146889, 0.06995789706707001, 0.07062916457653046, -0.033613383769989014, 0.02240714430809021, -0.08243180066347122, 0.1020551323890686, 0.07454036921262741, 0.052423231303691864, -0.1378694474697113, 0.13596759736537933, 0.030711105093359947, 0.09289035201072693, 0.003776251571252942, 0.03737420216202736, -0.09544408321380615, 0.04195336997509003, -0.061788465827703476, 0.04232769086956978, -0.014238027855753899, 0.05297430604696274, -0.024906042963266373, 0.020553017035126686, -0.031819917261600494, 0.05050477758049965, -0.04395183175802231, -0.02856111153960228, -0.026985041797161102, 0.04417550936341286, -0.056678541004657745, -0.015065896324813366, 0.007931509986519814, -0.07368657737970352, 0.10003205388784409, -0.06498219817876816, -0.005521215032786131, 0.0017460724338889122, 0.004385401029139757, 0.061960287392139435, 0.015556137077510357, 0.05918349698185921, -0.008384289219975471, 0.0005806659464724362, 0.041174426674842834, 0.017457367852330208, -0.005637579131871462, -0.007758029270917177, 0.044746991246938705, -0.13604451715946198, -0.0893421322107315, -0.09936782717704773, -0.06163044273853302, -0.06964851915836334, 0.07758288085460663, 0.0909501388669014, 0.07411294430494308, 0.09781739860773087, -0.027019275352358818, -0.00661390321329236, -0.14334475994110107, -0.034412577748298645, 0.051009830087423325, -0.010153287090361118, -0.10621760785579681, -0.05686284229159355, 0.05505286902189255, -0.042809586971998215, 0.12523533403873444, -0.016009580343961716, 0.04902061074972153, -0.005280097480863333, -0.04770950973033905, -0.0021325426641851664, -0.0029877691995352507, 0.20782609283924103, -0.08666107803583145, 0.017063792794942856, 0.016845019534230232, -0.004557944368571043, 0.03149772807955742, 0.1395310014486313, 0.10662363469600677, 0.13402597606182098, 0.059058234095573425, 0.10947541147470474, -0.046414393931627274, -0.031117361038923264, -0.17199395596981049, 0.04831770807504654, -0.0015792528865858912, 0.02818988263607025, -0.026903726160526276, 0.09248611330986023, 0.1438523679971695, -0.13117273151874542, 0.0943504273891449, 0.010936986654996872, -0.09872052073478699, -0.05060158297419548, -0.04756668582558632, -0.05046577751636505, -0.09200723469257355, 0.016657717525959015, -0.11396265774965286, 0.022235490381717682, 0.10002979636192322, 0.03790883347392082, -0.015246001072227955, 0.15228261053562164, -0.020395589992403984, -0.0600174181163311, 
-0.002718766685575247, 0.02298002503812313, 0.05411185696721077, 0.10559923201799393, 0.010665640234947205, 0.08180089294910431, -0.0552193708717823, 0.07395239174365997, 0.017711583524942398, 0.019892819225788116, 0.01264683436602354, -0.006603606510907412, 0.00010783080506371334, -0.048065926879644394, 0.005711113102734089, 0.08407273143529892, 0.16156107187271118, 0.0404103621840477, -0.04642819985747337, -0.04450264945626259, 0.17616984248161316, -0.05661015585064888, -0.08221124857664108, -0.1245783269405365, 0.1600201278924942, 0.0583527609705925, 0.02732006274163723, 0.010009437799453735, -0.08624517917633057, -0.04153521731495857, 0.22069445252418518, 0.014180905185639858, -0.035802796483039856, -0.03954010456800461, -0.005359350703656673, -0.005653776694089174, -0.034897539764642715, 0.1396804004907608, 0.021871428936719894, 0.21721121668815613, -0.0032087834551930428, -0.01676916517317295, -0.03804286569356918, -0.033066969364881516, -0.035590726882219315, 0.19154272973537445, -0.033280033618211746, 0.04002533480525017, -0.0881696417927742, -0.007744872942566872, 0.036588381975889206, -0.12179826945066452, 0.09736667573451996, -0.09573082625865936, -0.07970256358385086, 0.030069813132286072, 0.08836933225393295, -0.02360326424241066, 0.029693603515625, -0.01635950617492199, 0.05103789642453194, 0.036074429750442505, -0.028038108721375465, -0.09633564949035645, -0.12402022629976273, 0.06251043826341629, -0.01678895764052868, 0.1515401005744934, 0.023806463927030563, 0.08264638483524323, 0.09533121436834335, 0.019321991130709648, -0.0697517842054367, 0.11203453689813614, 0.03619154542684555, 0.007271270267665386, 0.07633877545595169, 0.11110913753509521, -0.038484081625938416, 0.1506391316652298, 0.005744252819567919, -0.025950564071536064, -0.030874336138367653, -0.003658684901893139, -0.01126422081142664, -0.14599111676216125, 0.0058791241608560085, -0.06546729803085327, 0.13345757126808167, 0.1880459040403366, -0.04543383792042732, -0.021709199994802475, -0.03480532392859459, 0.07384199649095535, -0.0185116957873106, 0.09675537794828415, -0.007410501595586538, -0.16439972817897797, 0.02219981513917446, -0.02758258953690529, 0.014813483692705631, -0.17844656109809875, -0.05120709165930748, -0.03326022997498512, -0.02933425083756447, -0.07618168741464615, 0.14315329492092133, 0.08880453556776047, 0.028683189302682877, -0.04654274135828018, -0.20415706932544708, -0.02836381271481514, 0.043586574494838715, -0.14353623986244202, -0.12324782460927963 ]
null
null
transformers
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

## Model description

This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for sql code snippets.

## Intended uses & limitations

The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_source_code_summarization_sql_transfer_learning_finetune", skip_special_tokens=True),
    device=0
)

tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```

Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/sql/small_model.ipynb).

## Training data

The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)

## Training procedure

### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.

## Evaluation results

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |

> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
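To make the optimizer description above concrete, here is a small sketch of Adafactor paired with an inverse square root learning-rate schedule in PyTorch. The warmup length and base learning rate are assumed values for illustration; the exact CodeTrans hyperparameters are not given in this card.

```python
# Hedged sketch of Adafactor + inverse square root schedule; warmup_steps
# and the base lr are assumptions, not the original CodeTrans settings.
import torch
from transformers.optimization import Adafactor

def inverse_sqrt(step: int, warmup_steps: int = 10_000) -> float:
    # Hold the multiplier flat during warmup, then decay as 1/sqrt(step).
    return max(step, warmup_steps) ** -0.5

model = torch.nn.Linear(512, 512)  # stand-in for the seq2seq model
optimizer = Adafactor(model.parameters(), lr=1.0, relative_step=False,
                      scale_parameter=False, warmup_init=False)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, inverse_sqrt)

for _ in range(3):  # one scheduler step per optimizer step
    loss = model(torch.randn(4, 512)).pow(2).mean()
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()
```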
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
summarization
SEBIS/code_trans_t5_small_source_code_summarization_sql_transfer_learning_finetune
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "summarization", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
CodeTrans model for source code summarization sql
==================================================

Pretrained model on programming language sql using the t5 small model architecture. It was first released in this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.

Model description
-----------------

This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for sql code snippets.

Intended uses & limitations
---------------------------

The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.

### How to use

Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:

Run this example in colab notebook.

Training data
-------------

The supervised training tasks datasets can be downloaded on Link

Training procedure
------------------

### Transfer-learning Pretraining

The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Fine-tuning

This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.

Evaluation results
------------------

For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):

Test results :

> 
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------", "### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ 46, 61, 87, 111 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>" ]
[ -0.0846143364906311, 0.08351612091064453, -0.0011120449053123593, 0.1107567623257637, 0.05933993309736252, 0.02009854093194008, 0.03293534368276596, 0.09732573479413986, -0.04513591527938843, 0.06454765796661377, 0.06492725014686584, -0.049294427037239075, 0.062062818557024, 0.17973099648952484, 0.02514653094112873, -0.19090420007705688, -0.020182505249977112, 0.02655603364109993, -0.0529620498418808, 0.10233259946107864, 0.09094636142253876, -0.07841898500919342, 0.06943462044000626, -0.042451296001672745, -0.0947900265455246, 0.059905845671892166, -0.04048989340662956, -0.040750652551651, 0.09274813532829285, 0.06653717905282974, 0.11505603045225143, -0.03326236456632614, 0.07285448908805847, -0.21685273945331573, -0.00042167602805420756, 0.026060760021209717, 0.05332329869270325, 0.024326827377080917, 0.05606267228722572, 0.05945904552936554, 0.1286911815404892, -0.03551195189356804, 0.040203142911195755, 0.053030695766210556, -0.06733806431293488, -0.07794816792011261, -0.04930463433265686, 0.056474920362234116, 0.09768037497997284, 0.09834836423397064, -0.013158505782485008, 0.009096984751522541, -0.0853177160024643, 0.08393198996782303, 0.10995114594697952, -0.22400707006454468, -0.017096471041440964, 0.10696539282798767, 0.09455308318138123, 0.06386681646108627, -0.08472944051027298, -0.03394662216305733, 0.09632919728755951, 0.03200601413846016, 0.05390702933073044, -0.08426148444414139, -0.06516359001398087, 0.002607043134048581, -0.05772566422820091, -0.05152292549610138, 0.13928253948688507, 0.039613865315914154, -0.046935390681028366, -0.09743586927652359, -0.053146760910749435, -0.19266286492347717, 0.04140461981296539, 0.016636274755001068, 0.02335762418806553, 0.0028779874555766582, 0.0012983339838683605, -0.01320669800043106, -0.09400346875190735, -0.10143222659826279, -0.012286358512938023, 0.045416638255119324, 0.06599671393632889, 0.029999472200870514, -0.018320994451642036, 0.07488790154457092, -0.030656568706035614, -0.050216611474752426, -0.031662892550230026, 0.01591232232749462, -0.12873223423957825, 0.028529660776257515, -0.023702170699834824, -0.057304929941892624, -0.007508012466132641, 0.10186384618282318, -0.06948184221982956, 0.08379489183425903, 0.12590068578720093, 0.004650319926440716, -0.0031303430441766977, 0.24490579962730408, 0.042140763252973557, -0.17158061265945435, 0.011420240625739098, 0.016487738117575645, 0.0014113537035882473, -0.001823234255425632, -0.071853406727314, -0.04403955489397049, 0.001781637780368328, 0.06100749596953392, -0.13068722188472748, 0.02844027429819107, -0.040218714624643326, 0.00046996379387564957, 0.040092505514621735, -0.13150976598262787, 0.037364743649959564, 0.008172191679477692, -0.06352787464857101, -0.047117382287979126, 0.06738847494125366, -0.11592354625463486, -0.11706522852182388, 0.022021569311618805, -0.04792885109782219, -0.035365138202905655, -0.12899307906627655, -0.12571050226688385, -0.01948400028049946, -0.05464014410972595, 0.011176223866641521, -0.09886696189641953, -0.11217223107814789, -0.0121403643861413, 0.02991735376417637, -0.001838355208747089, -0.015836479142308235, -0.06383151561021805, 0.008412779308855534, 0.0036402945406734943, -0.026065906509757042, 0.01617756485939026, -0.04489315673708916, 0.08209492266178131, 0.08972188085317612, 0.050956640392541885, -0.0034032880794256926, 0.028459744527935982, -0.0841950997710228, 0.06844018399715424, -0.07217979431152344, 0.08251480758190155, -0.021084878593683243, 0.05189727991819382, -0.09915932267904282, -0.07643470168113708, 
0.02833470143377781, 0.057036787271499634, 0.0651712566614151, 0.022493554279208183, -0.131845623254776, 0.02170942723751068, 0.14803390204906464, -0.1047600582242012, -0.11465388536453247, 0.11524298787117004, -0.006173702888190746, 0.005210965406149626, 0.07825903594493866, 0.12937946617603302, 0.1468261480331421, -0.08195794373750687, -0.0424746610224247, 0.09124567359685898, 0.051533907651901245, -0.059470098465681076, 0.09164196252822876, 0.012306025251746178, 0.0406002514064312, 0.029863934963941574, 0.044151537120342255, 0.058859020471572876, 0.0009254986071027815, -0.03805522993206978, -0.028083428740501404, -0.07966838032007217, -0.035889629274606705, -0.009513403289020061, 0.018393296748399734, -0.060233909636735916, -0.0675918310880661, 0.018097197636961937, 0.15890705585479736, -0.09901648014783859, 0.03064180351793766, -0.08909864723682404, -0.05982514098286629, -0.07077369093894958, 0.017814183607697487, -0.10977096110582352, 0.0026690559461712837, 0.04630884528160095, -0.04906739294528961, 0.0711703822016716, 0.07692936062812805, 0.0032845304813236, 0.02173428051173687, -0.04763498902320862, -0.043448612093925476, -0.03386153280735016, -0.060808293521404266, -0.12380116432905197, -0.00890751089900732, -0.0801977887749672, -0.025151068344712257, -0.0580701120197773, -0.16566868126392365, 0.010451632551848888, -0.02592061646282673, 0.015255449339747429, 0.001441332744434476, -0.017569752410054207, 0.019825682044029236, 0.053514622151851654, -0.049202967435121536, -0.0898532122373581, 0.012093770317733288, 0.029529739171266556, -0.12479722499847412, -0.032748427242040634, -0.11453410983085632, -0.05578618124127388, 0.07412063330411911, 0.10268498212099075, -0.08516451716423035, -0.008186050690710545, -0.01919347606599331, -0.05205274745821953, -0.04878212884068489, -0.07352028787136078, 0.19619697332382202, 0.00461433595046401, 0.16365690529346466, -0.1319454461336136, -0.05188658460974693, -0.04216673970222473, -0.010041549801826477, 0.02153484895825386, 0.15317222476005554, 0.0009653379675000906, -0.10177411139011383, 0.051932040601968765, -0.02741635963320732, -0.06866315752267838, 0.15366658568382263, -0.003932322841137648, -0.08446798473596573, 0.006650188472121954, 0.08362731337547302, -0.012380276806652546, 0.14535795152187347, -0.07098958641290665, -0.006064219865947962, -0.005339957773685455, 0.027177294716238976, 0.046697892248630524, -0.13065651059150696, 0.03237786889076233, 0.054491981863975525, -0.06718031316995621, -0.06444977968931198, -0.04653364419937134, -0.042785052210092545, 0.03289880231022835, 0.012436537072062492, 0.011634080670773983, -0.016671372577548027, -0.029774991795420647, -0.09289643913507462, 0.2109306901693344, -0.09482047706842422, -0.21579228341579437, -0.1760869026184082, 0.046447206288576126, -0.03448300436139107, -0.00015597778838127851, 0.04527731239795685, -0.11235152930021286, -0.07705086469650269, -0.09806729108095169, 0.13437797129154205, -0.12873472273349762, 0.0028586818370968103, -0.02890288084745407, 0.04122347757220268, 0.03145486116409302, -0.176368847489357, 0.0290203969925642, -0.01209992915391922, -0.007840616628527641, -0.00619723042473197, -0.061459511518478394, 0.08921629190444946, 0.12402738630771637, -0.07504107803106308, 0.012390264309942722, -0.016848960891366005, 0.15349549055099487, -0.053550925105810165, 0.026605144143104553, 0.18200664222240448, 0.014592291787266731, 0.04115357995033264, 0.062005579471588135, 0.00687393918633461, -0.08820383250713348, 0.06348440051078796, 0.05100776627659798, -0.04157595708966255, 
-0.2420005202293396, -0.009040666744112968, -0.07273419946432114, 0.04612594470381737, 0.11244957149028778, 0.04744265228509903, -0.15109723806381226, 0.028744544833898544, -0.00866785366088152, 0.13519719243049622, -0.026489490643143654, 0.054412610828876495, 0.028297722339630127, 0.014341719448566437, 0.016126496717333794, -0.09498089551925659, 0.006610379554331303, 0.0721002146601677, 0.1111513078212738, 0.20958520472049713, -0.06664571911096573, 0.2011246383190155, 0.007115774787962437, 0.07663353532552719, 0.04025605320930481, 0.09620718657970428, -0.11679248511791229, 0.008062229491770267, 0.003962187562137842, -0.004847644362598658, -0.07232052832841873, 0.05481509864330292, -0.028018072247505188, 0.06463483721017838, -0.05977247282862663, 0.04302104562520981, 0.016968274489045143, 0.16378770768642426, 0.05622619017958641, -0.19035327434539795, -0.11512778699398041, 0.017558559775352478, -0.10317317396402359, -0.10412565618753433, 0.06777727603912354, 0.2048516422510147, -0.04680488631129265, 0.01696610637009144, -0.0025213684421032667, 0.13775330781936646, -0.08208590745925903, -0.026680374518036842, 0.01804845780134201, 0.08647634834051132, 0.007980878464877605, 0.12490330636501312, -0.25921061635017395, 0.0833420604467392, 0.015415379777550697, 0.09634904563426971, -0.009898853488266468, 0.07128169387578964, -0.036012712866067886, -0.010408223606646061, 0.07219742983579636, 0.0009463885799050331, -0.10299272835254669, -0.20581895112991333, -0.050495002418756485, 0.02937101386487484, 0.0638459101319313, -0.00403680419549346, 0.09567423164844513, -0.022692915052175522, 0.06123993173241615, -0.01819266565144062, -0.12559249997138977, -0.04842893034219742, -0.1469995677471161, -0.04059436917304993, 0.0008099323604255915, -0.013245080597698689, -0.02815452218055725, 0.03358326107263565, 0.006796746049076319, 0.215068981051445, -0.16163040697574615, -0.09786356985569, -0.093072809278965, 0.08518541604280472, 0.13532012701034546, -0.10778976231813431, 0.03179308399558067, 0.020665612071752548, 0.05312827602028847, -0.03415260836482048, -0.0648799017071724, 0.030793700367212296, -0.051921870559453964, -0.0675843358039856, -0.02606143057346344, 0.0938071459531784, -0.009301772341132164, 0.04792004078626633, 0.008093936368823051, -0.07498214393854141, -0.04957279562950134, -0.13144464790821075, -0.08828043192625046, 0.00416075112298131, 0.02620682120323181, 0.014849597588181496, -0.09759078919887543, 0.06557944416999817, -0.011704587377607822, -0.08523281663656235, 0.0899202823638916, 0.16825279593467712, -0.06829258799552917, 0.009454192593693733, 0.1071738675236702, -0.06307823210954666, -0.1743660867214203, -0.05386471375823021, 0.052886713296175, 0.08569180965423584, -0.03243524208664894, -0.14051763713359833, 0.0733080729842186, 0.036869049072265625, 0.03376209735870361, 0.0017567770555615425, -0.28553450107574463, -0.12696154415607452, 0.03849617764353752, 0.06414567679166794, 0.047592636197805405, -0.11148706078529358, -0.04374375939369202, -0.06794334948062897, -0.05872534587979317, 0.04325041174888611, 0.06648565828800201, 0.12802749872207642, -0.02956043928861618, 0.029029343277215958, 0.03221152722835541, -0.025427592918276787, 0.08605948835611343, 0.0027650659903883934, 0.09368928521871567, -0.020069614052772522, 0.022160619497299194, 0.07544466853141785, -0.061691731214523315, 0.1698361337184906, -0.16794350743293762, 0.08484789729118347, -0.1967223733663559, -0.053478166460990906, -0.005470274016261101, -0.006998520344495773, -0.03237377107143402, -0.054987095296382904, 
-0.11167027801275253, 0.012127550318837166, 0.05867884308099747, -0.02847028523683548, 0.05950445681810379, -0.029575733467936516, -0.0593714714050293, 0.05327065661549568, 0.09068984538316727, -0.018474193289875984, -0.1368100643157959, 0.03058118186891079, 0.02958974987268448, 0.08288079500198364, -0.18855591118335724, 0.019341465085744858, 0.12253142893314362, 0.013369265012443066, 0.10825803875923157, 0.019778961315751076, -0.06478716433048248, 0.04691033437848091, 0.06637749820947647, -0.023319894447922707, -0.10548623651266098, -0.007335101254284382, -0.03989362716674805, -0.09523999691009521, 0.028076544404029846, 0.08115233480930328, -0.05080530047416687, -0.013223840855062008, -0.01128940749913454, 0.005278961732983589, -0.06005267798900604, 0.19481809437274933, 0.01778372935950756, 0.07895071804523468, -0.05948389321565628, 0.0877581387758255, 0.09693768620491028, -0.12008361518383026, 0.014546383172273636, 0.1654079109430313, -0.08274534344673157, -0.01850210130214691, 0.06699523329734802, 0.10035434365272522, -0.05958491191267967, -0.05868648365139961, -0.09465152770280838, -0.07260477542877197, 0.01723932847380638, 0.0357508659362793, 0.06564418971538544, 0.07453903555870056, -0.041976697742938995, 0.02306848205626011, -0.0924714133143425, 0.09865298867225647, 0.07541566342115402, 0.0476786345243454, -0.13272246718406677, 0.1499512642621994, 0.034570131450891495, 0.06856977939605713, 0.002798523288220167, 0.030314160510897636, -0.10268977284431458, 0.03983486443758011, -0.020303579047322273, 0.051998578011989594, -0.01156883966177702, 0.049953170120716095, -0.030465558171272278, 0.020335081964731216, -0.02933497168123722, 0.04936894774436951, -0.03543521836400032, -0.0359208770096302, -0.03260009363293648, 0.03554322198033333, -0.05797824636101723, -0.019547631964087486, 0.005492187570780516, -0.0698780044913292, 0.09550327062606812, -0.059914108365774155, -0.003056125482544303, 0.006306730676442385, -0.0013132973108440638, 0.06497608125209808, 0.026418251916766167, 0.05697660520672798, -0.012946495786309242, 0.004956044722348452, 0.03841255232691765, 0.018880046904087067, -0.01073200162500143, -0.010104549117386341, 0.04785371571779251, -0.1408596783876419, -0.09195325523614883, -0.09476296603679657, -0.05928924307227135, -0.07004734873771667, 0.0749979242682457, 0.09404323995113373, 0.07933397591114044, 0.10339980572462082, -0.03643898293375969, 0.002635140670463443, -0.1436159908771515, -0.03246157243847847, 0.05360976606607437, -0.013294524513185024, -0.09962591528892517, -0.04908179119229317, 0.06047718971967697, -0.04157546907663345, 0.1279810518026352, -0.0060735042206943035, 0.05310202017426491, -0.012711925432085991, -0.035378046333789825, -0.002501433715224266, 0.0015001188730821013, 0.21784481406211853, -0.07886210829019547, 0.009096999652683735, 0.007916972041130066, 0.008580226451158524, 0.04081778600811958, 0.1453862190246582, 0.09980789572000504, 0.12446355819702148, 0.0649304911494255, 0.1088363453745842, -0.05601844564080238, -0.026787232607603073, -0.16175433993339539, 0.05792107805609703, -0.01593727059662342, 0.0263562873005867, -0.03234982490539551, 0.10867533087730408, 0.13292793929576874, -0.1354018747806549, 0.10321895033121109, 0.01955150067806244, -0.10116568207740784, -0.05142633989453316, -0.08631076663732529, -0.053772151470184326, -0.09497571736574173, 0.009567264467477798, -0.1162499263882637, 0.02148464135825634, 0.08141537010669708, 0.03304154425859451, -0.022062363103032112, 0.14151756465435028, -0.028141381219029427, -0.07165833562612534, 
0.010468874126672745, 0.03196614608168602, 0.04937747120857239, 0.10815779119729996, 0.012956266291439533, 0.07481611520051956, -0.06260698288679123, 0.07448340207338333, 0.022782592102885246, 0.016684649512171745, 0.013761904090642929, -0.004004672169685364, 0.00647031981498003, -0.05101560056209564, 0.009660963900387287, 0.07929474860429764, 0.15511471033096313, 0.04589681327342987, -0.055304985493421555, -0.04412804916501045, 0.19258053600788116, -0.05832330882549286, -0.07838360965251923, -0.12145181000232697, 0.16845247149467468, 0.05706987902522087, 0.025899462401866913, 0.00871778279542923, -0.0859324112534523, -0.03541681915521622, 0.22798782587051392, 0.029477983713150024, -0.03135555982589722, -0.03688041865825653, -0.014676759019494057, -0.009412736631929874, -0.024851888418197632, 0.14560426771640778, 0.020620005205273628, 0.22984954714775085, -0.002594336634501815, -0.014877350069582462, -0.037542685866355896, -0.03433335945010185, -0.03134351968765259, 0.20355269312858582, -0.03884561359882355, 0.03431369736790657, -0.09511748701334, -0.011060169897973537, 0.02359861321747303, -0.11155672371387482, 0.09377126395702362, -0.1075546070933342, -0.06974142789840698, 0.023817235603928566, 0.08135740458965302, -0.025355305522680283, 0.034130800515413284, -0.008725181221961975, 0.06135987862944603, 0.02930094674229622, -0.022586766630411148, -0.10992848873138428, -0.1307326853275299, 0.042787160724401474, -0.014252525754272938, 0.14184533059597015, 0.019935673102736473, 0.0715874508023262, 0.09540857374668121, 0.02843543142080307, -0.06982598453760147, 0.11409809440374374, 0.027973603457212448, 0.0017590802162885666, 0.07565415650606155, 0.10777463018894196, -0.041205670684576035, 0.1525684893131256, 0.007398678455501795, -0.026086485013365746, -0.01680128648877144, -0.012603032402694225, -0.014429450035095215, -0.14700868725776672, 0.00127870449796319, -0.06533797085285187, 0.13529393076896667, 0.18639346957206726, -0.040172137320041656, -0.018230147659778595, -0.03954217582941055, 0.0643569603562355, -0.027147822082042694, 0.0907464325428009, -0.0004584283451549709, -0.1572525054216385, 0.007537556812167168, -0.020968705415725708, 0.008549604564905167, -0.16438476741313934, -0.04848214238882065, -0.03760555759072304, -0.034169863909482956, -0.0793537050485611, 0.1417800784111023, 0.07950681447982788, 0.027969788759946823, -0.04298211634159088, -0.20393706858158112, -0.01412136945873499, 0.0508602038025856, -0.14248880743980408, -0.12185480445623398 ]
null
null
transformers
# CodeTrans transfer learning pre-trained model Pretrained model on programming languages using the t5 small model architecture. It was first released in [this repository](https://github.com/agemagician/CodeTrans). ## Model description This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. It could be used to fine-tune other tasks in the software development domain. > Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
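The card above ships no loading snippet for the checkpoint it describes. A minimal sketch for loading the model and setting up AdaFactor with the inverse-square-root schedule named in the card might look like the following; the toy input/target pair, the `warmup_init` flag, and the single optimization step are illustrative assumptions, not settings from the original training run:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration
from transformers.optimization import Adafactor, AdafactorSchedule

# Load the transfer-learning checkpoint and its SentencePiece vocabulary.
model_name = "SEBIS/code_trans_t5_small_transfer_learning_pretrain"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# AdaFactor with relative (inverse square root) step sizing, as the card
# describes; warmup_init=True is an assumption, not a documented setting.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)

# One illustrative fine-tuning step on a toy code-to-text pair.
inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")
labels = tokenizer("adds two numbers", return_tensors="pt").input_ids
loss = model(input_ids=inputs.input_ids, attention_mask=inputs.attention_mask, labels=labels).loss
loss.backward()
optimizer.step()
lr_scheduler.step()
optimizer.zero_grad()
```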
{}
feature-extraction
SEBIS/code_trans_t5_small_transfer_learning_pretrain
[ "transformers", "pytorch", "jax", "t5", "feature-extraction", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[]
TAGS #transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us
# CodeTrans transfer learning pre-trained model Pretrained model on programming languages using the t5 small model architecture. It was first released in this repository. ## Model description This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. It could be used to fine-tune other tasks in the software development domain. > Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
[ "# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 small model architecture. It was first released in\nthis repository.", "## Model description\n\nThis CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n", "# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 small model architecture. It was first released in\nthis repository.", "## Model description\n\nThis CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn" ]
[ 42, 38, 168 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 small model architecture. It was first released in\nthis repository.## Model description\n\nThis CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn" ]
[ -0.06966043263673782, -0.06854301691055298, -0.00023507693549618125, 0.09682103246450424, 0.152676060795784, 0.0315113365650177, 0.13083826005458832, 0.0007377786678262055, -0.09262232482433319, -0.02259930782020092, 0.06232769042253494, 0.043435726314783096, 0.003335829358547926, 0.18499495089054108, 0.06299368292093277, -0.24676263332366943, 0.027034573256969452, 0.027061503380537033, -0.03730034455657005, 0.10363413393497467, 0.09707385301589966, -0.07723354548215866, 0.0851326510310173, 0.0018489728681743145, -0.16551855206489563, 0.022403856739401817, -0.03725462779402733, -0.05455632135272026, 0.09782939404249191, 0.06289554387331009, 0.13381054997444153, 0.010275447741150856, 0.06582853943109512, -0.03701768442988396, 0.015909327194094658, 0.069430872797966, -0.019408512860536575, 0.009419121779501438, 0.02226061187684536, 0.08681058883666992, 0.21643595397472382, 0.08643276989459991, 0.06550514698028564, 0.031105555593967438, -0.07927485555410385, 0.0031650494784116745, 0.0355108305811882, 0.08918555825948715, 0.07538721710443497, 0.1369156539440155, 0.009534033015370369, 0.149699404835701, -0.1029789075255394, 0.1215023323893547, -0.027954980731010437, -0.3085957467556, -0.051316361874341965, 0.14055246114730835, 0.080853670835495, 0.1166733130812645, 0.026643523946404457, -0.03652917593717575, 0.03134143725037575, 0.07894155383110046, 0.13570600748062134, -0.06835643947124481, -0.061462607234716415, -0.07731694728136063, -0.12825381755828857, -0.07226801663637161, 0.22734156250953674, 0.0022783256135880947, -0.02946166880428791, -0.10141242295503616, -0.09487046301364899, -0.09270983934402466, 0.04292220622301102, -0.050304219126701355, 0.004378766752779484, 0.05549679696559906, 0.07328062504529953, -0.06986454874277115, -0.11273204535245895, -0.09971775859594345, -0.010783890262246132, 0.16333109140396118, 0.05317733809351921, 0.05934620648622513, -0.09851901978254318, 0.08441372215747833, 0.0456397719681263, -0.04961002618074417, 0.007224642671644688, -0.05221349745988846, -0.08929019421339035, 0.0035002916119992733, -0.04457816481590271, -0.18331009149551392, -0.029405200853943825, 0.047574546188116074, 0.03850181773304939, 0.013380534015595913, 0.1489577442407608, 0.03744909539818764, 0.03963249921798706, 0.12039417773485184, -0.0472395233809948, -0.03488505259156227, 0.052796125411987305, 0.010787995532155037, -0.030127504840493202, -0.0335812047123909, -0.15501675009727478, -0.015808314085006714, 0.02173364907503128, 0.021309522911906242, -0.06344173848628998, 0.1026771143078804, -0.01550661213696003, -0.08924121409654617, 0.13677804172039032, -0.07116080820560455, -0.07439053058624268, -0.007012802641838789, -0.018736939877271652, -0.03936989605426788, 0.07365875691175461, -0.04928656294941902, -0.12083946913480759, -0.045059263706207275, -0.07197846472263336, -0.08985026180744171, -0.14080537855625153, -0.14649800956249237, -0.05021440610289574, -0.11346180737018585, 0.03852527588605881, -0.2062319815158844, -0.14692527055740356, -0.06743446737527847, 0.07805750519037247, 0.02923603542149067, -0.07815441489219666, -0.027763409540057182, -0.03761468455195427, -0.04289738088846207, -0.04644864797592163, -0.0026179340202361345, -0.03325410559773445, 0.061388179659843445, -0.04273010417819023, 0.02994581311941147, -0.08069584518671036, 0.04804660379886627, -0.09004544466733932, -0.01658976636826992, -0.10786674916744232, 0.07954169064760208, 0.05219222605228424, 0.10402221977710724, -0.054551929235458374, -0.09181685000658035, -0.057122282683849335, 0.060064297169446945, 
0.007754878606647253, 0.039743080735206604, -0.06576865911483765, 0.0030423691496253014, 0.07987676560878754, -0.15340808033943176, -0.13105808198451996, 0.06928222626447678, -0.008615913800895214, 0.16519460082054138, 0.06981082260608673, 0.1468496322631836, 0.16958187520503998, -0.037117961794137955, 0.08306892961263657, 0.0639486163854599, -0.08326365053653717, -0.1683523803949356, 0.04359455034136772, 0.09017109125852585, -0.08724592626094818, 0.006707086693495512, -0.07718881964683533, 0.10579868406057358, -0.020056063309311867, -0.03943343833088875, 0.003727017203345895, -0.09892649948596954, -0.06857110559940338, 0.022583477199077606, 0.1146421730518341, 0.03373029828071594, -0.06314407289028168, 0.019621513783931732, 0.11488428711891174, -0.12398254126310349, 0.008920741267502308, -0.11742690950632095, -0.04532643407583237, -0.0402742438018322, 0.006129659246653318, -0.20469917356967926, -0.01772327348589897, 0.06494805961847305, 0.03241356462240219, 0.006359996274113655, 0.2202846109867096, 0.01856881007552147, 0.04412902519106865, 0.015590546652674675, -0.05097653716802597, -0.1002410352230072, -0.023248769342899323, -0.07083304971456528, -0.036406368017196655, -0.10000603646039963, -0.06875351071357727, -0.05488964915275574, -0.10429548472166061, 0.02463822066783905, -0.07040481269359589, -0.0022546378895640373, 0.0465727336704731, -0.004185406491160393, 0.004270467441529036, 0.07953004539012909, -0.04264867678284645, -0.0687156543135643, 0.08324286341667175, 0.10220123827457428, -0.045447904616594315, 0.02145051769912243, -0.15418535470962524, 0.09134212136268616, 0.08118394017219543, -0.01537130307406187, -0.08580847829580307, 0.03888414800167084, -0.04869691655039787, -0.014175087213516235, 0.046539317816495895, -0.007358385715633631, 0.2572026252746582, -0.033968642354011536, 0.1653483659029007, -0.1113847866654396, 0.011285843327641487, -0.006594966631382704, 0.03618691489100456, 0.08170511573553085, 0.07287952303886414, 0.06244754418730736, -0.08999694138765335, 0.022258983924984932, -0.013689059764146805, -0.04242958873510361, 0.16528907418251038, -0.0230654738843441, -0.033599115908145905, 0.02090466022491455, 0.0038237927947193384, 0.007280018180608749, 0.044764552265405655, -0.14239196479320526, -0.05372471734881401, 0.007840943522751331, 0.031242968514561653, 0.07248776406049728, -0.08830871433019638, 0.025733772665262222, 0.03402462601661682, 0.01251829694956541, 0.040274787694215775, 0.006037258543074131, -0.10615769773721695, 0.03829891234636307, 0.020462390035390854, -0.07684443891048431, 0.02477741613984108, -0.01265799906104803, -0.11043016612529755, 0.17495496571063995, -0.03141763433814049, -0.21693019568920135, -0.1096208319067955, 0.17370371520519257, -0.00671312166377902, 0.04764961823821068, 0.046375975012779236, -0.018126044422388077, -0.06319527328014374, -0.09467912465333939, 0.09780815243721008, -0.07382351905107498, 0.012784712016582489, -0.016727903857827187, 0.03299717977643013, -0.022060463204979897, -0.1437431424856186, 0.02168603241443634, -0.02900826930999756, -0.0936444029211998, 0.04986065998673439, -0.11833648383617401, 0.08831281214952469, 0.18942193686962128, -0.027208801358938217, 0.03284350410103798, -0.019910195842385292, 0.1813058853149414, -0.049102380871772766, 0.040311094373464584, 0.2453029304742813, 0.02338818646967411, -0.020342959091067314, 0.06324249505996704, -0.02604041062295437, -0.11040038615465164, 0.09695736318826675, -0.0434085875749588, -0.07819071412086487, -0.15300127863883972, -0.10359255969524384, 
-0.08832773566246033, -0.025888441130518913, 0.12161193788051605, 0.060770854353904724, 0.05358423665165901, 0.0802772268652916, 0.01773805543780327, 0.11905363202095032, -0.024517036974430084, 0.04706486687064171, 0.0326179675757885, 0.01701335608959198, 0.053486257791519165, -0.07828119397163391, -0.018683413043618202, 0.038207996636629105, 0.026226110756397247, 0.21294909715652466, -0.05858607217669487, 0.17368604242801666, 0.074034683406353, 0.058846596628427505, 0.07967115938663483, 0.14128528535366058, -0.07718844711780548, 0.010873757302761078, -0.02211148291826248, -0.013495007529854774, -0.1133216917514801, 0.07030180841684341, -0.0517406240105629, -0.04494142904877663, -0.07295211404561996, 0.03305448964238167, -0.03472016379237175, 0.25639626383781433, -0.017563138157129288, -0.25123870372772217, -0.09609869867563248, -0.03226688504219055, -0.04187488555908203, -0.07165981829166412, 0.07190302759408951, 0.22845609486103058, -0.04264743998646736, -0.08895547688007355, -0.00490748742595315, 0.11938188225030899, -0.06621825695037842, 0.0015277634374797344, 0.060365524142980576, 0.03997473046183586, 0.06026320904493332, 0.08200293034315109, -0.22880609333515167, 0.10577695071697235, -0.021717114374041557, 0.11560367047786713, -0.0659392848610878, 0.01228940300643444, -0.04721865803003311, 0.12851743400096893, 0.07377880066633224, 0.012877387925982475, 0.040956053882837296, -0.10226938128471375, -0.047003015875816345, 0.04450696334242821, 0.009358868934214115, -0.05914134532213211, 0.048055026680231094, -0.01991925947368145, 0.060916200280189514, -0.0016830326057970524, -0.024034833535552025, -0.08882443606853485, -0.07681482285261154, -0.004208906088024378, 0.01921391673386097, 0.06285853683948517, -0.04513603076338768, -0.020600564777851105, 0.08275142312049866, 0.12134853005409241, -0.06750531494617462, -0.0725693553686142, -0.09903056174516678, -0.04539317265152931, 0.06116115674376488, -0.0669935941696167, 0.08039535582065582, 0.016861453652381897, -0.009620568715035915, -0.006436093710362911, -0.10434146225452423, 0.07500063627958298, -0.08898396790027618, -0.05085601285099983, -0.0019360726000741124, -0.03239844739437103, 0.04997620731592178, 0.021796919405460358, 0.025263240560889244, -0.06951317936182022, -0.039215270429849625, -0.10983288288116455, -0.11787272244691849, 0.0418710857629776, 0.06701506674289703, -0.06185723468661308, -0.07950156927108765, 0.07888280600309372, 0.010165860876441002, 0.007717879023402929, 0.09060407429933548, -0.019037216901779175, -0.05439506098628044, 0.016491439193487167, 0.18524675071239471, -0.036796990782022476, -0.25261563062667847, -0.06255895644426346, 0.09262546896934509, 0.08345668017864227, -0.06756297498941422, -0.19662711024284363, 0.06570671498775482, -0.024327564984560013, 0.024935012683272362, -0.01968022622168064, -0.32526150345802307, -0.1018490344285965, 0.07418283820152283, 0.14213939011096954, 0.15782223641872406, -0.12081130594015121, 0.025856615975499153, 0.016911163926124573, -0.05678076297044754, 0.1327269971370697, -0.04937516897916794, 0.12005949020385742, -0.019378546625375748, -0.06658462435007095, 0.014313136227428913, -0.04138106107711792, 0.007224549539387226, 0.0471423864364624, 0.08755441755056381, -0.055197641253471375, 0.07295700162649155, 0.12939588725566864, -0.035687461495399475, 0.14666403830051422, -0.00018379456014372408, 0.10367650538682938, -0.17480066418647766, -0.11001595854759216, -0.05972764641046524, 0.027619943022727966, 0.016101768240332603, -0.11792949587106705, -0.02487173117697239, 
0.03723718971014023, 0.03699294850230217, 0.0033691173885017633, 0.05402332544326782, -0.03391682729125023, -0.06252795457839966, 0.07968666404485703, 0.11842560023069382, -0.09812155365943909, -0.09240355342626572, 0.00958879105746746, 0.010816044174134731, 0.1702553927898407, -0.16623197495937347, -0.008879718370735645, 0.09226302057504654, -0.04022931307554245, 0.07545148581266403, 0.03756612911820412, -0.027497421950101852, 0.013367590494453907, 0.06022069230675697, -0.09614425897598267, -0.08035166561603546, -0.037724316120147705, -0.03953244537115097, -0.03206034377217293, 0.040532004088163376, 0.10732541233301163, -0.11010309308767319, -0.005618681665509939, -0.041224535554647446, -0.058876991271972656, -0.054610639810562134, 0.12249422073364258, 0.03617626801133156, 0.023887943476438522, -0.058220695704221725, 0.06776683032512665, 0.0748424381017685, -0.10177827626466751, 0.02507915161550045, 0.08699505776166916, -0.1620025485754013, -0.09704093635082245, 0.04545256868004799, 0.0757724940776825, -0.01697409898042679, -0.0753651112318039, -0.08339083194732666, -0.06833239644765854, 0.029037857428193092, 0.05870138481259346, 0.06085090711712837, 0.06865137815475464, -0.10859350115060806, 0.0005146256880834699, -0.09297691285610199, -0.005335517693310976, 0.020281733945012093, 0.02751908451318741, -0.17357437312602997, 0.18101131916046143, 0.08447246998548508, 0.06914136558771133, -0.05917593464255333, -0.026868009939789772, -0.10984209924936295, 0.032432250678539276, 0.0334499254822731, 0.02658485434949398, -0.03492666780948639, 0.016891203820705414, -0.019619276747107506, -0.028099093586206436, -0.04987550154328346, 0.07807871699333191, -0.05082371085882187, 0.012561081908643246, -0.020127516239881516, 0.037472423166036606, 0.05206777900457382, -0.022535184398293495, -0.04931570217013359, -0.0795074999332428, 0.10680367797613144, -0.029380694031715393, -0.08109470456838608, 0.01423187367618084, -0.012031598016619682, 0.06222781538963318, 0.03337028622627258, 0.07631830871105194, -0.012548542581498623, 0.07627412676811218, 0.018425514921545982, 0.04761730879545212, 0.02203459106385708, -0.02059200406074524, 0.044292088598012924, -0.0992434099316597, -0.0050907195545732975, -0.03817855194211006, -0.021572547033429146, -0.06963814049959183, 0.004431269597262144, 0.08862978965044022, 0.1107541173696518, 0.1456613689661026, -0.025804562494158745, 0.028185436502099037, -0.12797310948371887, -0.019815681502223015, 0.06750672310590744, 0.003110303543508053, -0.04544307291507721, -0.12955006957054138, 0.03531409054994583, 0.008700201287865639, 0.07122316211462021, 0.04961298406124115, 0.03022729605436325, -0.02481616474688053, 0.027763448655605316, 0.04485716298222542, -0.00466168811544776, 0.1890447735786438, -0.002556446474045515, 0.04555812105536461, -0.012905810959637165, 0.05763180926442146, 0.026527486741542816, 0.15535318851470947, 0.10623247921466827, 0.10273996740579605, 0.0027765734121203423, 0.10005556792020798, -0.018916461616754532, 0.011802881956100464, -0.1268618255853653, -0.0017493385821580887, -0.04210810735821724, 0.09418997913599014, -0.07287689298391342, 0.01807977445423603, 0.13292525708675385, -0.01873667910695076, 0.04657402262091637, 0.05289584770798683, -0.0820457935333252, -0.029667628929018974, -0.1297803670167923, -0.05375220626592636, -0.10938239842653275, -0.011342703364789486, -0.08394725620746613, -0.07809469103813171, 0.13807730376720428, 0.007098471280187368, -0.029859758913517, 0.16269034147262573, 0.00455419160425663, -0.054401982575654984, 
0.024171072989702225, -0.03861863911151886, 0.061968233436346054, -0.04366167634725571, -0.01694285124540329, 0.012090210802853107, -0.0033574076369404793, 0.04664044827222824, 0.00126642978284508, -0.028658531606197357, 0.006553056184202433, 0.011629950255155563, -0.009178751148283482, -0.041132111102342606, 0.02778538502752781, -0.014805037528276443, 0.15575413405895233, 0.024058105424046516, -0.11472369730472565, 0.01036184560507536, 0.15376193821430206, -0.009025448933243752, -0.14573675394058228, -0.15664687752723694, 0.12016995251178741, 0.04305445775389671, -0.000388926564482972, 0.02145928516983986, -0.016183318570256233, -0.05924917012453079, 0.26682227849960327, 0.11413770914077759, -0.06128945201635361, -0.010033112950623035, 0.03172115981578827, -0.009576855227351189, -0.061030540615320206, 0.2362637221813202, 0.027308886870741844, 0.2528161108493805, -0.00944062415510416, 0.0069687943905591965, -0.09482672065496445, 0.009627232328057289, -0.028141647577285767, 0.0033093460369855165, 0.013347535394132137, -0.026056688278913498, -0.0432201586663723, 0.021922001615166664, -0.023531688377261162, -0.014190263114869595, 0.13080234825611115, -0.01557376328855753, -0.04491761326789856, -0.023670192807912827, 0.010651722550392151, -0.007237173616886139, 0.11109928786754608, -0.04581301659345627, 0.045831989496946335, 0.043320074677467346, -0.03337674215435982, -0.10690965503454208, -0.06925037503242493, 0.09354845434427261, -0.057099249213933945, 0.15255379676818848, -0.053569890558719635, 0.07737822085618973, 0.03697006776928902, 0.027498498558998108, -0.09501594305038452, 0.11615287512540817, -0.04186926409602165, 0.021755140274763107, 0.0574798621237278, -0.02564181014895439, -0.07122166454792023, 0.09018057584762573, -0.03497811034321785, -0.1757086217403412, 0.007770867086946964, -0.024288136512041092, 0.019191045314073563, -0.08786969631910324, -0.015415524132549763, -0.0506269708275795, 0.1465175300836563, 0.09531891345977783, -0.014908193610608578, -0.04685113579034805, -0.06733648478984833, 0.06594481319189072, 0.005436135455965996, 0.04692213982343674, -0.037780169397592545, -0.15805403888225555, -0.073830746114254, -0.09957202523946762, -0.0066863782703876495, -0.14762142300605774, -0.010489149019122124, -0.09616101533174515, -0.027916936203837395, -0.09479157626628876, 0.06896962225437164, 0.06074981763958931, 0.028454113751649857, -0.035068850964307785, 0.019514288753271103, -0.02772592566907406, 0.05752487853169441, -0.14809520542621613, -0.17410536110401154 ]
null
null
transformers
# legal_t5_small_cls_cs model

Model for classification of legal text written in Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was trained on three parallel corpora from jrc-acquis.

## Model description

legal_t5_small_cls_cs is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for classification of legal texts written in Czech.

### How to use

Here is how to use this model to classify legal text written in Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_cs"),
tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_cls_cs", do_lower_case=False, skip_special_tokens=True),
device=0
)

cs_text = "Bez námitek k navrhovanému spojení (Případ č. COMP/M.4169 – Virgin/CPW/JV) (2006/C 103/16) (Text s významem pro EHP) Dne 29. března 2006 se Komise rozhodla nevznést námitky proti výše uvedenému spojení a prohlásit ho za slučitelné se společným trhem. Toto rozhodnutí je založeno na čl. 6 odst. 1 písm. b) nařízení Rady (ES) č. 139/2004. Celý text rozhodnutí je přístupný pouze v angličtině a bude uveřejněn poté, co bude zbaven obchodního tajemství, které může případně obsahovat. Text bude dosažitelný: - na webové stránce Europa – hospodářská soutěž (http://europa.eu.int/comm/competition/mergers/cases/). Tato webová stránka umožňuje vyhledat jednotlivá rozhodnutí o spojení, a to včetně společnosti, čísla případu, data a indexu odvětví hospodářství. - v elektronické podobě na webové stránce EUR-Lex, pod dokumentem č. 32006M4169. EUR-Lex umožňuje přístup k Evropskému právu přes Internet. (http://europa.eu.int/eur-lex/lex) --------------------------------------------------"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_cls_cs model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset consisting of 18 thousand texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the classification test dataset, it achieves the following results:

Test results :

| Model | F1 score |
|:-----:|:-----:|
| legal_t5_small_cls_cs | 0.6297|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
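The F1 score above comes from the authors' held-out split, which the card does not distribute. A minimal sketch of how such a score could be reproduced, assuming a list of `(text, label)` pairs, treating the generated string as the predicted class, and picking weighted averaging (the sample pair, label string, and averaging mode are assumptions), might look like this:

```python
from sklearn.metrics import f1_score
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_cls_cs")
model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_cls_cs")

def predict_label(text: str) -> str:
    # The model emits the class name as free text, so decode it greedily.
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=8)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True).strip()

# Placeholder pairs; real texts and gold labels would come from the authors'
# JRC-Acquis test split. The label string here is hypothetical.
eval_pairs = [("Bez námitek k navrhovanému spojení ...", "hospodářská soutěž")]
predictions = [predict_label(text) for text, _ in eval_pairs]
gold_labels = [label for _, label in eval_pairs]
print(f1_score(gold_labels, predictions, average="weighted"))
```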
{"language": "Cszech", "tags": ["classification Cszech model"], "datasets": ["jrc-acquis"], "widget": [{"text": "Bez n\u00e1mitek k navrhovan\u00e9mu spojen\u00ed (P\u0159\u00edpad \u010d. COMP/M.4169 \u2013 Virgin/CPW/JV) (2006/C 103/16) (Text s v\u00fdznamem pro EHP) Dne 29. b\u0159ezna 2006 se Komise rozhodla nevzn\u00e9st n\u00e1mitky proti v\u00fd\u0161e uveden\u00e9mu spojen\u00ed a prohl\u00e1sit ho za slu\u010diteln\u00e9 se spole\u010dn\u00fdm trhem. Toto rozhodnut\u00ed je zalo\u017eeno na \u010dl. 6 odst. 1 p\u00edsm. b) na\u0159\u00edzen\u00ed Rady (ES) \u010d. 139/2004. Cel\u00fd text rozhodnut\u00ed je p\u0159\u00edstupn\u00fd pouze v angli\u010dtin\u011b a bude uve\u0159ejn\u011bn pot\u00e9, co bude zbaven obchodn\u00edho tajemstv\u00ed, kter\u00e9 m\u016f\u017ee p\u0159\u00edpadn\u011b obsahovat. Text bude dosa\u017eiteln\u00fd: - na webov\u00e9 str\u00e1nce Europa \u2013 hospod\u00e1\u0159sk\u00e1 sout\u011b\u017e (http://europa.eu.int/comm/competition/mergers/cases/). Tato webov\u00e1 str\u00e1nka umo\u017e\u0148uje vyhledat jednotliv\u00e1 rozhodnut\u00ed o spojen\u00ed, a to v\u010detn\u011b spole\u010dnosti, \u010d\u00edsla p\u0159\u00edpadu, data a indexu odv\u011btv\u00ed hospod\u00e1\u0159stv\u00ed. - v elektronick\u00e9 podob\u011b na webov\u00e9 str\u00e1nce EUR-Lex, pod dokumentem \u010d. 32006M4169. EUR-Lex umo\u017e\u0148uje p\u0159\u00edstup k Evropsk\u00e9mu pr\u00e1vu p\u0159es Internet. (http://europa.eu.int/eur-lex/lex) --------------------------------------------------"}]}
text2text-generation
SEBIS/legal_t5_small_cls_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification Cszech model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_cs model =============================== Model for classification of legal text written in Czech. It was first released in this repository. This model was trained on three parallel corpora from jrc-acquis. Model description ----------------- legal\_t5\_small\_cls\_cs is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in Czech. ### How to use Here is how to use this model to classify legal text written in Czech in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_cs model was trained on the JRC-ACQUIS dataset consisting of 18 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model. ### Pretraining Evaluation results ------------------ When the model is used on the classification test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 67, 154, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09950011968612671, 0.10683683305978775, -0.0024273099843412638, 0.060003094375133514, 0.09929684549570084, 0.009759454056620598, 0.08030859380960464, 0.1044561043381691, -0.0718187689781189, 0.07570008933544159, 0.06844525039196014, 0.04133947193622589, 0.0860099121928215, 0.10662106424570084, 0.041113466024398804, -0.21231065690517426, 0.014719199389219284, -0.009637309238314629, 0.002221949864178896, 0.1507369428873062, 0.11343628913164139, -0.10056551545858383, 0.03178912028670311, -0.03108160011470318, -0.13778842985630035, -0.019153419882059097, -0.06040870025753975, -0.06031516566872597, 0.06118793785572052, 0.025860249996185303, 0.11598524451255798, 0.01945221610367298, 0.08064013719558716, -0.19474969804286957, 0.0013625960564240813, 0.09105323255062103, 0.040458329021930695, 0.05183661729097366, 0.05254511535167694, -0.02403016947209835, 0.15604187548160553, -0.01783040352165699, 0.07731305062770844, 0.0324430912733078, -0.1317652463912964, -0.10674145072698593, -0.06625684350728989, 0.08781898021697998, 0.15856710076332092, 0.13063578307628632, -0.0558093897998333, 0.07210654020309448, -0.10535688698291779, 0.05503202602267265, 0.06076788902282715, -0.259024441242218, -0.06933774054050446, 0.048220641911029816, 0.04522261396050453, 0.075262151658535, -0.06312423944473267, -0.05013345927000046, 0.041780635714530945, 0.01951339840888977, 0.02084909752011299, -0.016076408326625824, 0.0033983879256993532, -0.003110523335635662, -0.17001907527446747, -0.1036176085472107, 0.17248667776584625, 0.008175348863005638, -0.07438404113054276, -0.08746502548456192, -0.022162560373544693, -0.17923614382743835, 0.03157412260770798, -0.04366837441921234, 0.03786340355873108, -0.00636452529579401, 0.014026741497218609, 0.0013071323046460748, -0.12722553312778473, -0.10769996047019958, 0.055164508521556854, 0.08727774024009705, 0.07256931066513062, -0.008293570019304752, 0.006108336616307497, 0.15956571698188782, 0.02319422736763954, -0.08395236730575562, -0.03271053358912468, 0.005003512836992741, -0.1374737024307251, -0.04716062545776367, -0.03523821383714676, -0.09257327765226364, -0.06473959982395172, 0.1271154135465622, -0.00315308989956975, 0.05962327495217323, 0.0489642433822155, 0.050952546298503876, 0.011404119431972504, 0.13672707974910736, -0.0584479458630085, -0.022359952330589294, -0.05756363272666931, 0.06421737372875214, -0.06499096006155014, -0.005879068281501532, -0.014317363500595093, 0.01276656799018383, 0.06535379588603973, 0.05584731698036194, -0.08387086540460587, 0.016083762049674988, -0.05579276755452156, -0.03416358679533005, 0.014384729787707329, -0.11242010444402695, -0.0369534008204937, 0.00033863118733279407, -0.0817345529794693, -0.07746075093746185, 0.09398094564676285, -0.0038868989795446396, -0.10242453217506409, 0.07811590284109116, -0.045917704701423645, -0.011906890198588371, -0.12439172714948654, -0.07545694708824158, -0.019239366054534912, -0.07120970636606216, -0.04990824684500694, -0.0694807693362236, -0.15033991634845734, -0.11087743192911148, 0.07597104460000992, -0.035035133361816406, -0.05634713172912598, -0.06353092193603516, -0.020403992384672165, -0.0028728293254971504, -0.029057344421744347, 0.10282807052135468, -0.012023361399769783, 0.09466425329446793, 0.0144496476277709, 0.0402267649769783, 0.1386355608701706, 0.06590284407138824, -0.0945243090391159, 0.03426580876111984, -0.08430272340774536, 0.1356949508190155, -0.009027865715324879, 0.020498262718319893, -0.1481255441904068, -0.06836757063865662, -0.06496930122375488, 0.04752757027745247, 
0.08319355547428131, 0.09745863825082779, -0.14157959818840027, -0.0021175905130803585, 0.18601806461811066, -0.0962563008069992, -0.0650082677602768, 0.08153272420167923, -0.050463080406188965, 0.14473658800125122, 0.06846039742231369, 0.15545885264873505, 0.07833682000637054, -0.058186642825603485, -0.01443540584295988, -0.04282910376787186, 0.015224757604300976, 0.006424854043871164, 0.08666441589593887, -0.03506775200366974, -0.09400081634521484, -0.028117630630731583, -0.10177165269851685, 0.0030847361776977777, -0.06970196217298508, -0.07298635691404343, 0.011662968434393406, -0.0623880960047245, -0.05284569412469864, 0.05598112940788269, 0.03882163017988205, -0.031123558059334755, -0.12249129265546799, -0.009999370202422142, 0.09777600318193436, -0.0680048018693924, 0.02412322536110878, -0.07846996933221817, -0.04856882616877556, -0.08133374154567719, -0.015632901340723038, -0.18568961322307587, 0.05552004650235176, 0.042405419051647186, -0.025602325797080994, 0.04645838588476181, 0.03550763800740242, 0.02249259315431118, 0.05209348350763321, 0.003643628442659974, -0.05286351218819618, -0.05446799471974373, -0.030632857233285904, -0.1360868662595749, -0.12777559459209442, -0.05844734236598015, -0.024336185306310654, 0.14768579602241516, -0.19468159973621368, 0.048463884741067886, -0.056856878101825714, 0.03293660655617714, -0.01582636870443821, -0.06497421115636826, 0.028313592076301575, 0.03240535780787468, 0.009834731929004192, -0.062496550381183624, 0.05370054766535759, 0.048911433666944504, 0.003164311172440648, 0.037427034229040146, -0.1367489993572235, -0.15027610957622528, 0.07717400789260864, 0.04684895649552345, -0.168948233127594, -0.01481663715094328, -0.054003868252038956, -0.058928053826093674, -0.05616844445466995, 0.009563782252371311, 0.24388694763183594, 0.002025530207902193, 0.13726167380809784, -0.09544596076011658, -0.07149902731180191, -0.001023167627863586, -0.02168218232691288, 0.029205653816461563, 0.11727391928434372, 0.08750613033771515, -0.08695114403963089, 0.05089951306581497, 0.032253168523311615, -0.028034796938300133, 0.1279267966747284, 0.011467313393950462, -0.10558139532804489, -0.009101641364395618, 0.06794848293066025, -0.012167180888354778, 0.08830839395523071, -0.16371163725852966, -0.004106311593204737, 0.007502734661102295, 0.024316290393471718, 0.040232814848423004, -0.16841459274291992, 0.022048233076930046, 0.06588073819875717, -0.022676803171634674, 0.02039167657494545, -0.04505433887243271, -0.06251004338264465, 0.07660673558712006, 0.03379417210817337, -0.010620815679430962, -0.004710209555923939, -0.042399484664201736, -0.1471467912197113, 0.21053795516490936, -0.05224970355629921, -0.14049871265888214, -0.09664717316627502, 0.09589409828186035, 0.0740765929222107, 0.0013536375481635332, 0.026952257379889488, -0.07566454261541367, -0.039577797055244446, -0.10149268060922623, 0.07304001599550247, -0.058003298938274384, -0.021312301978468895, -0.052483417093753815, -0.004833709914237261, 0.015576670877635479, -0.11755318939685822, 0.02811705507338047, -0.03473353385925293, -0.10417266935110092, 0.008048783987760544, -0.025370296090841293, 0.07294335961341858, 0.1804216057062149, -0.013728301040828228, 0.03218778595328331, 0.0026341655757278204, 0.1721097230911255, -0.11959080398082733, 0.006655446253716946, 0.07377630472183228, 0.003238486126065254, 0.004533854778856039, 0.09031204134225845, -0.0004524444811977446, -0.09247589111328125, 0.06684662401676178, 0.05038964003324509, -0.04305022209882736, -0.26196226477622986, 
-0.012517862021923065, -0.020471636205911636, -0.03175966069102287, 0.1295371800661087, 0.047342605888843536, 0.038511134684085846, 0.05697241425514221, -0.025534309446811676, 0.005656168796122074, 0.02586517482995987, 0.07356815785169601, -0.060424309223890305, 0.006342466454952955, 0.07542074471712112, -0.0480799600481987, -0.008706929162144661, 0.028311893343925476, 0.0039000443648546934, 0.24048103392124176, -0.02948739379644394, 0.1081933081150055, 0.10248676687479019, 0.10436835885047913, 0.014577161520719528, 0.060701098293066025, -0.03813980519771576, 0.021990766748785973, 0.008875821717083454, -0.03648010641336441, -0.09088818728923798, 0.059056155383586884, -0.012592142447829247, 0.040840309113264084, -0.11301778256893158, -0.022546056658029556, 0.015508326701819897, 0.337700217962265, 0.060611553490161896, -0.2439318597316742, -0.07653432339429855, 0.007628608960658312, -0.06439778953790665, -0.08816342055797577, 0.05624939873814583, 0.08346371352672577, -0.1249542385339737, -0.011560771614313126, -0.03818829730153084, 0.09336069971323013, -0.10478153824806213, -0.04406685382127762, 0.09189102798700333, 0.022099411115050316, -0.011647148989140987, 0.09130269289016724, -0.2816475033760071, 0.174348384141922, -0.00922270119190216, 0.1150154247879982, -0.02459089830517769, 0.026152923703193665, -0.05952971428632736, 0.012372399680316448, 0.16162511706352234, -0.0020964862778782845, 0.050000861287117004, -0.09944469481706619, -0.09190311282873154, 0.014204941689968109, 0.06288642436265945, -0.11455227434635162, 0.10533976554870605, 0.01686777174472809, 0.025698814541101456, 0.004142435733228922, -0.07253953814506531, -0.12437914311885834, -0.11337557435035706, 0.007354775909334421, -0.0755290612578392, 0.0650797188282013, -0.05974896624684334, -0.03951363265514374, 0.01045568659901619, 0.17154815793037415, -0.17727500200271606, -0.06297747045755386, -0.08469823747873306, 0.03226802125573158, 0.0675058662891388, -0.027973387390375137, -0.01198987290263176, 0.0157567597925663, 0.04187049716711044, -0.004995967727154493, 0.010111473500728607, 0.08512445539236069, -0.061333972960710526, -0.16390033066272736, -0.05694590136408806, 0.13036081194877625, 0.12787975370883942, 0.06151537224650383, -0.028583521023392677, 0.02102677896618843, -0.010809686966240406, -0.08345764875411987, 0.010815662331879139, 0.031067069619894028, 0.05766675993800163, 0.017648693174123764, -0.04946885257959366, 0.02369210496544838, -0.08799438178539276, -0.03986126929521561, 0.09155477583408356, 0.1455986350774765, -0.05044326186180115, 0.06623639166355133, 0.14802712202072144, -0.12067695707082748, -0.16119325160980225, 0.044917359948158264, 0.09256770461797714, 0.058779217302799225, -0.07687884569168091, -0.20661795139312744, 0.06696774065494537, 0.08226537704467773, 0.0025326397735625505, -0.03825950622558594, -0.3778332769870758, -0.12798985838890076, 0.08953569829463959, 0.08838117122650146, -0.05467086657881737, -0.11070966720581055, -0.02463400363922119, 0.03609396889805794, -0.018124278634786606, 0.13260743021965027, -0.041743360459804535, 0.08738214522600174, 0.01796489953994751, -0.07184556871652603, 0.042677365243434906, -0.06456409394741058, 0.11188916116952896, 0.08686450868844986, 0.06579121202230453, -0.04726840555667877, 0.007897148840129375, 0.02665664441883564, -0.037375323474407196, 0.12690427899360657, 0.04196455329656601, 0.035955313593149185, -0.2156539261341095, -0.06266802549362183, -0.08267372846603394, -0.004594871774315834, -0.08027569204568863, -0.05746328830718994, 
-0.039722926914691925, 0.09356394410133362, 0.05896711349487305, -0.003662892384454608, 0.023942772299051285, -0.07233774662017822, 0.04185822606086731, 0.09748242050409317, 0.12596161663532257, 0.08347336947917938, -0.07873383164405823, 0.024471180513501167, 0.036748748272657394, 0.09819087386131287, -0.17715051770210266, -0.019743405282497406, 0.12137015908956528, 0.005840363446623087, 0.1524171233177185, 0.01584734581410885, -0.13894297182559967, -0.00925900973379612, 0.04407329112291336, -0.11381889879703522, -0.07747791707515717, -0.012744580395519733, 0.005474264733493328, -0.08515863120555878, -0.0405585803091526, 0.05360228195786476, -0.12013660371303558, -0.012213921174407005, -0.007955910637974739, 0.03774416446685791, -0.07502726465463638, 0.18711678683757782, 0.06090027466416359, 0.07320091873407364, -0.058292195200920105, 0.11755605041980743, 0.11730707436800003, -0.1156192272901535, 0.014036591164767742, 0.1508602797985077, -0.08634582161903381, -0.053123839199543, -0.004309104289859533, 0.1156284436583519, -0.0031567479018121958, -0.06113700196146965, -0.04731777310371399, -0.046833306550979614, 0.06536152958869934, 0.03797493502497673, 0.04676540568470955, 0.04168647155165672, -0.021870240569114685, 0.012139675207436085, -0.15334299206733704, 0.0709386095404625, 0.05577641353011131, 0.019063595682382584, -0.037229448556900024, 0.21131351590156555, 0.0546332411468029, 0.04891010746359825, 0.002409973880276084, -0.0505218580365181, -0.05584706738591194, 0.06385515630245209, 0.020692899823188782, -0.01778143085539341, -0.06741320341825485, -0.012894939631223679, -0.0086376266553998, 0.009322503581643105, 0.0018088959623128176, 0.018359515815973282, -0.058838289231061935, -0.026639927178621292, -0.028060782700777054, 0.051988840103149414, -0.06149813532829285, -0.006138461176306009, -0.0031174705363810062, -0.07654347270727158, 0.07678612321615219, 0.00001218081251863623, 0.005091287195682526, -0.016046365723013878, -0.03596232458949089, 0.0958312600851059, -0.01819523423910141, -0.004702691920101643, -0.0360628180205822, -0.0966416522860527, 0.03621125966310501, -0.03234577178955078, -0.03862551972270012, -0.007964017800986767, 0.05398494750261307, -0.1391446888446808, 0.04877970367670059, -0.013166183605790138, 0.0009158719331026077, -0.08347700536251068, 0.09490533173084259, 0.02447940595448017, 0.08252561092376709, 0.09472369402647018, -0.05792611464858055, 0.07094889879226685, -0.15733402967453003, -0.04346490651369095, 0.029359692707657814, 0.04059537872672081, -0.06613980978727341, -0.06161199137568474, 0.06122079864144325, -0.05141236633062363, 0.06573781371116638, 0.03930108994245529, 0.033278241753578186, 0.03931747004389763, -0.10335427522659302, -0.0037783405277878046, 0.05039774626493454, 0.05668294429779053, -0.0018327080179005861, 0.0002349750866414979, 0.028063876554369926, 0.03814157843589783, -0.01579742133617401, 0.05691767483949661, 0.14280836284160614, 0.19179823994636536, 0.06691981852054596, 0.07308852672576904, -0.03780047968029976, -0.10779029130935669, -0.07559455931186676, 0.15256713330745697, -0.015368767082691193, 0.025888612493872643, -0.03626583516597748, 0.11119796335697174, 0.0843844935297966, -0.1589714139699936, 0.06219093129038811, -0.007056371308863163, -0.09515272825956345, -0.0863427221775055, -0.08993706852197647, -0.02451489120721817, -0.04166654124855995, -0.008638148196041584, -0.09997400641441345, 0.022019771859049797, 0.08430870622396469, 0.044067513197660446, -0.02127557061612606, 0.13522018492221832, 0.005865937564522028, 
-0.03551727160811424, 0.09119473397731781, -0.00902288407087326, 0.09146665036678314, -0.10452046990394592, 0.0005116913234815001, 0.0332661047577858, -0.02541344426572323, 0.06687198579311371, 0.0036863621789962053, -0.014595289714634418, 0.032893430441617966, 0.0465773269534111, -0.07222948223352432, 0.013914094306528568, 0.014385206624865532, 0.12202603369951248, 0.07529741525650024, 0.05686604976654053, -0.03621893748641014, -0.01918194070458412, 0.20818346738815308, -0.048003099858760834, -0.06228627637028694, -0.1645733118057251, 0.1811767965555191, 0.07690080255270004, 0.0035859886556863785, 0.03930283710360527, -0.10354413092136383, 0.02326931618154049, 0.20067894458770752, 0.10465490072965622, -0.034318823367357254, -0.017139650881290436, 0.0345582515001297, -0.0035073107574135065, 0.01929738000035286, 0.10424397885799408, 0.047647226601839066, 0.17185667157173157, -0.09950835257768631, 0.04052596539258957, -0.059531569480895996, -0.06645001471042633, 0.023686517030000687, 0.14537866413593292, 0.013877565041184425, 0.0006524905329570174, -0.056668736040592194, 0.09664435684680939, -0.013974045403301716, -0.20130100846290588, 0.1058121845126152, -0.08737848699092865, -0.12098661810159683, -0.020933251827955246, -0.005364140495657921, -0.01685146987438202, 0.053079042583703995, -0.009105294942855835, -0.020268689841032028, 0.10503537207841873, 0.03893743082880974, -0.06381954252719879, -0.13172326982021332, 0.08384615927934647, -0.010731247253715992, 0.1703486442565918, -0.013180309906601906, 0.08320905268192291, 0.07935996353626251, 0.02074856497347355, -0.10735763609409332, 0.07547364383935928, 0.042466502636671066, 0.012409472838044167, 0.03561647608876228, 0.14402595162391663, -0.025958577170968056, 0.1291235238313675, 0.04009942337870598, -0.1390613466501236, 0.0494350828230381, -0.13990561664104462, -0.04657943546772003, -0.18107765913009644, 0.021936878561973572, -0.03495476022362709, 0.14980672299861908, 0.20592668652534485, -0.04930698126554489, -0.0040158056654036045, -0.06164075806736946, 0.024359825998544693, -0.02024845778942108, 0.16092464327812195, 0.007357837166637182, -0.18022705614566803, 0.019467702135443687, -0.09729951620101929, 0.04751112312078476, -0.2272324115037918, -0.03687804192304611, 0.030276281759142876, -0.08431490510702133, -0.03402913361787796, 0.1189112663269043, 0.04546578228473663, 0.07048695534467697, -0.05197376757860184, -0.008942967280745506, -0.011577514931559563, 0.15598532557487488, -0.13415804505348206, -0.10841898620128632 ]
null
null
transformers
# legal_t5_small_cls_de model Model for classification of legal text written in German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was trained on the jrc-acquis parallel corpus. ## Model description legal_t5_small_cls_de is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in German. ### How to use Here is how to use this model to classify legal text written in German in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_de"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_cls_de", do_lower_case=False, skip_special_tokens=True), device=0 ) de_text = "BESCHLUSS DES RATES vom 17. Dezember 1999 über den Abschluß des Abkommens in Form eines Briefwechsels zwischen der Europäischen Gemeinschaft und der Tunesischen Republik über die Regelung für die Einfuhr von nicht behandeltem Olivenöl mit Ursprung in Tunesien in die Gemeinschaft (1999/873/EG) DER RAT DER EUROPÄISCHEN UNION - gestützt auf den Vertrag zur Gründung der Europäischen Gemeinschaft, insbesondere auf Artikel 133 in Verbindung mit Artikel 300 Absatz 2 Unterabsatz 1, auf Vorschlag der Kommission, in Erwägung nachstehender Gründe: (1) Zwischen der Europäischen Gemeinschaft und der Tunesischen Republik wurde ein Abkommen in Form eines Briefwechsels ausgehandelt, um die Geltungsdauer der Regelung für die Einfuhr von nicht behandeltem Olivenöl mit Ursprung in Tunesien in die Gemeinschaft, die in Artikel 3 des Protokolls Nr. 1 des Europa-Mittelmeer-Abkommens zur Gründung einer Assoziation zwischen der Europäischen Gemeinschaft und ihren Mitgliedstaaten einerseits und der Tunesischen Republik andererseits(1) vorgesehen ist, für die Zeit vom 1. Januar bis zum 31. Dezember 2000 zu verlängern. (2) Das Abkommen sollte im Namen der Gemeinschaft genehmigt werden - BESCHLIESST: Artikel 1 Das Abkommen in Form eines Briefwechsels zwischen der Europäischen Gemeinschaft und der Tunesischen Republik über die Regelung für die Einfuhr von nicht behandeltem Olivenöl mit Ursprung in Tunesien in die Gemeinschaft wird im Namen der Gemeinschaft genehmigt. Der Wortlaut des Abkommens ist diesem Beschluß beigefügt. Artikel 2 Der Präsident des Rates wird ermächtigt, die Person zu bestellen, die befugt ist, das Abkommen rechtsverbindlich für die Gemeinschaft zu unterzeichnen. Geschehen zu Brüssel am 17. Dezember 1999. Im Namen des Rates Der Präsident K. HEMILÄ (1) ABl. L 97 vom 30.3.1998, S. 1." pipeline([de_text], max_length=512) ``` ## Training data The legal_t5_small_cls_de model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
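For readers who want to reproduce a comparable fine-tuning step, the sketch below shows one way to set up AdaFactor with its built-in inverse-square-root schedule in PyTorch. It is a minimal, hedged illustration rather than the authors' original training script: the example input, the placeholder class label `"agriculture"`, and the single-example batch are assumptions not stated in the card.

```python
# Minimal sketch (not the authors' training script): AdaFactor with the
# internal inverse-square-root schedule described in the training procedure.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_cls_de")
model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_cls_de")

# relative_step=True with lr=None makes Adafactor compute its own
# inverse-square-root learning rate; warmup_init adds the warm-up phase.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule, e.g. for Trainer logging

inputs = tokenizer(
    ["BESCHLUSS DES RATES vom 17. Dezember 1999 ..."],  # truncated placeholder input
    return_tensors="pt", truncation=True, max_length=512,
)
# "agriculture" is a hypothetical target label; the card does not list the label set.
labels = tokenizer(["agriculture"], return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```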
### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is evaluated on the classification test dataset, it achieves the following results: Test results: | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_de | 0.6358 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
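The preprocessing step above describes building a unigram vocabulary over the parallel corpus. The sketch below shows how such a vocabulary could be produced with the sentencepiece library; it is an illustration under stated assumptions, not the authors' exact pipeline — the corpus file name, `vocab_size`, and `character_coverage` are placeholders, since the card does not specify them.

```python
# Hedged sketch of the vocabulary-building step: training a unigram
# SentencePiece model on a line-per-sentence corpus file. File name and
# hyperparameters are assumed, not taken from the card.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",    # assumed: one sentence per line, all language pairs
    model_prefix="legal_t5_unigram",
    model_type="unigram",           # the unigram model described above
    vocab_size=32000,               # assumed; the card does not state the vocabulary size
    character_coverage=1.0,         # keep all characters (legal text contains accents)
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Der Rat der Europäischen Union", out_type=str))
```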
{"language": "Deustch", "tags": ["classification Deustch model"], "datasets": ["jrc-acquis"], "widget": [{"text": "BESCHLUSS DES RATES vom 17. Dezember 1999 \u00fcber den Abschlu\u00df des Abkommens in Form eines Briefwechsels zwischen der Europ\u00e4ischen Gemeinschaft und der Tunesischen Republik \u00fcber die Regelung f\u00fcr die Einfuhr von nicht behandeltem Oliven\u00f6l mit Ursprung in Tunesien in die Gemeinschaft (1999/873/EG) DER RAT DER EUROP\u00c4ISCHEN UNION - gest\u00fctzt auf den Vertrag zur Gr\u00fcndung der Europ\u00e4ischen Gemeinschaft, insbesondere auf Artikel 133 in Verbindung mit Artikel 300 Absatz 2 Unterabsatz 1, auf Vorschlag der Kommission, in Erw\u00e4gung nachstehender Gr\u00fcnde: (1) Zwischen der Europ\u00e4ischen Gemeinschaft und der Tunesischen Republik wurde ein Abkommen in Form eines Briefwechsels ausgehandelt, um die Geltungsdauer der Regelung f\u00fcr die Einfuhr von nicht behandeltem Oliven\u00f6l mit Ursprung in Tunesien in die Gemeinschaft, die in Artikel 3 des Protokolls Nr. 1 des Europa-Mittelmeer-Abkommens zur Gr\u00fcndung einer Assoziation zwischen der Europ\u00e4ischen Gemeinschaft und ihren Mitgliedstaaten einerseits und der Tunesischen Republik andererseits(1) vorgesehen ist, f\u00fcr die Zeit vom 1. Januar bis zum 31. Dezember 2000 zu verl\u00e4ngern. (2) Das Abkommen sollte im Namen der Gemeinschaft genehmigt werden - BESCHLIESST: Artikel 1 Das Abkommen in Form eines Briefwechsels zwischen der Europ\u00e4ischen Gemeinschaft und der Tunesischen Republik \u00fcber die Regelung f\u00fcr die Einfuhr von nicht behandeltem Oliven\u00f6l mit Ursprung in Tunesien in die Gemeinschaft wird im Namen der Gemeinschaft genehmigt. Der Wortlaut des Abkommens ist diesem Beschlu\u00df beigef\u00fcgt. Artikel 2 Der Pr\u00e4sident des Rates wird erm\u00e4chtigt, die Person zu bestellen, die befugt ist, das Abkommen rechtsverbindlich f\u00fcr die Gemeinschaft zu unterzeichnen. Geschehen zu Br\u00fcssel am 17. Dezember 1999. Im Namen des Rates Der Pr\u00e4sident K. HEMIL\u00c4 (1) ABl. L 97 vom 30.3.1998, S. 1."}]}
text2text-generation
SEBIS/legal_t5_small_cls_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification Deustch model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_de model =============================== Model for classification of legal text written in German. It was first released in this repository. This model was trained on the jrc-acquis parallel corpus. Model description ----------------- legal\_t5\_small\_cls\_de is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in German. ### How to use Here is how to use this model to classify legal text written in German in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_de model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is evaluated on the classification test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 67, 154, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09498060494661331, 0.10631866753101349, -0.0027729652356356382, 0.067763552069664, 0.10088157653808594, 0.010976369492709637, 0.08326467871665955, 0.10211289674043655, -0.08559392392635345, 0.0702027752995491, 0.07146129757165909, 0.04897056892514229, 0.0868600457906723, 0.10577227175235748, 0.04407396912574768, -0.2195199579000473, 0.01609150320291519, -0.00806824117898941, -0.004125695675611496, 0.13827921450138092, 0.12032391875982285, -0.09630552679300308, 0.031251173466444016, -0.037091441452503204, -0.12691091001033783, -0.002153773093596101, -0.06596589088439941, -0.0583309568464756, 0.0677930936217308, 0.029092470183968544, 0.11399747431278229, 0.018641319125890732, 0.08120507001876831, -0.19734413921833038, 0.000024040895368671045, 0.09835304319858551, 0.040233783423900604, 0.0412052683532238, 0.05236604064702988, -0.022344008088111877, 0.14871428906917572, -0.024624818935990334, 0.0834062471985817, 0.029271982610225677, -0.13502569496631622, -0.1274217963218689, -0.06577004492282867, 0.07954002916812897, 0.1566900908946991, 0.1340462863445282, -0.05786263197660446, 0.0572521910071373, -0.09831804037094116, 0.05091886222362518, 0.04873185604810715, -0.2415914535522461, -0.06675098836421967, 0.04542059078812599, 0.048811644315719604, 0.07211709022521973, -0.05648813396692276, -0.05475732311606407, 0.046964604407548904, 0.03069397434592247, 0.019182780757546425, -0.014572035521268845, 0.02324320375919342, -0.0036624385975301266, -0.1654694825410843, -0.09636122733354568, 0.16112932562828064, 0.0044255913235247135, -0.07876840233802795, -0.08811012655496597, -0.020122220739722252, -0.17426669597625732, 0.03604967147111893, -0.06514438986778259, 0.044381432235240936, -0.0023068708833307028, 0.021045809611678123, 0.0018470017239451408, -0.11913216859102249, -0.10834521055221558, 0.046457476913928986, 0.07833191007375717, 0.07962403446435928, -0.01507584284991026, -0.005146742332726717, 0.16108354926109314, 0.01289395522326231, -0.08707988262176514, -0.035148851573467255, 0.0006068247021175921, -0.12419337034225464, -0.044959958642721176, -0.03939833492040634, -0.09826089441776276, -0.05742575600743294, 0.1451655775308609, -0.01350847352296114, 0.05885590985417366, 0.041510991752147675, 0.04154915735125542, 0.01653790846467018, 0.13250671327114105, -0.062204759567976, -0.037388723343610764, -0.05229416489601135, 0.06541813910007477, -0.05856025218963623, 0.00030000373953953385, -0.012369541451334953, 0.011055346578359604, 0.07097157090902328, 0.056439705193042755, -0.08076183497905731, 0.009265264496207237, -0.057607121765613556, -0.03520337864756584, 0.01795920915901661, -0.11312877386808395, -0.03850531578063965, -0.004289926029741764, -0.0847473293542862, -0.07296376675367355, 0.08737952262163162, -0.0007987850694917142, -0.10149333626031876, 0.07781805098056793, -0.040099721401929855, -0.024363137781620026, -0.1258600652217865, -0.08623438328504562, -0.019999004900455475, -0.0714171752333641, -0.03781396523118019, -0.07742942869663239, -0.16635224223136902, -0.11286655813455582, 0.07579127699136734, -0.041334230452775955, -0.0543973334133625, -0.06772122532129288, -0.016372213140130043, -0.005305750761181116, -0.030172904953360558, 0.09444767981767654, -0.010843624360859394, 0.09314261376857758, 0.027023758739233017, 0.04307182505726814, 0.1327609270811081, 0.07524226605892181, -0.08950980752706528, 0.033389583230018616, -0.08398708701133728, 0.1473025530576706, -0.005548875778913498, 0.020122269168496132, -0.15166310966014862, -0.06819220632314682, -0.06686185300350189, 
0.05356430262327194, 0.09391559660434723, 0.09635014832019806, -0.15262500941753387, -0.0019500618800520897, 0.1816958636045456, -0.09788845479488373, -0.05702505633234978, 0.08339987695217133, -0.05600542202591896, 0.146455779671669, 0.0784449428319931, 0.15986692905426025, 0.07947317510843277, -0.055323462933301926, -0.010209637694060802, -0.050068508833646774, 0.009838389232754707, 0.02157493494451046, 0.08130277693271637, -0.03449156880378723, -0.08827732503414154, -0.030466511845588684, -0.095829077064991, 0.004636083729565144, -0.06250698119401932, -0.06760106235742569, 0.020337190479040146, -0.06234303116798401, -0.05523413047194481, 0.05446363613009453, 0.034802209585905075, -0.034547124058008194, -0.12424176186323166, 0.011909033171832561, 0.0954769104719162, -0.06777315586805344, 0.019270505756139755, -0.0724339634180069, -0.06125331670045853, -0.07869593054056168, -0.011740482412278652, -0.17679037153720856, 0.05263569951057434, 0.03795892372727394, -0.04699577018618584, 0.04560832306742668, 0.04242630675435066, 0.023205716162919998, 0.05181745067238808, 0.0009515053243376315, -0.053189653903245926, -0.06833332777023315, -0.024916211143136024, -0.1275140345096588, -0.12684127688407898, -0.046710237860679626, -0.02953915297985077, 0.1264979988336563, -0.18736767768859863, 0.0495753288269043, -0.06553860753774643, 0.03416357561945915, -0.015668151900172234, -0.05913582444190979, 0.022264784201979637, 0.034249503165483475, 0.00578311737626791, -0.06888534873723984, 0.046710509806871414, 0.0423240102827549, 0.010242762975394726, 0.02960011549293995, -0.1343948245048523, -0.14197254180908203, 0.08015148341655731, 0.04873053729534149, -0.17323540151119232, 0.0027734364848583937, -0.051196109503507614, -0.06453658640384674, -0.05613767355680466, -0.003050079569220543, 0.25239214301109314, -0.0027209182735532522, 0.1349053680896759, -0.09177356958389282, -0.07068279385566711, -0.004787006881088018, -0.010945208370685577, 0.020503487437963486, 0.11355473101139069, 0.08521970361471176, -0.0868443176150322, 0.056712497025728226, 0.03363259509205818, -0.027944043278694153, 0.12071964889764786, 0.016886815428733826, -0.10456070303916931, -0.007925237528979778, 0.06231851875782013, -0.01395986508578062, 0.08335869759321213, -0.16517198085784912, -0.0006522861076518893, 0.0007209403556771576, 0.034267593175172806, 0.04511693865060806, -0.16146031022071838, 0.02118077501654625, 0.06879852712154388, -0.02401317097246647, 0.013676454313099384, -0.0428493432700634, -0.06507983058691025, 0.07449182122945786, 0.04114982858300209, -0.013895999640226364, 0.002249444369226694, -0.03860361501574516, -0.13964302837848663, 0.2094137966632843, -0.057716358453035355, -0.15015077590942383, -0.09027417749166489, 0.09622074663639069, 0.06610557436943054, 0.004964236170053482, 0.03755621239542961, -0.08083341270685196, -0.043335527181625366, -0.09298940747976303, 0.09761001914739609, -0.06808144599199295, -0.02324380911886692, -0.05219060927629471, -0.004503882490098476, 0.02016414888203144, -0.12683546543121338, 0.026048768311738968, -0.02981579303741455, -0.0936131700873375, 0.00514746131375432, -0.01999759115278721, 0.06893183290958405, 0.16470542550086975, -0.010661373846232891, 0.02530897781252861, 0.003052341751754284, 0.18031442165374756, -0.122700534760952, 0.010500556789338589, 0.08003967255353928, 0.007461567874997854, 0.004752535838633776, 0.10012921690940857, -0.001182564185000956, -0.10322977602481842, 0.07193727791309357, 0.06252226233482361, -0.03767108917236328, -0.26784950494766235, 
-0.0007624406134709716, -0.015513972379267216, -0.03213486820459366, 0.12730784714221954, 0.04836934059858322, 0.04526738449931145, 0.04783304035663605, -0.021581606939435005, 0.023676550015807152, 0.031793732196092606, 0.07235856354236603, -0.03422623500227928, 0.002486140001565218, 0.07722067087888718, -0.04632740840315819, -0.022061031311750412, 0.03168834000825882, 0.012395568192005157, 0.23517899215221405, -0.02505520172417164, 0.08771022409200668, 0.10195904970169067, 0.07955052703619003, 0.0035667854826897383, 0.058674223721027374, -0.03735332563519478, 0.024745145812630653, 0.0021017217077314854, -0.032233141362667084, -0.07944917678833008, 0.06381747126579285, -0.019955608993768692, 0.033919792622327805, -0.10757771134376526, -0.020525705069303513, 0.013552725315093994, 0.32341504096984863, 0.06605211645364761, -0.2505691349506378, -0.07456351816654205, 0.011869742535054684, -0.06597618013620377, -0.0960785448551178, 0.061372313648462296, 0.06412462145090103, -0.11316914856433868, -0.0006232879240997136, -0.032882366329431534, 0.09241735935211182, -0.11003069579601288, -0.03718522936105728, 0.09539994597434998, 0.035148341208696365, -0.009326684288680553, 0.08643634617328644, -0.30295777320861816, 0.15651361644268036, -0.011842510662972927, 0.11927524954080582, -0.01793641783297062, 0.03265697881579399, -0.05939106270670891, 0.007429539225995541, 0.15812747180461884, -0.004558064509183168, 0.05610266327857971, -0.08854444324970245, -0.08385809510946274, 0.014496730640530586, 0.05706649646162987, -0.10965033620595932, 0.09683623909950256, 0.019405636936426163, 0.033064812421798706, 0.008054331876337528, -0.06815499067306519, -0.14024251699447632, -0.1132156029343605, 0.007299174088984728, -0.08577211946249008, 0.0784197449684143, -0.05712752044200897, -0.03872545063495636, 0.005396158900111914, 0.1688753366470337, -0.17663989961147308, -0.07275937497615814, -0.08213687688112259, 0.044432513415813446, 0.06317193061113358, -0.0259912870824337, -0.011362369172275066, 0.020446866750717163, 0.03150137513875961, -0.00729577150195837, 0.004350676666945219, 0.09024783223867416, -0.05816637724637985, -0.15354077517986298, -0.06173969432711601, 0.11616791784763336, 0.13432632386684418, 0.05364570766687393, -0.028459006920456886, 0.01725723035633564, -0.010468815453350544, -0.08056237548589706, 0.015157361514866352, 0.03708949685096741, 0.053012967109680176, 0.006286253686994314, -0.044506121426820755, 0.0239748265594244, -0.09057941287755966, -0.04289831593632698, 0.08280570060014725, 0.1409299224615097, -0.0465364083647728, 0.059989895671606064, 0.14859013259410858, -0.12068802118301392, -0.1638631373643875, 0.047246288508176804, 0.09288130700588226, 0.05987074226140976, -0.07626741379499435, -0.217109814286232, 0.06325367838144302, 0.09051459282636642, 0.007229667156934738, -0.025732431560754776, -0.376983642578125, -0.1294165551662445, 0.09680475294589996, 0.08560731261968613, -0.04918045178055763, -0.11173103004693985, -0.020445769652724266, 0.02596520259976387, -0.01646900549530983, 0.12140035629272461, -0.03873869404196739, 0.08102339506149292, 0.016537591814994812, -0.06612769514322281, 0.039394207298755646, -0.059626419097185135, 0.11738614737987518, 0.08740922063589096, 0.07379479706287384, -0.05013485625386238, 0.016033334657549858, 0.02209182269871235, -0.03744471073150635, 0.13248848915100098, 0.046805545687675476, 0.036065373569726944, -0.2040836215019226, -0.07146705687046051, -0.07462870329618454, -0.002562455367296934, -0.080815888941288, -0.06799736618995667, 
-0.03756299987435341, 0.09443190693855286, 0.054553236812353134, -0.009460809640586376, 0.016561217606067657, -0.06656304746866226, 0.02964150346815586, 0.08147846907377243, 0.11960281431674957, 0.09673786163330078, -0.0770580992102623, 0.03028213419020176, 0.03404742851853371, 0.10245572775602341, -0.17800737917423248, -0.01464912761002779, 0.12498122453689575, -0.004419917706400156, 0.15659216046333313, 0.016238396987318993, -0.14371651411056519, -0.0057874927297234535, 0.04288073256611824, -0.10425208508968353, -0.09154733270406723, -0.012469232082366943, -0.010941927321255207, -0.07456845045089722, -0.03171728178858757, 0.059221602976322174, -0.1222638264298439, -0.015083987265825272, -0.007470021024346352, 0.042453039437532425, -0.08012975752353668, 0.19081994891166687, 0.05498881638050079, 0.06555995345115662, -0.05429040268063545, 0.1253494769334793, 0.11683658510446548, -0.1274956464767456, 0.02384025976061821, 0.14795120060443878, -0.0799640342593193, -0.053764257580041885, -0.016974445432424545, 0.12372902780771255, -0.0073498329147696495, -0.06532661616802216, -0.04999231547117233, -0.05106790363788605, 0.07087567448616028, 0.02008170261979103, 0.040906742215156555, 0.04427969455718994, -0.024214012548327446, 0.017617832869291306, -0.15265724062919617, 0.06969605386257172, 0.0585041381418705, 0.023925576359033585, -0.04260041192173958, 0.19296854734420776, 0.05519161745905876, 0.04384804517030716, 0.0038838943000882864, -0.044518306851387024, -0.05730503052473068, 0.05952905863523483, -0.007683552335947752, -0.022058043628931046, -0.06370166689157486, -0.006066102534532547, -0.008171251974999905, 0.0111252935603261, 0.00479884585365653, 0.022449223324656487, -0.05154474452137947, -0.034554339945316315, -0.025669416412711143, 0.05108492076396942, -0.06501257419586182, -0.01269562728703022, -0.011121518909931183, -0.0757099986076355, 0.08050287514925003, -0.0036026693414896727, 0.010104302316904068, -0.02405412681400776, -0.018293259665369987, 0.10138636827468872, -0.015430363826453686, -0.004082202911376953, -0.035653237253427505, -0.10827769339084625, 0.03379020094871521, -0.032812003046274185, -0.04710322991013527, -0.013192662037909031, 0.05005582049489021, -0.14194951951503754, 0.05337011069059372, -0.014913673512637615, -0.0008480316610075533, -0.0811639130115509, 0.08795510232448578, 0.02592075802385807, 0.08603651076555252, 0.09520606696605682, -0.06161794066429138, 0.07091601938009262, -0.15135705471038818, -0.04434560239315033, 0.029783865436911583, 0.04408705234527588, -0.05333474278450012, -0.06048464775085449, 0.061977557837963104, -0.05239037796854973, 0.07446064054965973, 0.04288125038146973, 0.0335913822054863, 0.04099109768867493, -0.1215580627322197, -0.010578225366771221, 0.05135862156748772, 0.05546132102608681, -0.008234480395913124, -0.00011488272139104083, 0.004659538622945547, 0.038958754390478134, -0.006947811227291822, 0.06789970397949219, 0.1355806142091751, 0.2082955539226532, 0.06315023452043533, 0.0691121444106102, -0.03534788638353348, -0.11043582856655121, -0.0919230654835701, 0.15248867869377136, -0.0001914776221383363, 0.020534340292215347, -0.03000636026263237, 0.09521077573299408, 0.08514190465211868, -0.15441787242889404, 0.06157761067152023, -0.010612967424094677, -0.09181886166334152, -0.08638343214988708, -0.08059515058994293, -0.02346840687096119, -0.04352383315563202, -0.01022853422909975, -0.0986369252204895, 0.019219309091567993, 0.06701622903347015, 0.04733528196811676, -0.027791738510131836, 0.13648346066474915, 0.011970199644565582, 
-0.04163104668259621, 0.08594538271427155, -0.0068655055947601795, 0.08714772760868073, -0.08750292658805847, 0.004431289620697498, 0.028818542137742043, -0.01691012643277645, 0.06039419770240784, 0.0006632705335505307, -0.02006538026034832, 0.031002918258309364, 0.04534274712204933, -0.062040429562330246, 0.011805702932178974, 0.022043392062187195, 0.10613180696964264, 0.08892404288053513, 0.06114798039197922, -0.04181407764554024, -0.011622807942330837, 0.1932801902294159, -0.0446477010846138, -0.07101613283157349, -0.16237428784370422, 0.1959824562072754, 0.08916772902011871, 0.016083236783742905, 0.036826241761446, -0.1117091178894043, 0.018377356231212616, 0.1879180669784546, 0.1171615794301033, -0.029095150530338287, -0.022866884246468544, 0.03728224337100983, -0.003259437857195735, 0.019652927294373512, 0.09925734251737595, 0.05786575749516487, 0.17134861648082733, -0.094303198158741, 0.04207306727766991, -0.05970097705721855, -0.058621253818273544, 0.02638077735900879, 0.14802585542201996, 0.01903938129544258, 0.002506210934370756, -0.05363503098487854, 0.10147111862897873, -0.009767143055796623, -0.2107289880514145, 0.10663987696170807, -0.0831049233675003, -0.12099311500787735, -0.020490430295467377, -0.01124535407871008, -0.02259165421128273, 0.052412230521440506, -0.0041369544342160225, -0.015278157778084278, 0.11583811789751053, 0.035999856889247894, -0.06415094435214996, -0.12241624295711517, 0.0804244726896286, 0.01524319127202034, 0.15690800547599792, -0.014727957546710968, 0.08061192184686661, 0.08445873856544495, 0.028035245835781097, -0.1090550422668457, 0.08266670256853104, 0.03923529386520386, 0.02807433344423771, 0.033053454011678696, 0.1321609765291214, -0.026889843866229057, 0.11926646530628204, 0.040538936853408813, -0.13593432307243347, 0.040981192141771317, -0.12979134917259216, -0.05829323083162308, -0.18569041788578033, 0.033007942140102386, -0.036753490567207336, 0.1456981897354126, 0.2049882560968399, -0.04532584547996521, 0.000860550906509161, -0.06430086493492126, 0.026309784501791, -0.014655856415629387, 0.166172057390213, 0.00689287856221199, -0.17545312643051147, 0.021320411935448647, -0.09186716377735138, 0.04977891594171524, -0.23195278644561768, -0.03663361072540283, 0.034429073333740234, -0.08058211952447891, -0.03229859471321106, 0.11659520119428635, 0.04119731858372688, 0.07144795358181, -0.05658711865544319, -0.01282321847975254, -0.021340975537896156, 0.15658962726593018, -0.12922775745391846, -0.10944851487874985 ]
null
null
transformers
# legal_t5_small_cls_en model Model for classification of legal text written in English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was trained on the jrc-acquis parallel corpus. ## Model description legal_t5_small_cls_en is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in English. ### How to use Here is how to use this model to classify legal text written in English in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_en"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_cls_en", do_lower_case=False, skip_special_tokens=True), device=0 ) en_text = "Appointment of members of the Conciliation Body instituted by Commission Decision 94/442/EC of 1 July 1994 setting up a conciliation procedure in the context of the clearance of the accounts of the European Agricultural Guidance and Guarantee Fund (EAGGF) Guarantee Section (2006/C 193/09) (1) The Commission has renewed the term of office of: Mr José Luis SAENZ GARCIA-BAQUERO (ES) (from 1 August 2006 to 31 July 2007). (2) The Commission has appointed as members: - Mr Peter BAUMANN (DA) (from 1 August 2006 to 31 July 2009); - Mr Daniel PERRIN (FR) (from 1 August 2006 to 31 July 2009). (3) The Commission has appointed as substitute members: - Mr Robert BURIAN (A) (from 1 August 2006); - Mr Eduardo DIEZ PATIER (ES) (from 1 August 2006). --------------------------------------------------" pipeline([en_text], max_length=512) ``` ## Training data The legal_t5_small_cls_en model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 19 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is evaluated on the classification test dataset, it achieves the following results: Test results: | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_en | 0.6247 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
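The card reports a single F1 score on the held-out test set. A minimal sketch of how such a score could be computed with scikit-learn is shown below; the reference and prediction lists are hypothetical, and the `average="macro"` choice is an assumption, since the card does not specify the averaging mode.

```python
# Hedged sketch of the evaluation step: comparing generated class labels
# against reference labels with an F1 score. Label values and the "macro"
# averaging mode are assumptions; the card only reports one F1 number.
from sklearn.metrics import f1_score

references = ["agriculture", "external relations", "agriculture"]  # placeholder gold labels
predictions = ["agriculture", "external relations", "finance"]     # placeholder model outputs

print(f1_score(references, predictions, average="macro"))
```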
{"language": "English", "tags": ["classification English model"], "datasets": ["jrc-acquis"], "widget": [{"text": "Appointment of members of the Conciliation Body instituted by Commission Decision 94/442/EC of 1 July 1994 setting up a conciliation procedure in the context of the clearance of the accounts of the European Agricultural Guidance and Guarantee Fund (EAGGF) Guarantee Section (2006/C 193/09) (1) The Commission has renewed the term of office of: Mr Jos\u00e9 Luis SAENZ GARCIA-BAQUERO (ES) (from 1 August 2006 to 31 July 2007). (2) The Commission has appointed as members: - Mr Peter BAUMANN (DA) (from 1 August 2006 to 31 July 2009); - Mr Daniel PERRIN (FR) (from 1 August 2006 to 31 July 2009). (3) The Commission has appointed as substitute members: - Mr Robert BURIAN (A) (from 1 August 2006); - Mr Eduardo DIEZ PATIER (ES) (from 1 August 2006). --------------------------------------------------"}]}
text2text-generation
SEBIS/legal_t5_small_cls_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification English model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_en model =============================== Model for classification of legal text written in English. It was first released in this repository. This model was trained on the jrc-acquis parallel corpus. Model description ----------------- legal\_t5\_small\_cls\_en is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in English. ### How to use Here is how to use this model to classify legal text written in English in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_en model was trained on the JRC-ACQUIS dataset, consisting of 19 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is evaluated on the classification test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_en model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_en model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 65, 152, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_en model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07821981608867645, 0.1153092235326767, -0.00326361577026546, 0.07108616828918457, 0.09659197181463242, 0.02034049667418003, 0.095726378262043, 0.11447031050920486, -0.04918750375509262, 0.05899059772491455, 0.048923250287771225, 0.03850183263421059, 0.07613936066627502, 0.08509082347154617, 0.03773452714085579, -0.23139870166778564, 0.009142701514065266, -0.023044563829898834, 0.0022198513615876436, 0.152445450425148, 0.12740705907344818, -0.09163787215948105, 0.046469688415527344, -0.011875180527567863, -0.11325009912252426, -0.006237728055566549, -0.06880297511816025, -0.056236278265714645, 0.062391843646764755, 0.028907157480716705, 0.11755206435918808, 0.010463966988027096, 0.07347241044044495, -0.20630985498428345, 0.006888235919177532, 0.07684504240751266, 0.02780635468661785, 0.0583077035844326, 0.08623535931110382, -0.03985271602869034, 0.16259226202964783, -0.012256105430424213, 0.07857810705900192, 0.04222492873668671, -0.13111598789691925, -0.12350863963365555, -0.05151427537202835, 0.10887311398983002, 0.15057989954948425, 0.13471560180187225, -0.05657346174120903, 0.10228848457336426, -0.09408433735370636, 0.061223335564136505, 0.062186695635318756, -0.23534506559371948, -0.05879075452685356, 0.04725731164216995, 0.05307389423251152, 0.07839512825012207, -0.0784042626619339, -0.06979527324438095, 0.040375273674726486, 0.03428129479289055, 0.0001952073653228581, -0.016888892278075218, 0.020382875576615334, 0.009098412469029427, -0.17440326511859894, -0.09277782589197159, 0.16971787810325623, 0.01639021933078766, -0.07667261362075806, -0.08095753937959671, -0.012613365426659584, -0.14749954640865326, 0.03691648319363594, -0.03836563229560852, 0.023670729249715805, -0.0046492996625602245, 0.047324348241090775, -0.014141032472252846, -0.1293867975473404, -0.08886687457561493, 0.014867329970002174, 0.08763833343982697, 0.07346190512180328, -0.010775510221719742, -0.0040050954557955265, 0.1488863080739975, -0.003264702158048749, -0.0820203423500061, -0.02691095694899559, -0.015893438830971718, -0.11350341141223907, -0.0518357940018177, -0.03217174485325813, -0.10833136737346649, -0.03086206689476967, 0.10589250177145004, -0.023041214793920517, 0.054038070142269135, 0.03501882404088974, 0.04866882786154747, 0.031847529113292694, 0.1503613442182541, -0.05819657817482948, 0.013365196995437145, -0.05380578339099884, 0.05806514620780945, -0.07334233820438385, 0.011627755127847195, -0.025219067931175232, 0.01678229309618473, 0.0583437904715538, 0.06262873113155365, -0.06436300277709961, 0.01578090526163578, -0.06864701956510544, -0.043121207505464554, 0.022145850583910942, -0.12499241530895233, -0.017882607877254486, 0.017805660143494606, -0.08317401260137558, -0.07360951602458954, 0.09339205175638199, -0.0030348957516252995, -0.11050132662057877, 0.07070257514715195, -0.036480098962783813, -0.006945236586034298, -0.12000346183776855, -0.08480437844991684, -0.010470353066921234, -0.06378720700740814, -0.040738362818956375, -0.07250307500362396, -0.14185546338558197, -0.10110873728990555, 0.06316553801298141, -0.041892245411872864, -0.03871471807360649, -0.05190827324986458, -0.012882603332400322, -0.007728829514235258, -0.008314300328493118, 0.0810483917593956, -0.0253126360476017, 0.08500910550355911, 0.003667181823402643, 0.0327133871614933, 0.10724391788244247, 0.0664156898856163, -0.10918481647968292, 0.02825700305402279, -0.07568513602018356, 0.12862266600131989, -0.027055349200963974, -0.0006900843000039458, -0.16330723464488983, -0.08226801455020905, -0.04916025698184967, 
0.032787349075078964, 0.06121382117271423, 0.1063818633556366, -0.16006672382354736, 0.004908512346446514, 0.16691696643829346, -0.09817547351121902, -0.062379829585552216, 0.10253365337848663, -0.05285605415701866, 0.12469407916069031, 0.05933333560824394, 0.14282585680484772, 0.08269733190536499, -0.057691365480422974, -0.0029602868016809225, -0.04381127655506134, 0.0008243717602454126, 0.04997570440173149, 0.10313263535499573, -0.04894229769706726, -0.0933900773525238, -0.035772375762462616, -0.0875464379787445, -0.005559196695685387, -0.06245250627398491, -0.06968529522418976, 0.007731457706540823, -0.04888096824288368, -0.02748277597129345, 0.04807272180914879, 0.014686592854559422, -0.03373304009437561, -0.12417170405387878, 0.008383752778172493, 0.10424040257930756, -0.05769840627908707, 0.010107897222042084, -0.08869696408510208, -0.026596175506711006, -0.06961750984191895, -0.02879924140870571, -0.18234246969223022, 0.059799596667289734, 0.03639698028564453, -0.04686756059527397, 0.07987003773450851, 0.0518963597714901, 0.025175634771585464, 0.04834078997373581, -0.008966824971139431, -0.05677203834056854, -0.05249470844864845, -0.013515976257622242, -0.12285918742418289, -0.1453331857919693, -0.03331633284687996, -0.023323550820350647, 0.106151282787323, -0.19101014733314514, 0.036522313952445984, -0.03729803487658501, 0.05470098927617073, -0.009545031003654003, -0.049962278455495834, 0.034211330115795135, 0.021616579964756966, 0.0022130629513412714, -0.055574119091033936, 0.043417368084192276, 0.021744893863797188, -0.015952911227941513, 0.04396320506930351, -0.11777530610561371, -0.1411944478750229, 0.08138871192932129, 0.04372590035200119, -0.15760132670402527, -0.008067959919571877, -0.062039565294981, -0.054661281406879425, -0.0554160512983799, 0.004752157721668482, 0.2574408948421478, 0.00014697913138661534, 0.13046085834503174, -0.09706741571426392, -0.07149916142225266, -0.013791440054774284, -0.033741604536771774, 0.003209542715921998, 0.1433074176311493, 0.07193566858768463, -0.1307685524225235, 0.053100377321243286, 0.041457924991846085, -0.03458001837134361, 0.13196024298667908, 0.0017674044938758016, -0.10883157700300217, -0.014713761396706104, 0.06525879353284836, -0.0017488559242337942, 0.0797734335064888, -0.15686482191085815, 0.02239229530096054, 0.02427155151963234, 0.03383021056652069, 0.05043312534689903, -0.15145070850849152, 0.022224251180887222, 0.05833146721124649, -0.02973516657948494, 0.006464134436100721, -0.031613223254680634, -0.031095292419195175, 0.08287955075502396, 0.02070721611380577, -0.006112460047006607, -0.0036203048657625914, -0.0450899638235569, -0.15138931572437286, 0.2118409425020218, -0.06662385165691376, -0.1472257673740387, -0.1256776750087738, 0.07035252451896667, 0.04814139008522034, 0.004953705705702305, 0.03555785492062569, -0.07964504510164261, -0.05112040787935257, -0.11590857803821564, 0.09201540797948837, -0.04927593842148781, -0.01890815794467926, -0.07326806336641312, 0.0032457250636070967, 0.03322381153702736, -0.12582910060882568, 0.02617248333990574, -0.00833484623581171, -0.07711853086948395, 0.0105008939281106, -0.037253767251968384, 0.06480471044778824, 0.17885072529315948, -0.03515945002436638, 0.019897157326340675, 0.01486149150878191, 0.19948269426822662, -0.11526644229888916, 0.016712523996829987, 0.08104348182678223, -0.026246771216392517, 0.010901205241680145, 0.09297359734773636, 0.005615347996354103, -0.07646461576223373, 0.06677088886499405, 0.0529550202190876, -0.02519511803984642, -0.30334964394569397, 
-0.03130393475294113, -0.010637599043548107, -0.030342889949679375, 0.12258902192115784, 0.05646367371082306, 0.026560556143522263, 0.051739245653152466, -0.030060406774282455, -0.013086936436593533, 0.01440317090600729, 0.07197026163339615, -0.0278775654733181, 0.010019900277256966, 0.06168380379676819, -0.051900122314691544, -0.022626396268606186, 0.053230732679367065, 0.002013846766203642, 0.2101753205060959, -0.026504050940275192, 0.098543182015419, 0.11438820511102676, 0.0833723247051239, 0.025575904175639153, 0.05484003573656082, -0.04791857302188873, 0.02668454870581627, 0.005884971469640732, -0.03944528475403786, -0.04314873740077019, 0.05846972018480301, -0.0031870801467448473, 0.047492410987615585, -0.09460540860891342, -0.03164772316813469, 0.02755584567785263, 0.2907812297344208, 0.06068083643913269, -0.2406114786863327, -0.07514327764511108, 0.010182865895330906, -0.07662485539913177, -0.09850660711526871, 0.04391733929514885, 0.06721218675374985, -0.12472919374704361, -0.008437679149210453, -0.036872316151857376, 0.10234754532575607, -0.09904909878969193, -0.03367205709218979, 0.07930696755647659, 0.03547664359211922, -0.02455839514732361, 0.1115572452545166, -0.2650264799594879, 0.16206121444702148, -0.010194298811256886, 0.0921386107802391, -0.036444563418626785, 0.011737867258489132, -0.04379449784755707, 0.024478062987327576, 0.14028644561767578, 0.004569562617689371, 0.06146252900362015, -0.1034688949584961, -0.09591120481491089, 0.006294886115938425, 0.05616195127367973, -0.09635438024997711, 0.10442932695150375, 0.008246867917478085, 0.02802211418747902, 0.002301734173670411, -0.08586198836565018, -0.11186254769563675, -0.10104386508464813, 0.010037269443273544, -0.08030209690332413, 0.06377508491277695, -0.05529728904366493, -0.05157160386443138, -0.007899927906692028, 0.17111465334892273, -0.17880754172801971, -0.06198598071932793, -0.09418307989835739, 0.03253968432545662, 0.09589947015047073, -0.035334546118974686, -0.01551140658557415, 0.005895907059311867, 0.0457426942884922, 0.00907640065997839, 0.0026575818192213774, 0.07961311936378479, -0.06530804187059402, -0.16870033740997314, -0.048928190022706985, 0.13777010142803192, 0.1310551017522812, 0.06862425804138184, -0.02036322094500065, 0.013474280945956707, -0.032530639320611954, -0.09661954641342163, 0.007964534685015678, 0.027066245675086975, 0.07133117318153381, 0.05090899020433426, -0.04453820362687111, 0.003940647467970848, -0.10733874142169952, -0.05027560889720917, 0.09036680310964584, 0.11336818337440491, -0.03923383355140686, 0.058068446815013885, 0.1229407787322998, -0.1222480833530426, -0.1495734006166458, 0.03515677899122238, 0.09682386368513107, 0.05472220480442047, -0.03550443425774574, -0.17799685895442963, 0.07256397604942322, 0.079231858253479, 0.0072303530760109425, -0.025870969519019127, -0.35818710923194885, -0.13289107382297516, 0.10847680270671844, 0.08677390962839127, -0.07406915724277496, -0.115047387778759, -0.029049046337604523, 0.03058394230902195, -0.03154492750763893, 0.11772873997688293, -0.03541405871510506, 0.0878227949142456, 0.006003159563988447, -0.038724686950445175, 0.040014151483774185, -0.05551859736442566, 0.1356809437274933, 0.07025328278541565, 0.06544890254735947, -0.05600973963737488, -0.010657534934580326, 0.014252953231334686, -0.02600899711251259, 0.14190010726451874, 0.03210403025150299, 0.050294507294893265, -0.23436212539672852, -0.0670071467757225, -0.07421088218688965, 0.002667925553396344, -0.07305628061294556, -0.058761101216077805, -0.03450021520256996, 
0.09082116186618805, 0.04539576917886734, -0.011404662393033504, 0.015250698663294315, -0.07988134771585464, 0.031227048486471176, 0.12242200970649719, 0.1486203372478485, 0.1106913611292839, -0.09624733775854111, 0.026062453165650368, 0.03128648176789284, 0.0779605507850647, -0.1782604157924652, -0.0020135734230279922, 0.12351341545581818, -0.0024798705708235502, 0.17519621551036835, 0.007559912744909525, -0.15540550649166107, -0.009030808694660664, 0.05428403988480568, -0.11061215400695801, -0.1030815914273262, -0.01889384165406227, 0.019663162529468536, -0.0989380031824112, -0.0442398376762867, 0.06662275642156601, -0.10833176970481873, -0.01738402433693409, 0.0070367492735385895, 0.03867906704545021, -0.0708184540271759, 0.18330426514148712, 0.07781632244586945, 0.07231222093105316, -0.05470985546708107, 0.11141618341207504, 0.12634631991386414, -0.10487115383148193, 0.01981755532324314, 0.15260231494903564, -0.074766144156456, -0.05885036289691925, 0.020067378878593445, 0.13359563052654266, -0.021903706714510918, -0.07234643399715424, -0.042230378836393356, -0.06190350651741028, 0.06832961738109589, 0.03676369786262512, 0.0522272102534771, 0.045011967420578, -0.019620409235358238, -0.0017728424863889813, -0.14083607494831085, 0.09080684185028076, 0.06885702908039093, 0.01643025130033493, -0.047863055020570755, 0.16356289386749268, 0.03106534853577614, 0.04659189283847809, -0.0035092285834252834, -0.03483446687459946, -0.06135835871100426, 0.04606520012021065, -0.045028239488601685, 0.0035724700428545475, -0.04547581449151039, -0.0014904877170920372, -0.00045952945947647095, -0.016903486102819443, 0.005619834642857313, 0.0213165245950222, -0.05619111657142639, -0.029753558337688446, -0.02190673165023327, 0.059777792543172836, -0.07790670543909073, -0.004509579855948687, 0.01735927164554596, -0.060265034437179565, 0.0920686200261116, 0.002863561734557152, -0.011500144377350807, -0.019400248304009438, -0.05478135123848915, 0.09534797072410583, -0.010531721636652946, -0.0005032405606471002, -0.0288167092949152, -0.13701212406158447, 0.03596170246601105, -0.029065806418657303, -0.0413525365293026, 0.008204488083720207, 0.0556182861328125, -0.13253051042556763, 0.039855506271123886, -0.021047556772828102, -0.002917245728895068, -0.08266457170248032, 0.08529032766819, 0.020951155573129654, 0.0683094933629036, 0.1187572330236435, -0.05263816937804222, 0.0846266970038414, -0.16064926981925964, -0.02976507693529129, 0.01887775957584381, 0.04126887023448944, -0.05465155467391014, -0.050361715257167816, 0.05542892962694168, -0.060946762561798096, 0.08865027129650116, 0.05036970600485802, 0.03492646664381027, 0.04514017328619957, -0.07668550312519073, 0.006762422621250153, 0.04761091619729996, 0.06324043869972229, -0.03522421047091484, -0.002391363028436899, 0.0067068287171423435, 0.014262739568948746, -0.02901938371360302, 0.03523324429988861, 0.12291266769170761, 0.20065374672412872, 0.058887653052806854, 0.04335210472345352, -0.009277964010834694, -0.08339371532201767, -0.080893374979496, 0.12621408700942993, 0.01248075906187296, 0.03431596979498863, -0.02045229822397232, 0.11312498897314072, 0.10467038303613663, -0.17053551971912384, 0.07419664412736893, -0.02173496223986149, -0.09634245187044144, -0.07887876778841019, -0.12049137055873871, -0.03520924970507622, -0.05730137228965759, -0.00733741233125329, -0.10687151551246643, 0.022435003891587257, 0.06303474307060242, 0.05337425693869591, -0.035277750343084335, 0.13182693719863892, -0.007711704354733229, -0.052246399223804474, 
0.08296129107475281, -0.006523726973682642, 0.07826671004295349, -0.06979632377624512, 0.02030269242823124, 0.046207647770643234, -0.024845916777849197, 0.058834899216890335, 0.01583041436970234, 0.003153448458760977, 0.013594835996627808, 0.004248442593961954, -0.07402800023555756, 0.023553388193249702, 0.02154996059834957, 0.09333852678537369, 0.10483654588460922, 0.054255008697509766, -0.03036433272063732, -0.01289618480950594, 0.20905552804470062, -0.042496584355831146, -0.08758667856454849, -0.153814435005188, 0.1872694343328476, 0.05997631326317787, 0.0068910266272723675, 0.040285367518663406, -0.11704213917255402, 0.028971726074814796, 0.18319889903068542, 0.12244963645935059, -0.032476335763931274, -0.018158743157982826, 0.0348295196890831, -0.00025951434508897364, 0.029666583985090256, 0.07416721433401108, 0.06500723958015442, 0.15530534088611603, -0.09718047082424164, 0.05408177897334099, -0.06899458914995193, -0.03703860938549042, 0.026188675314188004, 0.1385817527770996, 0.02973734401166439, -0.01685207150876522, -0.05302182585000992, 0.10859633982181549, -0.03207533806562424, -0.19863814115524292, 0.06297305226325989, -0.08122988790273666, -0.13519123196601868, -0.021107086911797523, 0.001286814222112298, -0.003764106659218669, 0.04253166541457176, -0.007005205377936363, -0.029272424057126045, 0.09874490648508072, 0.03379001468420029, -0.07359608262777328, -0.10574345290660858, 0.07392969727516174, -0.03306886553764343, 0.15313373506069183, 0.006220340728759766, 0.11245081573724747, 0.08453990519046783, 0.0023167466279119253, -0.10246726125478745, 0.0854448527097702, 0.040604423731565475, 0.026286548003554344, 0.04955919086933136, 0.1391754448413849, -0.003727947361767292, 0.09452005475759506, 0.04969537630677223, -0.11006543785333633, 0.03063809685409069, -0.13397766649723053, -0.051843561232089996, -0.16703572869300842, 0.0460100881755352, -0.052795179188251495, 0.1580142229795456, 0.21726134419441223, -0.054209716618061066, -0.00011902472760993987, -0.056559257209300995, 0.037938542664051056, -0.014060571789741516, 0.13355489075183868, 0.016200054436922073, -0.1666586846113205, 0.02018304541707039, -0.06493382155895233, 0.049284640699625015, -0.2513975501060486, -0.03936634212732315, 0.02061532624065876, -0.08013490587472916, -0.03272876888513565, 0.1254953294992447, 0.05495050176978111, 0.07068351656198502, -0.05018807575106621, -0.015592031180858612, -0.013159025460481644, 0.14890360832214355, -0.13090743124485016, -0.09139275550842285 ]
null
null
transformers
# legal_t5_small_cls_es model Model for classification of legal text written in Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was trained on the JRC-Acquis parallel corpus. ## Model description legal_t5_small_cls_es is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in Spanish. ### How to use Here is how to use this model to classify legal text written in Spanish in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_es"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_cls_es", do_lower_case=False, skip_special_tokens=True), device=0 ) es_text = "Reglamento (CE) no 90/2001 de la Comisión de 17 de enero de 2001 que modifica el Reglamento (CE) n° 800/1999 por el que se establecen disposiciones comunes de aplicación del régimen de restituciones por exportación de productos agrícolas LA COMISIÓN DE LAS COMUNIDADES EUROPEAS, Visto el Tratado constitutivo de la Comunidad Europea, Visto el Reglamento (CEE) n° 1766/92 del Consejo, de 30 de junio de 1992, por el que se establece la organización común de mercados en el sector de los cereales(1), cuya última modificación la constituye el Reglamento (CE) n° 1666/2000(2), y, en particular, sus artículos 13 y 21, así como las disposiciones correspondientes de los demás Reglamentos por los que se establecen organizaciones comunes de mercados de productos agrícolas, Considerando lo siguiente: (1) En el caso de exportación de productos presentados a granel o en unidades no normalizadas, en los que es evidente que la masa neta exacta de los productos no puede conocerse hasta después de cargar el medio de transporte, el apartado 6 del artículo 5 del Reglamento (CE) n° 800/1999 de la Comisión(3), modificado por el Reglamento (CE) n° 1557/2000(4) establece la aplicación de una reducción de la restitución cuando la masa neta efectivamente cargada sea inferior a un determinado porcentaje de la masa neta estimada. No obstante, para la aplicación de esta disposición conviene tener en cuenta las limitaciones inherentes a los medios de transporte de navegación marítima o interior. En efecto, en el caso de los productos exportados a granel, puede ocurrir que las cantidades declaradas no se carguen en su totalidad debido, en particular, a la decisión del responsable del medio de transporte que puede ordenar la suspensión de la carga por razones técnicas o debido a un exceso de carga imputable a los demás exportadores. (2) Dado que determinados cortes de carne de porcino no se presentan en embalajes ni son, por naturaleza, homogéneos, conviene ampliar la categoría de unidades no normalizadas a este tipo de productos. 
(3) En lo que respecta a la noción de lugar de carga, en el comercio de exportación de productos agrícolas se presenta una multitud de situaciones comerciales y administrativas; por consiguiente, es difícil establecer una norma única y conviene autorizar a los Estados miembros para que determinen el lugar más apropiado para efectuar los controles físicos para los productos agrícolas exportados que se benefician de una restitución. A estos efectos, parece justificado determinar el lugar de carga, de forma diferente, en función de que los productos sean cargados en contenedores o, por el contrario, a granel, en sacos o en cajas y no se carguen posteriormente en contenedores. Asimismo, es conveniente que, cuando existan motivos debidamente justificados, se permita que las autoridades aduaneras acepten para los productos agrícolas que se beneficien, de una restitución declaraciones de exportación presentadas en una oficina de aduanas que no sea la del lugar donde vayan a cargarse los productos. (4) En el caso de los productos sujetos al régimen de mercancías de retorno, es oportuno prever la posibilidad de que la reintroducción se efectúe, bien por el Estado miembros del que sean originarios los productos, bien por el Estado miembro exportador de la primera exportación. (5) Conviene modificar el Reglamento (CE) n° 800/1999 en consecuencia. (6) Las medidas previstas en el presente Reglamento se ajustan al dictamen de todos los Comités de gestión interesados. HA ADOPTADO EL PRESENTE REGLAMENTO: Artículo 1 El Reglamento (CE) n° 800/1999 se modificará como sigue: 1) En el apartado 6 del articulo 5, el párrafo tercero se sustituirá por el texto siguiente: %quot%No se concederá ninguna restitución por la cantidad que sobrepase el 110 % de la masa neta estimada. Cuando la masa efectivamente cargada sea inferior al 90 % de la masa neta estimada, la restitución por la masa neta efectivamente cargada se reducirá un 10 % en relación con la diferencia entre la restitución correspondiente al 90 % de la masa neta estimada y la restitución correspondiente a la masa efectivamente cargada. No obstante, en los casos de exportación par vía marítima o por vía navegable interior, la restitución se pagará por la masa neta efectivamente cargada cuando el exportador pueda aportar la prueba, refrendada por el responsable del medio de transporte, de que el hecho de que no se cargara la totalidad de sus mercancías se debió a las limitaciones inherentes a ese tipo de transporte o a un exceso de carga imputable a uno o a varios de los demás exportadores. En caso de que el exportador haya utilizado el procedimiento de domiciliación previsto en el artículo 283 del Reglamento (CEE) n° 2454/93 serán aplicables las disposiciones del presente párrafo siempre que las autoridades aduaneras hayan autorizado la rectificación de los documentos contables en los que los productos exportados hayan sido inscritos.%quot%. 2) En el apartado 6 del artículo 5, el párrafo cuarto se sustituirá por el texto siguiente: %quot%Se considerarán productos en unidades no estandarizadas los animales vivos, las (medias) canales, los cuartos, partes delanteras, jamones, paletillas, pechos y lomos.%quot%. 3) El apartado 7 del articulo 5 se sustituirá por el texto siguiente: %quot%7. 
Cualquier persona que exporte productos por los cuales solicite la concesión de la restitución estará obligada a lo siguiente: a) presentar la declaración de exportación en la oficina de aduanas competente del lugar en que los productos vayan a cargarse en el transporte que vaya a efectuar la exportación; b) informar a dicha oficina de aduanas, coma mínimo 24 horas antes del comienzo de las operaciones de carga, e indicar la duración prevista de las operaciones de carga; las autoridades competentes podrán modificar el plazo de 24 horas. Se podrá considerar como lugar de carga en el transporte de los productos destinados a la exportación: - en el caso de los productos que se exporten cargados en contenedores, el lugar donde se carguen en éstos las mercancías, - en el caso de los productos que se exporten a granel, en sacos, cajones, cajas, botellas, etc. sin cargarse en contenedores, el lugar donde se cargue el medio de transporte por el que las mercancías vayan a salir del territorio aduanero de la Comunidad. La oficina de aduanas competente podrá autorizar las operaciones de carga una vez aceptada la declaración de exportación y antes de finalizar el plazo a que se refiere la letra b). La oficina de aduanas competente deberá estar en condiciones de realizar el control físico y de aplicar las medidas de identificación necesarias para el transporte hacia la oficina de salida del territorio aduanero de la Comunidad. Si por razones de organización administrativa o por otras razones debidamente justificadas, no pueden aplicarse las disposiciones del párrafo primero, la declaración de exportación, sólo podrá ser presentada en la oficina de aduanas competente del Estado miembro en cuestión, y, en el caso de un control físico de conformidad con el Reglamento (CEE) n° 386/90, el producto presentado deberá ser descargado completamente. No obstante, la descarga completa no será obligatoria cuando las autoridades competentes puedan garantizar la realización de un control físico exhaustivo.%quot%. 4) En el apartado 3 del artículo 25, el último párrafo se sustituirá por el texto siguiente: %quot%La presente disposición sólo se aplicará cuando el régimen de retorno haya sido utilizado en el Estado miembro donde se haya aceptado la declaración de exportación de la primera exportación o en el Estado miembro de origen, de conformidad con el artículo 15 de la Directiva 97/78/CE del Consejo(5), por la que se establecen los principios relativos a la organización de controles veterinarios de los productos que se introduzcan en la Comunidad procedentes de terceros países.%quot%. Artículo 2 El presente Reglamento entrará en vigor el séptimo día siguiente al de su publicación en el Diario Oficial de las Comunidades Europeas. A petición de los exportadores, las disposiciones del apartado 1 del articulo 1 se aplicarán a los expedientes de restituciones que aún no hayan sido cerrados en el momento de la entrada en vigor del presente Reglamento. El presente Reglamento será obligatorio en todos sus elementos y directamente aplicable en cada Estado miembro. Hecho en Bruselas, el 17 de enero de 2001. Por la Comisión Franz Fischler Miembro de la Comisión (1) DO L 181 de 1.7.1992, p. 21. (2) DO L 193 de 29.7.2000, p. 1. (3) DO L 102 de 17.4.1999, p. 11. (4) DO L 179 de 18.7.2000, p. 6. (5) DO L 24 de 30.1.1998, p. 9." pipeline([es_text], max_length=512) ``` ## Training data The legal_t5_small_cls_es model was trained on [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset consisting of 22 Thousand texts. 
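Before the training details, a short continuation of the usage example above may help: with recent `transformers` versions, `TranslationPipeline` returns a list of dicts, and the generated class label is stored under the `translation_text` key. This sketch reuses the `pipeline` and `es_text` objects defined in the snippet above.

```python
# Continuation of the usage example: read the predicted class label.
# Assumes `pipeline` and `es_text` from the snippet above are in scope.
result = pipeline([es_text], max_length=512)
predicted_label = result[0]["translation_text"]  # generated label text
print(predicted_label)
```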
## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used was AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model; a minimal sketch of this step appears after the citation info below. ### Pretraining ## Evaluation results When the model is used on the classification test dataset, it achieves the following results: Test results : | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_es | 0.6318| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
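The preprocessing description above is high-level, so the following is a minimal sketch of how such a unigram vocabulary could be built with the `sentencepiece` library. The corpus file name, model prefix, and vocabulary size are not stated on this card and are assumptions for illustration only.

```python
# Minimal sketch of the vocabulary step described under "Preprocessing".
# Assumptions (not stated on the card): corpus file name, model prefix, vocab size.
import sentencepiece as spm

# Train a unigram tokenizer model on a (hypothetical) dump of the parallel corpus.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",    # hypothetical file holding the 88M corpus lines
    model_prefix="legal_t5_vocab",  # writes legal_t5_vocab.model / legal_t5_vocab.vocab
    model_type="unigram",           # unigram segmentation, as the card describes
    vocab_size=32000,               # assumed size; the card does not specify it
)

# Load the trained model and segment a sample sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("El presente Reglamento entrará en vigor.", out_type=str))
```

In practice the finished vocabulary is what the `AutoTokenizer` in the usage example loads from the model repository, so this step only matters if you want to reproduce the preprocessing from scratch.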
{"language": "Spanish", "tags": ["classification Spanish model"], "datasets": ["jrc-acquis"], "widget": [{"text": "Reglamento (CE) no 90/2001 de la Comisi\u00f3n de 17 de enero de 2001 que modifica el Reglamento (CE) n\u00b0 800/1999 por el que se establecen disposiciones comunes de aplicaci\u00f3n del r\u00e9gimen de restituciones por exportaci\u00f3n de productos agr\u00edcolas LA COMISI\u00d3N DE LAS COMUNIDADES EUROPEAS, Visto el Tratado constitutivo de la Comunidad Europea, Visto el Reglamento (CEE) n\u00b0 1766/92 del Consejo, de 30 de junio de 1992, por el que se establece la organizaci\u00f3n com\u00fan de mercados en el sector de los cereales(1), cuya \u00faltima modificaci\u00f3n la constituye el Reglamento (CE) n\u00b0 1666/2000(2), y, en particular, sus art\u00edculos 13 y 21, as\u00ed como las disposiciones correspondientes de los dem\u00e1s Reglamentos por los que se establecen organizaciones comunes de mercados de productos agr\u00edcolas, Considerando lo siguiente: (1) En el caso de exportaci\u00f3n de productos presentados a granel o en unidades no normalizadas, en los que es evidente que la masa neta exacta de los productos no puede conocerse hasta despu\u00e9s de cargar el medio de transporte, el apartado 6 del art\u00edculo 5 del Reglamento (CE) n\u00b0 800/1999 de la Comisi\u00f3n(3), modificado por el Reglamento (CE) n\u00b0 1557/2000(4) establece la aplicaci\u00f3n de una reducci\u00f3n de la restituci\u00f3n cuando la masa neta efectivamente cargada sea inferior a un determinado porcentaje de la masa neta estimada. No obstante, para la aplicaci\u00f3n de esta disposici\u00f3n conviene tener en cuenta las limitaciones inherentes a los medios de transporte de navegaci\u00f3n mar\u00edtima o interior. En efecto, en el caso de los productos exportados a granel, puede ocurrir que las cantidades declaradas no se carguen en su totalidad debido, en particular, a la decisi\u00f3n del responsable del medio de transporte que puede ordenar la suspensi\u00f3n de la carga por razones t\u00e9cnicas o debido a un exceso de carga imputable a los dem\u00e1s exportadores. (2) Dado que determinados cortes de carne de porcino no se presentan en embalajes ni son, por naturaleza, homog\u00e9neos, conviene ampliar la categor\u00eda de unidades no normalizadas a este tipo de productos. (3) En lo que respecta a la noci\u00f3n de lugar de carga, en el comercio de exportaci\u00f3n de productos agr\u00edcolas se presenta una multitud de situaciones comerciales y administrativas; por consiguiente, es dif\u00edcil establecer una norma \u00fanica y conviene autorizar a los Estados miembros para que determinen el lugar m\u00e1s apropiado para efectuar los controles f\u00edsicos para los productos agr\u00edcolas exportados que se benefician de una restituci\u00f3n. A estos efectos, parece justificado determinar el lugar de carga, de forma diferente, en funci\u00f3n de que los productos sean cargados en contenedores o, por el contrario, a granel, en sacos o en cajas y no se carguen posteriormente en contenedores. Asimismo, es conveniente que, cuando existan motivos debidamente justificados, se permita que las autoridades aduaneras acepten para los productos agr\u00edcolas que se beneficien, de una restituci\u00f3n declaraciones de exportaci\u00f3n presentadas en una oficina de aduanas que no sea la del lugar donde vayan a cargarse los productos. 
(4) En el caso de los productos sujetos al r\u00e9gimen de mercanc\u00edas de retorno, es oportuno prever la posibilidad de que la reintroducci\u00f3n se efect\u00fae, bien por el Estado miembros del que sean originarios los productos, bien por el Estado miembro exportador de la primera exportaci\u00f3n. (5) Conviene modificar el Reglamento (CE) n\u00b0 800/1999 en consecuencia. (6) Las medidas previstas en el presente Reglamento se ajustan al dictamen de todos los Comit\u00e9s de gesti\u00f3n interesados. HA ADOPTADO EL PRESENTE REGLAMENTO: Art\u00edculo 1 El Reglamento (CE) n\u00b0 800/1999 se modificar\u00e1 como sigue: 1) En el apartado 6 del articulo 5, el p\u00e1rrafo tercero se sustituir\u00e1 por el texto siguiente: %quot%No se conceder\u00e1 ninguna restituci\u00f3n por la cantidad que sobrepase el 110 % de la masa neta estimada. Cuando la masa efectivamente cargada sea inferior al 90 % de la masa neta estimada, la restituci\u00f3n por la masa neta efectivamente cargada se reducir\u00e1 un 10 % en relaci\u00f3n con la diferencia entre la restituci\u00f3n correspondiente al 90 % de la masa neta estimada y la restituci\u00f3n correspondiente a la masa efectivamente cargada. No obstante, en los casos de exportaci\u00f3n par v\u00eda mar\u00edtima o por v\u00eda navegable interior, la restituci\u00f3n se pagar\u00e1 por la masa neta efectivamente cargada cuando el exportador pueda aportar la prueba, refrendada por el responsable del medio de transporte, de que el hecho de que no se cargara la totalidad de sus mercanc\u00edas se debi\u00f3 a las limitaciones inherentes a ese tipo de transporte o a un exceso de carga imputable a uno o a varios de los dem\u00e1s exportadores. En caso de que el exportador haya utilizado el procedimiento de domiciliaci\u00f3n previsto en el art\u00edculo 283 del Reglamento (CEE) n\u00b0 2454/93 ser\u00e1n aplicables las disposiciones del presente p\u00e1rrafo siempre que las autoridades aduaneras hayan autorizado la rectificaci\u00f3n de los documentos contables en los que los productos exportados hayan sido inscritos.%quot%. 2) En el apartado 6 del art\u00edculo 5, el p\u00e1rrafo cuarto se sustituir\u00e1 por el texto siguiente: %quot%Se considerar\u00e1n productos en unidades no estandarizadas los animales vivos, las (medias) canales, los cuartos, partes delanteras, jamones, paletillas, pechos y lomos.%quot%. 3) El apartado 7 del articulo 5 se sustituir\u00e1 por el texto siguiente: %quot%7. Cualquier persona que exporte productos por los cuales solicite la concesi\u00f3n de la restituci\u00f3n estar\u00e1 obligada a lo siguiente: a) presentar la declaraci\u00f3n de exportaci\u00f3n en la oficina de aduanas competente del lugar en que los productos vayan a cargarse en el transporte que vaya a efectuar la exportaci\u00f3n; b) informar a dicha oficina de aduanas, coma m\u00ednimo 24 horas antes del comienzo de las operaciones de carga, e indicar la duraci\u00f3n prevista de las operaciones de carga; las autoridades competentes podr\u00e1n modificar el plazo de 24 horas. Se podr\u00e1 considerar como lugar de carga en el transporte de los productos destinados a la exportaci\u00f3n: - en el caso de los productos que se exporten cargados en contenedores, el lugar donde se carguen en \u00e9stos las mercanc\u00edas, - en el caso de los productos que se exporten a granel, en sacos, cajones, cajas, botellas, etc. 
sin cargarse en contenedores, el lugar donde se cargue el medio de transporte por el que las mercanc\u00edas vayan a salir del territorio aduanero de la Comunidad. La oficina de aduanas competente podr\u00e1 autorizar las operaciones de carga una vez aceptada la declaraci\u00f3n de exportaci\u00f3n y antes de finalizar el plazo a que se refiere la letra b). La oficina de aduanas competente deber\u00e1 estar en condiciones de realizar el control f\u00edsico y de aplicar las medidas de identificaci\u00f3n necesarias para el transporte hacia la oficina de salida del territorio aduanero de la Comunidad. Si por razones de organizaci\u00f3n administrativa o por otras razones debidamente justificadas, no pueden aplicarse las disposiciones del p\u00e1rrafo primero, la declaraci\u00f3n de exportaci\u00f3n, s\u00f3lo podr\u00e1 ser presentada en la oficina de aduanas competente del Estado miembro en cuesti\u00f3n, y, en el caso de un control f\u00edsico de conformidad con el Reglamento (CEE) n\u00b0 386/90, el producto presentado deber\u00e1 ser descargado completamente. No obstante, la descarga completa no ser\u00e1 obligatoria cuando las autoridades competentes puedan garantizar la realizaci\u00f3n de un control f\u00edsico exhaustivo.%quot%. 4) En el apartado 3 del art\u00edculo 25, el \u00faltimo p\u00e1rrafo se sustituir\u00e1 por el texto siguiente: %quot%La presente disposici\u00f3n s\u00f3lo se aplicar\u00e1 cuando el r\u00e9gimen de retorno haya sido utilizado en el Estado miembro donde se haya aceptado la declaraci\u00f3n de exportaci\u00f3n de la primera exportaci\u00f3n o en el Estado miembro de origen, de conformidad con el art\u00edculo 15 de la Directiva 97/78/CE del Consejo(5), por la que se establecen los principios relativos a la organizaci\u00f3n de controles veterinarios de los productos que se introduzcan en la Comunidad procedentes de terceros pa\u00edses.%quot%. Art\u00edculo 2 El presente Reglamento entrar\u00e1 en vigor el s\u00e9ptimo d\u00eda siguiente al de su publicaci\u00f3n en el Diario Oficial de las Comunidades Europeas. A petici\u00f3n de los exportadores, las disposiciones del apartado 1 del articulo 1 se aplicar\u00e1n a los expedientes de restituciones que a\u00fan no hayan sido cerrados en el momento de la entrada en vigor del presente Reglamento. El presente Reglamento ser\u00e1 obligatorio en todos sus elementos y directamente aplicable en cada Estado miembro. Hecho en Bruselas, el 17 de enero de 2001. Por la Comisi\u00f3n Franz Fischler Miembro de la Comisi\u00f3n (1) DO L 181 de 1.7.1992, p. 21. (2) DO L 193 de 29.7.2000, p. 1. (3) DO L 102 de 17.4.1999, p. 11. (4) DO L 179 de 18.7.2000, p. 6. (5) DO L 24 de 30.1.1998, p. 9."}]}
text2text-generation
SEBIS/legal_t5_small_cls_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification Spanish model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_es model =============================== Model for classification of legal text written in Spanish. It was first released in this repository. This model is trained on three parallel corpus from jrc-acquis. Model description ----------------- legal\_t5\_small\_cls\_es is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in Spanish. ### How to use Here is how to use this model to classify legal text written in Spanish in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_es model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used for classification test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_es model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_es model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 65, 152, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_es model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0964847132563591, 0.12221065163612366, -0.0038565879222005606, 0.08198046684265137, 0.10635166615247726, 0.010041787289083004, 0.07976964861154556, 0.11021756380796432, -0.051917120814323425, 0.072011798620224, 0.035782087594270706, 0.0602727010846138, 0.07373979687690735, 0.05935586616396904, 0.02122253179550171, -0.22405458986759186, 0.01041076797991991, -0.03448045253753662, -0.008783630095422268, 0.14234337210655212, 0.11584316194057465, -0.08029451221227646, 0.03930147364735603, -0.019667064771056175, -0.11000391840934753, 0.012377938255667686, -0.056291449815034866, -0.08427341282367706, 0.06987476348876953, 0.036902278661727905, 0.12659147381782532, 0.02043496072292328, 0.07047940045595169, -0.17952537536621094, 0.002185793360695243, 0.06799490004777908, 0.016910994425415993, 0.05812464654445648, 0.1114686131477356, -0.05100647360086441, 0.16317561268806458, -0.026409635320305824, 0.07241379469633102, 0.03898244723677635, -0.1577436327934265, -0.12863723933696747, -0.06140410155057907, 0.07568217068910599, 0.1426391899585724, 0.12800545990467072, -0.04714319854974747, 0.0903375893831253, -0.09272871166467667, 0.036868803203105927, 0.06340304017066956, -0.22511214017868042, -0.06318152695894241, 0.03529077395796776, 0.05416841432452202, 0.08734801411628723, -0.05333692580461502, -0.05565879866480827, 0.05536230653524399, 0.020659243687987328, -0.01293174922466278, -0.028563907369971275, -0.003985072020441294, 0.0075798025354743, -0.16335263848304749, -0.11203653365373611, 0.14910748600959778, 0.0027220870833843946, -0.06904038041830063, -0.07425032556056976, -0.017909133806824684, -0.13079293072223663, 0.04319098964333534, -0.04203338921070099, 0.03163067623972893, -0.012465986423194408, 0.06493443250656128, -0.0012136201839894056, -0.12330196797847748, -0.0885474681854248, 0.010417337529361248, 0.0909724161028862, 0.06478841602802277, -0.024488558992743492, 0.012576513923704624, 0.1554969996213913, 0.01347554475069046, -0.09125876426696777, -0.012011582963168621, 0.0017977578099817038, -0.10836409032344818, -0.04449494928121567, -0.03258984163403511, -0.11689459532499313, -0.018017206341028214, 0.07475078850984573, -0.057179439812898636, 0.04761292040348053, 0.034024208784103394, 0.04621206969022751, 0.030887644737958908, 0.14486631751060486, -0.04849787801504135, 0.015605057589709759, -0.06104379519820213, 0.040528882294893265, -0.06897731125354767, 0.008380462415516376, -0.04586942866444588, -0.00032252073287963867, 0.05032062530517578, 0.08362673223018646, -0.04364257678389549, -0.0009388528415001929, -0.07501231878995895, -0.054755814373493195, 0.024055074900388718, -0.11884479224681854, -0.007153768092393875, 0.019914623349905014, -0.08586201816797256, -0.06422743946313858, 0.07011442631483078, -0.010862025432288647, -0.11224237829446793, 0.049490876495838165, -0.034136828035116196, -0.01804998330771923, -0.13438680768013, -0.08827143162488937, 0.00475543225184083, -0.07472231239080429, -0.037154421210289, -0.07708018273115158, -0.12367445975542068, -0.12382441759109497, 0.06340841203927994, -0.07398822903633118, -0.025640973821282387, -0.06908046454191208, 0.002853845711797476, -0.0050367750227451324, -0.016605308279395103, 0.08159272372722626, -0.027205921709537506, 0.0919523760676384, 0.01960316114127636, 0.037610236555337906, 0.10549888014793396, 0.06385236978530884, -0.10299379378557205, 0.0389900878071785, -0.09609779715538025, 0.15069575607776642, -0.027158433571457863, -0.0011905021965503693, -0.17459119856357574, -0.07782123982906342, -0.048537902534008026, 
0.04532024264335632, 0.06811065226793289, 0.13151732087135315, -0.15602074563503265, -0.0036020581610500813, 0.17825151979923248, -0.08523357659578323, -0.051663123071193695, 0.09015083312988281, -0.04775098338723183, 0.13782700896263123, 0.06193624064326286, 0.15339598059654236, 0.0702313557267189, -0.061979103833436966, 0.012327098287642002, -0.051329389214515686, 0.011206328868865967, 0.034661248326301575, 0.11423824727535248, -0.06854625791311264, -0.08730249851942062, -0.03232097625732422, -0.11760120838880539, -0.008424974046647549, -0.05145016685128212, -0.0665617436170578, 0.02495901845395565, -0.025711292400956154, -0.030233876779675484, 0.04708144813776016, 0.022524503991007805, -0.026892896741628647, -0.11385829001665115, 0.013653185218572617, 0.08303195238113403, -0.051527418196201324, 0.005014881957322359, -0.08543223887681961, -0.007650045212358236, -0.09592921286821365, -0.034098416566848755, -0.18419720232486725, 0.05902392789721489, 0.03254040330648422, -0.026973074302077293, 0.07629929482936859, 0.043781574815511703, 0.019881660118699074, 0.04319391027092934, -0.00451342947781086, -0.05186763033270836, -0.059851065278053284, -0.016058485954999924, -0.10916926711797714, -0.14733131229877472, -0.028536008670926094, -0.022223705425858498, 0.09694411605596542, -0.19387555122375488, 0.027691392228007317, -0.029360949993133545, 0.03536353260278702, -0.019950831308960915, -0.0383618026971817, 0.02329360693693161, 0.0396631583571434, 0.0016681419219821692, -0.06649104505777359, 0.057246673852205276, 0.02941667102277279, 0.00044745818013325334, 0.04138348624110222, -0.12270313501358032, -0.14054392278194427, 0.07309970259666443, 0.042217087000608444, -0.1555580049753189, -0.03335962072014809, -0.049809664487838745, -0.041080448776483536, -0.059049688279628754, 0.010818449780344963, 0.27072790265083313, -0.004347738344222307, 0.1433744728565216, -0.12541863322257996, -0.059349410235881805, -0.006763930898159742, -0.0392252579331398, -0.012111184187233448, 0.1541016697883606, 0.08076398074626923, -0.129587322473526, 0.06359327584505081, 0.02230135165154934, -0.026505177840590477, 0.14057454466819763, 0.01050362829118967, -0.10931379348039627, -0.013056031428277493, 0.08110272884368896, 0.02160326950252056, 0.08021356165409088, -0.15479321777820587, 0.020268699154257774, 0.019576601684093475, 0.035293880850076675, 0.06361463665962219, -0.1536981463432312, 0.020036479458212852, 0.058251675218343735, -0.03229474276304245, 0.0016494422452524304, -0.027838023379445076, -0.03575776889920235, 0.08600717782974243, 0.03798221796751022, -0.014793380163609982, -0.007620379328727722, -0.04288414120674133, -0.15352118015289307, 0.21425224840641022, -0.06578263640403748, -0.16468273103237152, -0.1421579122543335, 0.07459820061922073, 0.051353707909584045, 0.03362930193543434, 0.04040961712598801, -0.08705469220876694, -0.03158775717020035, -0.08218459039926529, 0.1079205572605133, -0.032534778118133545, -0.03433504328131676, -0.07745280116796494, 0.014597486704587936, 0.01989477314054966, -0.12111198157072067, 0.02761150896549225, 0.003468722803518176, -0.08544118702411652, -0.005205820314586163, -0.05755433067679405, 0.09031032770872116, 0.19118055701255798, -0.022333618253469467, 0.017170721665024757, 0.014366758987307549, 0.1861087530851364, -0.12224186211824417, 0.003861526492983103, 0.09970344603061676, -0.00796896405518055, 0.0009949887171387672, 0.09096208214759827, 0.004949225578457117, -0.08092734962701797, 0.04418833926320076, 0.04269622266292572, -0.033781684935092926, 
-0.3061741888523102, -0.041497234255075455, -0.011753300204873085, -0.053923387080430984, 0.1319260448217392, 0.05105067417025566, 0.03284279257059097, 0.07554526627063751, -0.022751368582248688, -0.014481811784207821, 0.0035686511546373367, 0.07313525676727295, -0.018847204744815826, 0.010631543584167957, 0.04213264212012291, -0.05225086957216263, -0.03992136940360069, 0.061496805399656296, 0.03255185857415199, 0.20104525983333588, -0.027702968567609787, 0.11335138231515884, 0.11912689357995987, 0.08042171597480774, 0.00883915089070797, 0.054174456745386124, -0.040519144386053085, 0.02623085491359234, -0.009465168230235577, -0.0447080098092556, -0.053093474358320236, 0.03649541735649109, -0.011903690174221992, 0.042397499084472656, -0.12099810689687729, -0.05493296682834625, 0.025741470977663994, 0.2756165862083435, 0.04364919662475586, -0.25239649415016174, -0.06487619131803513, 0.008824758231639862, -0.055140938609838486, -0.10353109985589981, 0.030257638543844223, 0.07516548782587051, -0.1359146535396576, -0.0023402301594614983, -0.04315154626965523, 0.10780605673789978, -0.103081114590168, -0.03195220232009888, 0.07167711853981018, 0.038959801197052, -0.014261918142437935, 0.12435450404882431, -0.25163164734840393, 0.1831594705581665, -0.006271451245993376, 0.08847200125455856, -0.039000503718853, 0.017221499234437943, -0.06323490291833878, 0.014764160849153996, 0.14561514556407928, 0.0016481302445754409, 0.0663904920220375, -0.09435853362083435, -0.08789040893316269, 0.002348743611946702, 0.03692071512341499, -0.10992308706045151, 0.11021535098552704, 0.012704797089099884, 0.02601076103746891, -0.007426342461258173, -0.09013707935810089, -0.08869275450706482, -0.12246392667293549, 0.004893288481980562, -0.09861339628696442, 0.07109910994768143, -0.05053343251347542, -0.047779347747564316, -0.009565125219523907, 0.16319148242473602, -0.1714588701725006, -0.06441967934370041, -0.10288054496049881, 0.029988236725330353, 0.09828849136829376, -0.03516583889722824, -0.010911650955677032, 0.008514129556715488, 0.03495525196194649, 0.008717058226466179, 0.011688599362969398, 0.10158150643110275, -0.07909639924764633, -0.15394094586372375, -0.055097728967666626, 0.13364194333553314, 0.1242687925696373, 0.06115097552537918, -0.002043169690296054, 0.008395353332161903, -0.01622334122657776, -0.09834691882133484, -0.009575343690812588, 0.007008164655417204, 0.07744449377059937, 0.04741391912102699, -0.05549906939268112, -0.010892069898545742, -0.10076256096363068, -0.0568394772708416, 0.08393893390893936, 0.10281968116760254, -0.04058026522397995, 0.06411388516426086, 0.12648744881153107, -0.13850639760494232, -0.13890564441680908, 0.03563305735588074, 0.118111252784729, 0.05628965049982071, -0.03155643865466118, -0.188454270362854, 0.03296205401420593, 0.09686389565467834, 0.010361013002693653, -0.03547768294811249, -0.40333667397499084, -0.12015590071678162, 0.08141200244426727, 0.08710410445928574, -0.06148114800453186, -0.11091272532939911, -0.053064338862895966, 0.013932153582572937, -0.01415987778455019, 0.11804315447807312, -0.029729997739195824, 0.08066494762897491, 0.015565718524158001, -0.04617597535252571, 0.05066770687699318, -0.04623730108141899, 0.16319435834884644, 0.05429694056510925, 0.05815891921520233, -0.05096546933054924, -0.012155636213719845, 0.020613297820091248, -0.02152634598314762, 0.11851838231086731, 0.051492732018232346, 0.036083824932575226, -0.23836492002010345, -0.07304852455854416, -0.07529845833778381, 0.005326098296791315, -0.07836180925369263, 
-0.04230678454041481, -0.020529920235276222, 0.07466036081314087, 0.05635831505060196, -0.01362670212984085, 0.014013232663273811, -0.08146364241838455, 0.03902176767587662, 0.13158848881721497, 0.15072408318519592, 0.12212744355201721, -0.10238077491521835, 0.022893665358424187, 0.04938952997326851, 0.07953675091266632, -0.15031059086322784, -0.004319082945585251, 0.12367329001426697, -0.009295349009335041, 0.15121778845787048, 0.004900235682725906, -0.15155018866062164, 0.0035998839884996414, 0.07778101414442062, -0.08460593968629837, -0.09813036769628525, -0.022308940067887306, 0.022495465353131294, -0.07772712409496307, -0.052910029888153076, 0.07098934054374695, -0.0834907740354538, -0.037196945399045944, 0.0034422820899635553, 0.02410365827381611, -0.0646679550409317, 0.1936502754688263, 0.0718294084072113, 0.06476181745529175, -0.05445844680070877, 0.1054643839597702, 0.129360169172287, -0.1278362274169922, 0.013845285400748253, 0.15139418840408325, -0.06765284389257431, -0.051551301032304764, 0.03887246176600456, 0.1421964317560196, -0.08524544537067413, -0.08859604597091675, -0.06710639595985413, -0.05977647751569748, 0.06001720577478409, 0.04352330043911934, 0.047077570110559464, 0.03606119379401207, -0.011769718490540981, -0.0033019124530255795, -0.1231098398566246, 0.06526598334312439, 0.06934038549661636, 0.011131227016448975, -0.05016578361392021, 0.16203778982162476, 0.02850000187754631, 0.03240741789340973, -0.005562265403568745, -0.027465015649795532, -0.08206577599048615, 0.0531667061150074, -0.041638024151325226, 0.02349238656461239, -0.040482841432094574, -0.004004294518381357, -0.010044972412288189, -0.008290812373161316, 0.01126872468739748, 0.012677287682890892, -0.05731035768985748, -0.02057463489472866, -0.032226528972387314, 0.06640342622995377, -0.06387104839086533, 0.0072985487058758736, 0.00433416897431016, -0.0583406426012516, 0.07332459837198257, -0.013705311343073845, -0.008326343260705471, -0.010016420856118202, -0.08520113676786423, 0.10742418467998505, -0.0006923371111042798, 0.0020848598796874285, -0.012273915112018585, -0.13513001799583435, 0.05206592008471489, -0.0137392682954669, -0.04206933081150055, 0.0052156271412968636, 0.05455834046006203, -0.13512691855430603, 0.04772351309657097, -0.010328894481062889, -0.024570008739829063, -0.07057827711105347, 0.11681912839412689, 0.033735085278749466, 0.04962856322526932, 0.10865440964698792, -0.06921515613794327, 0.07925917953252792, -0.15083248913288116, -0.03837577998638153, 0.012707415968179703, 0.054324109107255936, -0.053036708384752274, -0.04478045552968979, 0.05995750427246094, -0.037973552942276, 0.08012479543685913, 0.0819988027215004, 0.07883602380752563, 0.02932044304907322, -0.07035870105028152, 0.018886102363467216, 0.044296134263277054, 0.06662142276763916, -0.02915816195309162, 0.0179152749478817, -0.006939389742910862, 0.04558101296424866, -0.027292320504784584, 0.043868571519851685, 0.1033274382352829, 0.20956043899059296, 0.07034037262201309, 0.05048418790102005, 0.00577498646453023, -0.08655425161123276, -0.07021516561508179, 0.12574152648448944, 0.030331000685691833, 0.024178260937333107, -0.018704354763031006, 0.07340920716524124, 0.09845570474863052, -0.17510929703712463, 0.08615141361951828, -0.01167068537324667, -0.08592940866947174, -0.07587547600269318, -0.13290764391422272, -0.02454966865479946, -0.07230492681264877, -0.017887480556964874, -0.10304781794548035, 0.028061866760253906, 0.047825731337070465, 0.04243357852101326, -0.044745005667209625, 0.12943652272224426, 
-0.01278239581733942, -0.08349351584911346, 0.08443185687065125, 0.007420727983117104, 0.11855888366699219, -0.07730843871831894, 0.03602251410484314, 0.03796788305044174, 0.002912627998739481, 0.0533798523247242, 0.02922671101987362, 0.017641901969909668, 0.007146167568862438, 0.009710247628390789, -0.0584835484623909, 0.023915830999612808, 0.023795651271939278, 0.10319016873836517, 0.12947045266628265, 0.04621417820453644, -0.03672165423631668, -0.011632475070655346, 0.23204584419727325, -0.03965822607278824, -0.076141357421875, -0.15027475357055664, 0.19055354595184326, 0.055908042937517166, 0.02742060087621212, 0.03400757908821106, -0.12650908529758453, 0.026317231357097626, 0.18052524328231812, 0.11440185457468033, -0.027275539934635162, -0.026986796408891678, 0.02278037555515766, 0.006705510430037975, 0.049396514892578125, 0.08548687398433685, 0.04760914668440819, 0.19852901995182037, -0.10554023832082748, 0.044583238661289215, -0.06826277077198029, -0.010829390957951546, 0.012744133360683918, 0.15510094165802002, 0.0143292136490345, -0.02160264551639557, -0.042315538972616196, 0.12976303696632385, -0.025432508438825607, -0.21803195774555206, 0.08043793588876724, -0.08611326664686203, -0.14646051824092865, -0.035809509456157684, 0.011669659055769444, 0.0021613629069179296, 0.057670362293720245, 0.001353444647975266, -0.03527514636516571, 0.09124359488487244, 0.04703937843441963, -0.07233090698719025, -0.12490461021661758, 0.07026559859514236, -0.024842919781804085, 0.16486121714115143, -0.0016183607513085008, 0.1030871644616127, 0.08943799138069153, -0.0009967811638489366, -0.1082669198513031, 0.0789203941822052, 0.04054715111851692, 0.033273495733737946, 0.08504221588373184, 0.09074274450540543, -0.0013422046322375536, 0.07511528581380844, 0.04153871536254883, -0.12191441655158997, 0.039219677448272705, -0.1073911041021347, -0.034646350890398026, -0.16828244924545288, 0.05423286184668541, -0.06346682459115982, 0.14216312766075134, 0.2130725085735321, -0.05195903778076172, 0.012861537747085094, -0.058698661625385284, 0.06219608709216118, -0.005925433244556189, 0.1642231047153473, 0.0178438238799572, -0.1820741444826126, 0.00827241875231266, -0.06600698083639145, 0.03973647207021713, -0.24682989716529846, -0.02184947207570076, 0.028157435357570648, -0.08464747667312622, -0.03348923474550247, 0.10947038233280182, 0.03511113300919533, 0.08184697479009628, -0.044031981378793716, -0.02157152257859707, -0.02416077069938183, 0.15071813762187958, -0.11416147649288177, -0.07802094519138336 ]
null
null
transformers
# legal_t5_small_cls_fr model Model for classification of legal text written in French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was trained on the JRC-Acquis parallel corpus. ## Model description legal_t5_small_cls_fr is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in French. ### How to use Here is how to use this model to classify legal text written in French in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_fr"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_cls_fr", do_lower_case=False, skip_special_tokens=True), device=0 ) fr_text = "Règlement (CE) no 264/2005 de la Commission du 16 février 2005 fixant les restitutions à l'exportation dans le secteur de la viande de volaille applicables à partir du 17 février 2005 LA COMMISSION DES COMMUNAUTÉS EUROPÉENNES, vu le traité instituant la Communauté européenne, vu le règlement (CEE) no 2777/75 du Conseil du 29 octobre 1975 portant organisation commune des marchés dans le secteur de la viande de volaille [1], et notamment son article 8, paragraphe 3, troisième alinéa, considérant ce qui suit: (1) Aux termes de l'article 8 du règlement (CEE) no 2777/75, la différence entre les prix des produits visés à l'article 1er, paragraphe 1, dudit règlement, sur le marché mondial et dans la Communauté, peut être couverte par une restitution à l'exportation. (2) L'application de ces règles et critères à la situation actuelle des marchés dans le secteur de la viande de volaille conduit à fixer la restitution à un montant qui permette la participation de la Communauté au commerce international et tienne compte également du caractère des exportations de ces produits ainsi que de leur importance à l'heure actuelle. (3) L'article 21 du règlement (CE) no 800/1999 de la Commission du 15 avril 1999 portant modalités communes d'application du régime des restitutions à l'exportation pour les produits agricoles [2] prévoit qu'aucune restitution n'est octroyée lorsque les produits ne sont pas de qualité saine, loyale et marchande le jour d'acceptation de la déclaration d'exportation. Afin d'assurer une application uniforme de la réglementation en vigueur, il y a lieu de préciser que, pour bénéficier d'une restitution, les viandes de volailles figurant à l'article 1er du règlement (CEE) no 2777/75 doivent porter la marque de salubrité comme prévu à la directive 71/118/CEE du Conseil du 15 février 1971 relative à des problèmes sanitaires en matière de production et de mise sur le marché de viandes fraîches de volaille [3]. (4) Le comité de gestion de la viande de volaille et des œufs n'a pas émis d'avis dans le délai imparti par son président, A ARRÊTÉ LE PRÉSENT RÈGLEMENT: Article premier Les codes des produits pour l'exportation desquels est accordée la restitution visée à l'article 8 du règlement (CEE) no 2777/75 et les montants de cette restitution sont fixés à l'annexe du présent règlement. 
Toutefois, afin de pouvoir bénéficier de la restitution, les produits entrant dans le champ d'application du chapitre XII de l'annexe de la directive 71/118/CEE doivent également satisfaire aux conditions de marquage de salubrité prévues par cette directive. Article 2 Le présent règlement entre en vigueur le 17 février 2005. Le présent règlement est obligatoire dans tous ses éléments et directement applicable dans tout État membre. Fait à Bruxelles, le 16 février 2005. Par la Commission Mariann Fischer Boel Membre de la Commission [1] JO L 282 du 1.11.1975, p. 77. Règlement modifié en dernier lieu par le règlement (CE) no 806/2003 (JO L 122 du 16.5.2003, p. 1). [2] JO L 102 du 17.4.1999, p. 11. Règlement modifié en dernier lieu par le règlement (CE) no 671/2004 (JO L 105 du 14.4.2004, p. 5). [3] JO L 55 du 8.3.1971, p. 23. Directive modifiée en dernier lieu par le règlement (CE) no 807/2003 (JO L 122 du 16.5.2003, p. 36). -------------------------------------------------- ANNEXE Code des produits | Destination | Unité de mesure | Montant des restitutions | 0105 11 11 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 19 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 91 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 99 9000 | A02 | EUR/100 pcs | 0,80 | 0105 12 00 9000 | A02 | EUR/100 pcs | 1,70 | 0105 19 20 9000 | A02 | EUR/100 pcs | 1,70 | 0207 12 10 9900 | V01 | EUR/100 kg | 41,00 | 0207 12 10 9900 | A24 | EUR/100 kg | 41,00 | 0207 12 90 9190 | V01 | EUR/100 kg | 41,00 | 0207 12 90 9190 | A24 | EUR/100 kg | 41,00 | 0207 12 90 9990 | V01 | EUR/100 kg | 41,00 | 0207 12 90 9990 | A24 | EUR/100 kg | 41,00 | --------------------------------------------------" pipeline([fr_text], max_length=512) ``` ## Training data The legal_t5_small_cls_fr model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset consisting of 22 Thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used was AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model. ### Pretraining ## Evaluation results When the model is used on the classification test dataset, it achieves the following results; a minimal sketch of computing such a score appears after the citation info below. Test results : | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_fr | 0.6159| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
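To make the evaluation setup concrete, here is a minimal sketch of how an F1 score like the one reported above could be computed from pipeline outputs with scikit-learn. The label strings and test examples are hypothetical, and the averaging mode is an assumption, since the card does not publish the test set or state how the score was aggregated.

```python
# Minimal evaluation sketch for an F1 score like the one reported above.
# Assumptions: label names, test examples, and averaging mode are hypothetical.
from sklearn.metrics import f1_score

# Hypothetical gold labels and pipeline predictions for a few test documents.
# With the TranslationPipeline above, each prediction would be obtained as:
#   pred = pipeline([doc], max_length=512)[0]["translation_text"]
y_true = ["agriculture", "fisheries", "agriculture", "taxation"]
y_pred = ["agriculture", "agriculture", "agriculture", "taxation"]

score = f1_score(y_true, y_pred, average="weighted")  # averaging is an assumption
print(f"F1 score: {score:.4f}")
```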
{"language": "French", "tags": ["classification French model"], "datasets": ["jrc-acquis"], "widget": [{"text": "R\u00e8glement (CE) no 264/2005 de la Commission du 16 f\u00e9vrier 2005 fixant les restitutions \u00e0 l'exportation dans le secteur de la viande de volaille applicables \u00e0 partir du 17 f\u00e9vrier 2005 LA COMMISSION DES COMMUNAUT\u00c9S EUROP\u00c9ENNES, vu le trait\u00e9 instituant la Communaut\u00e9 europ\u00e9enne, vu le r\u00e8glement (CEE) no 2777/75 du Conseil du 29 octobre 1975 portant organisation commune des march\u00e9s dans le secteur de la viande de volaille [1], et notamment son article 8, paragraphe 3, troisi\u00e8me alin\u00e9a, consid\u00e9rant ce qui suit: (1) Aux termes de l'article 8 du r\u00e8glement (CEE) no 2777/75, la diff\u00e9rence entre les prix des produits vis\u00e9s \u00e0 l'article 1er, paragraphe 1, dudit r\u00e8glement, sur le march\u00e9 mondial et dans la Communaut\u00e9, peut \u00eatre couverte par une restitution \u00e0 l'exportation. (2) L'application de ces r\u00e8gles et crit\u00e8res \u00e0 la situation actuelle des march\u00e9s dans le secteur de la viande de volaille conduit \u00e0 fixer la restitution \u00e0 un montant qui permette la participation de la Communaut\u00e9 au commerce international et tienne compte \u00e9galement du caract\u00e8re des exportations de ces produits ainsi que de leur importance \u00e0 l'heure actuelle. (3) L'article 21 du r\u00e8glement (CE) no 800/1999 de la Commission du 15 avril 1999 portant modalit\u00e9s communes d'application du r\u00e9gime des restitutions \u00e0 l'exportation pour les produits agricoles [2] pr\u00e9voit qu'aucune restitution n'est octroy\u00e9e lorsque les produits ne sont pas de qualit\u00e9 saine, loyale et marchande le jour d'acceptation de la d\u00e9claration d'exportation. Afin d'assurer une application uniforme de la r\u00e9glementation en vigueur, il y a lieu de pr\u00e9ciser que, pour b\u00e9n\u00e9ficier d'une restitution, les viandes de volailles figurant \u00e0 l'article 1er du r\u00e8glement (CEE) no 2777/75 doivent porter la marque de salubrit\u00e9 comme pr\u00e9vu \u00e0 la directive 71/118/CEE du Conseil du 15 f\u00e9vrier 1971 relative \u00e0 des probl\u00e8mes sanitaires en mati\u00e8re de production et de mise sur le march\u00e9 de viandes fra\u00eeches de volaille [3]. (4) Le comit\u00e9 de gestion de la viande de volaille et des \u0153ufs n'a pas \u00e9mis d'avis dans le d\u00e9lai imparti par son pr\u00e9sident, A ARR\u00caT\u00c9 LE PR\u00c9SENT R\u00c8GLEMENT: Article premier Les codes des produits pour l'exportation desquels est accord\u00e9e la restitution vis\u00e9e \u00e0 l'article 8 du r\u00e8glement (CEE) no 2777/75 et les montants de cette restitution sont fix\u00e9s \u00e0 l'annexe du pr\u00e9sent r\u00e8glement. Toutefois, afin de pouvoir b\u00e9n\u00e9ficier de la restitution, les produits entrant dans le champ d'application du chapitre XII de l'annexe de la directive 71/118/CEE doivent \u00e9galement satisfaire aux conditions de marquage de salubrit\u00e9 pr\u00e9vues par cette directive. Article 2 Le pr\u00e9sent r\u00e8glement entre en vigueur le 17 f\u00e9vrier 2005. Le pr\u00e9sent r\u00e8glement est obligatoire dans tous ses \u00e9l\u00e9ments et directement applicable dans tout \u00c9tat membre. Fait \u00e0 Bruxelles, le 16 f\u00e9vrier 2005. Par la Commission Mariann Fischer Boel Membre de la Commission [1] JO L 282 du 1.11.1975, p. 77. 
R\u00e8glement modifi\u00e9 en dernier lieu par le r\u00e8glement (CE) no 806/2003 (JO L 122 du 16.5.2003, p. 1). [2] JO L 102 du 17.4.1999, p. 11. R\u00e8glement modifi\u00e9 en dernier lieu par le r\u00e8glement (CE) no 671/2004 (JO L 105 du 14.4.2004, p. 5). [3] JO L 55 du 8.3.1971, p. 23. Directive modifi\u00e9e en dernier lieu par le r\u00e8glement (CE) no 807/2003 (JO L 122 du 16.5.2003, p. 36). -------------------------------------------------- ANNEXE Code des produits | Destination | Unit\u00e9 de mesure | Montant des restitutions | 0105 11 11 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 19 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 91 9000 | A02 | EUR/100 pcs | 0,80 | 0105 11 99 9000 | A02 | EUR/100 pcs | 0,80 | 0105 12 00 9000 | A02 | EUR/100 pcs | 1,70 | 0105 19 20 9000 | A02 | EUR/100 pcs | 1,70 | 0207 12 10 9900 | V01 | EUR/100 kg | 41,00 | 0207 12 10 9900 | A24 | EUR/100 kg | 41,00 | 0207 12 90 9190 | V01 | EUR/100 kg | 41,00 | 0207 12 90 9190 | A24 | EUR/100 kg | 41,00 | 0207 12 90 9990 | V01 | EUR/100 kg | 41,00 | 0207 12 90 9990 | A24 | EUR/100 kg | 41,00 | --------------------------------------------------"}]}
text2text-generation
SEBIS/legal_t5_small_cls_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification French model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_fr model =============================== Model for classification of legal text written in French. It was first released in this repository. This model is trained on three parallel corpus from jrc-acquis. Model description ----------------- legal\_t5\_small\_cls\_fr is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in French. ### How to use Here is how to use this model to classify legal text written in French in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_fr model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used for classification test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_fr model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_fr model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 65, 152, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_fr model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0796804428100586, 0.12775062024593353, -0.0033769940491765738, 0.08560594171285629, 0.08180706948041916, -0.003949234262108803, 0.10281503200531006, 0.09410500526428223, -0.04924510046839714, 0.05889516323804855, 0.05750567093491554, 0.00988252367824316, 0.06109664589166641, 0.08540933579206467, 0.05515193194150925, -0.21120168268680573, 0.015030432492494583, -0.033574528992176056, -0.02008536271750927, 0.15133051574230194, 0.1334683895111084, -0.0691789761185646, 0.05380210652947426, 0.008739863522350788, -0.10804526507854462, 0.018196595832705498, -0.06384553015232086, -0.0523945689201355, 0.07463489472866058, 0.03722982108592987, 0.11186252534389496, 0.015409337356686592, 0.07610367983579636, -0.2019752562046051, 0.004048564936965704, 0.07581885159015656, 0.0046022371388971806, 0.05335237458348274, 0.10244392603635788, -0.05453833192586899, 0.16110287606716156, -0.023502621799707413, 0.06993292272090912, 0.04221964254975319, -0.11838076263666153, -0.11666414886713028, -0.0619223490357399, 0.09952205419540405, 0.11305666714906693, 0.13964520394802094, -0.04415074363350868, 0.08473727107048035, -0.11857392638921738, 0.054513171315193176, 0.07114829868078232, -0.23073340952396393, -0.0623982772231102, 0.030052464455366135, 0.05683394521474838, 0.06739981472492218, -0.08453574031591415, -0.056254301220178604, 0.03372636437416077, 0.022339871153235435, 0.0033735574688762426, -0.026654651388525963, 0.023780828341841698, 0.0015609622932970524, -0.1677631288766861, -0.098590187728405, 0.1666790395975113, 0.020458370447158813, -0.08725839108228683, -0.07534748315811157, -0.007778096478432417, -0.14578869938850403, 0.011620094999670982, -0.05250242352485657, 0.04593229293823242, -0.008419566787779331, 0.056803006678819656, -0.009786136448383331, -0.11255045980215073, -0.09000450372695923, 0.016179168596863747, 0.07110168039798737, 0.06573482602834702, -0.014952956698834896, 0.01648036204278469, 0.14667844772338867, -0.04650958999991417, -0.05976776406168938, -0.02412823960185051, -0.012813769280910492, -0.11687222123146057, -0.051930174231529236, -0.025033654645085335, -0.07534932345151901, -0.016555646434426308, 0.13166074454784393, -0.05788682401180267, 0.049711208790540695, 0.021918345242738724, 0.04550963640213013, 0.01920882984995842, 0.14795786142349243, -0.0623813234269619, -0.007020239252597094, -0.06454206258058548, 0.05931006371974945, -0.09550334513187408, 0.028559748083353043, -0.020372020080685616, -0.00289800763130188, 0.049234021455049515, 0.05825655534863472, -0.060542162507772446, -0.011612696573138237, -0.05623563006520271, -0.03646167740225792, 0.044176578521728516, -0.12178340554237366, -0.0035653235390782356, 0.017635012045502663, -0.09337674826383591, -0.05934403836727142, 0.08307989686727524, 0.0025743271689862013, -0.11976729333400726, 0.06181846186518669, -0.034415896981954575, -0.00698980363085866, -0.12432119250297546, -0.08612693101167679, -0.018147584050893784, -0.034238334745168686, -0.05893971025943756, -0.06152220070362091, -0.11761120706796646, -0.1050877720117569, 0.08587910979986191, -0.058286070823669434, -0.030812544748187065, -0.0689307227730751, -0.024686548858880997, 0.007282631937414408, 0.014269990846514702, 0.07206606864929199, -0.03485017269849777, 0.06016020104289055, -0.022716738283634186, 0.02926621027290821, 0.09374022483825684, 0.06367280334234238, -0.09075318276882172, 0.03784288838505745, -0.09394488483667374, 0.14222674071788788, -0.03642609715461731, -0.041410062462091446, -0.17475591599941254, -0.10059679299592972, -0.0622846782207489, 
0.025325918570160866, 0.07788471132516861, 0.12991845607757568, -0.16125333309173584, -0.0074253324419260025, 0.1873612254858017, -0.09850607812404633, -0.050701119005680084, 0.14510659873485565, -0.05587461590766907, 0.09753138571977615, 0.044578973203897476, 0.14368760585784912, 0.0906592309474945, -0.06821566820144653, -0.0036399071104824543, -0.0183512382209301, 0.00902550294995308, 0.05100909620523453, 0.12357068061828613, -0.0540713295340538, -0.10624177753925323, -0.0510459765791893, -0.0932808369398117, -0.01066535897552967, -0.0654693990945816, -0.07051302492618561, 0.010208484716713428, -0.03253814950585365, 0.005038145929574966, 0.047473762184381485, 0.010828856378793716, -0.03424888476729393, -0.12952753901481628, -0.01991920731961727, 0.0984053984284401, -0.060361504554748535, 0.0043109264224767685, -0.06988955289125443, -0.020250290632247925, -0.045697182416915894, -0.0335836187005043, -0.17193184792995453, 0.04771581292152405, 0.03631403297185898, -0.06001821905374527, 0.07785714417695999, 0.04965081438422203, 0.03338080272078514, 0.06528648734092712, -0.017004016786813736, -0.051507238298654556, -0.07805702090263367, -0.02157476730644703, -0.10047003626823425, -0.15607762336730957, -0.018051298335194588, -0.021753130480647087, 0.1277441829442978, -0.19706964492797852, 0.030757363885641098, -0.027880271896719933, 0.049490153789520264, -0.013814669102430344, -0.038840729743242264, -0.0016048428369686007, 0.016332946717739105, 0.0059864716604352, -0.048792511224746704, 0.04509131237864494, 0.011644013226032257, 0.023641230538487434, 0.04888494312763214, -0.08084297180175781, -0.12174195051193237, 0.08024303615093231, 0.04590306058526039, -0.17040744423866272, -0.02333679609000683, -0.07162672281265259, -0.04296223446726799, -0.04584691673517227, 0.01723524183034897, 0.2545759379863739, 0.00645905826240778, 0.1494826078414917, -0.10372526198625565, -0.07345187664031982, -0.0021484773606061935, -0.024646611884236336, -0.007004005368798971, 0.15404416620731354, 0.08900616317987442, -0.10516736656427383, 0.058024510741233826, 0.05038067325949669, -0.03340533748269081, 0.1289941668510437, 0.0030554260592907667, -0.10435090214014053, -0.016819482669234276, 0.08833882957696915, 0.004430432338267565, 0.08402791619300842, -0.15129028260707855, 0.011989476159214973, 0.01774693839251995, 0.025652023032307625, 0.04912523925304413, -0.1557655781507492, 0.03264477103948593, 0.04630960896611214, -0.043791450560092926, 0.012130226008594036, -0.018929917365312576, -0.03930800408124924, 0.09823879599571228, 0.034997038543224335, -0.05087195709347725, -0.014201950281858444, -0.04743224009871483, -0.14660906791687012, 0.22680138051509857, -0.06706541031599045, -0.14385627210140228, -0.11397554725408554, 0.07968354225158691, 0.010098223574459553, 0.007661561015993357, 0.0308687724173069, -0.08004020154476166, -0.042163409292697906, -0.121525339782238, 0.06009046733379364, -0.05367545038461685, -0.013786936178803444, -0.08446550369262695, 0.008126932196319103, 0.02177695743739605, -0.13440914452075958, 0.019272781908512115, -0.00870415661484003, -0.08519015461206436, -0.016325637698173523, -0.041030608117580414, 0.06955970078706741, 0.17429816722869873, -0.05484568327665329, 0.018577804788947105, 0.027416935190558434, 0.21156178414821625, -0.11679723113775253, 0.013132725842297077, 0.07052944600582123, 0.01012410782277584, 0.022978240624070168, 0.09531664848327637, 0.013314024545252323, -0.06261803954839706, 0.05963864549994469, 0.0677054300904274, -0.02144394814968109, -0.28218910098075867, 
-0.05641745403409004, -0.015912679955363274, -0.03097635507583618, 0.1269274801015854, 0.05726294219493866, 0.04196885600686073, 0.04392581433057785, -0.030978193506598473, -0.010607557371258736, 0.028705792501568794, 0.06723228842020035, -0.016740983352065086, 0.016212033107876778, 0.05556139722466469, -0.05483861267566681, -0.04385754093527794, 0.0637354925274849, 0.0030703365337103605, 0.19461971521377563, -0.03485511988401413, 0.09841016680002213, 0.11859798431396484, 0.08292219787836075, 0.010938335210084915, 0.058458179235458374, -0.03679134324193001, 0.024793492630124092, 0.0007244364824146032, -0.044258203357458115, -0.0256951916962862, 0.04539031162858009, -0.01183127611875534, 0.027567030861973763, -0.11685393005609512, -0.03528473898768425, 0.03839151933789253, 0.28259575366973877, 0.05076795443892479, -0.24397718906402588, -0.06577038764953613, -0.012174943462014198, -0.07144369930028915, -0.10476104170084, 0.05020572990179062, 0.08251774311065674, -0.14310747385025024, -0.018652599304914474, -0.03524899482727051, 0.0989605113863945, -0.07668241113424301, -0.029659191146492958, 0.10499370843172073, 0.04724493995308876, -0.02384863793849945, 0.10817812383174896, -0.24692392349243164, 0.17214557528495789, -0.016074594110250473, 0.06886488199234009, -0.03217310085892677, 0.02086900733411312, -0.03546718880534172, 0.05716777592897415, 0.1661277562379837, 0.01877211034297943, 0.06441036611795425, -0.06742895394563675, -0.08031400293111801, -0.0045279632322490215, 0.061147380620241165, -0.09967408329248428, 0.08565021306276321, 0.0025236320216208696, 0.0020928848534822464, -0.0052048866637051105, -0.08826398849487305, -0.12406156212091446, -0.11989138275384903, 0.016398204490542412, -0.08340699970722198, 0.0540950745344162, -0.050600145012140274, -0.05141976848244667, -0.018188772723078728, 0.19272202253341675, -0.1580544263124466, -0.0382111519575119, -0.09090989828109741, 0.010793397203087807, 0.09925694018602371, -0.03462301567196846, 0.0143445935100317, -0.007913835346698761, 0.05321744456887245, -0.007126250769942999, 0.008535264991223812, 0.07848106324672699, -0.09029276669025421, -0.13504306972026825, -0.04796602204442024, 0.14175738394260406, 0.12932924926280975, 0.06331653147935867, -0.0061365882866084576, 0.012508582323789597, -0.03865273296833038, -0.11765725910663605, -0.014714209362864494, -0.016271086409687996, 0.06562668085098267, 0.05403878167271614, -0.05550526827573776, -0.015988286584615707, -0.09707438200712204, -0.025440704077482224, 0.09959021210670471, 0.11339662969112396, -0.058880921453237534, 0.07151862978935242, 0.1265450268983841, -0.11173912137746811, -0.17126989364624023, 0.04400114715099335, 0.11958152055740356, 0.03341355919837952, -0.05706319957971573, -0.1821231245994568, 0.059245191514492035, 0.06977012008428574, 0.010619725100696087, -0.03016895242035389, -0.3745914399623871, -0.13003009557724, 0.08453915268182755, 0.07186087965965271, -0.05281733348965645, -0.0904434397816658, -0.015422534197568893, 0.021115237846970558, -0.01876821555197239, 0.1279144585132599, -0.04274776950478554, 0.0693688690662384, 0.013411024585366249, -0.0680968388915062, 0.0357745923101902, -0.04949368163943291, 0.14391198754310608, 0.09156852215528488, 0.07074211537837982, -0.036157578229904175, 0.0023858947679400444, 0.02386482059955597, -0.01825922727584839, 0.13737541437149048, 0.047799598425626755, 0.04830062389373779, -0.23117005825042725, -0.055150728672742844, -0.07765927165746689, 0.016617389395833015, -0.079933762550354, -0.04372690990567207, 
-0.03490974381566048, 0.09666140377521515, 0.06575781106948853, -0.003291056491434574, -0.03394746035337448, -0.05938999354839325, 0.016772694885730743, 0.14151827991008759, 0.13619370758533478, 0.12026748061180115, -0.10786497592926025, 0.0589405782520771, 0.046838581562042236, 0.0717993676662445, -0.16011086106300354, 0.011069178581237793, 0.12413980066776276, -0.007225345354527235, 0.16786859929561615, 0.01036209985613823, -0.13925780355930328, -0.03527147322893143, 0.05789089947938919, -0.11968773603439331, -0.11088543385267258, -0.032943833619356155, 0.010559228248894215, -0.07998084276914597, -0.05763472244143486, 0.05879511311650276, -0.10717303305864334, -0.019324569031596184, -0.0005444202106446028, 0.02928505837917328, -0.0865011215209961, 0.16824954748153687, 0.05773048475384712, 0.07407812029123306, -0.056616879999637604, 0.08525627851486206, 0.11661968380212784, -0.128127783536911, 0.007776254788041115, 0.16204223036766052, -0.07153244316577911, -0.05988217145204544, 0.034883707761764526, 0.14095284044742584, -0.04653558507561684, -0.07712966203689575, -0.053486306220293045, -0.055002592504024506, 0.07070954144001007, 0.04382814094424248, 0.05347071588039398, 0.02880048379302025, -0.0003538232122082263, -0.02347497083246708, -0.14433741569519043, 0.09670264273881912, 0.07961205393075943, 0.0026934018824249506, -0.012166323140263557, 0.13356292247772217, 0.021476855501532555, 0.0601092092692852, -0.004472683183848858, -0.029994074255228043, -0.06369012594223022, 0.03245674818754196, -0.029103035107254982, 0.004782711621373892, -0.028095796704292297, 0.015874799340963364, -0.009518013335764408, -0.013819657266139984, 0.009070821106433868, 0.0201470535248518, -0.06473207473754883, -0.02914544753730297, -0.01368347741663456, 0.06864187866449356, -0.08885964751243591, 0.008565339259803295, 0.02773025445640087, -0.06924466043710709, 0.08814138174057007, 0.0027803999837487936, -0.002233074279502034, -0.025909846648573875, -0.07329697906970978, 0.07476981729269028, -0.04737784340977669, 0.0065622227266430855, -0.030045650899410248, -0.14811570942401886, 0.05749281868338585, -0.024411307647824287, -0.032122693955898285, 0.0027642191853374243, 0.04757262021303177, -0.11956348270177841, 0.044145096093416214, -0.033017847687006, -0.024153226986527443, -0.08202512562274933, 0.11703071743249893, 0.0143311507999897, 0.07501603662967682, 0.10795256495475769, -0.057709548622369766, 0.07092666625976562, -0.1373089998960495, -0.03775827959179878, 0.02157440036535263, 0.040604133158922195, -0.04775106906890869, -0.061889372766017914, 0.04551639035344124, -0.040857020765542984, 0.10004878044128418, 0.08707431703805923, 0.06877077370882034, 0.04530077427625656, -0.10377155244350433, -0.013320798985660076, 0.05322108045220375, 0.029089583083987236, -0.03849807381629944, 0.003955948632210493, -0.006526502314954996, 0.020487558096647263, -0.020472250878810883, 0.06144418939948082, 0.12205406278371811, 0.20628021657466888, 0.05151847377419472, 0.04277971386909485, -0.02114066854119301, -0.09526141732931137, -0.07094309478998184, 0.13688641786575317, 0.024767186492681503, 0.03337583690881729, -0.04792526736855507, 0.10261368751525879, 0.08529496192932129, -0.17757895588874817, 0.09472730755805969, -0.0017730812542140484, -0.09911219775676727, -0.0848691463470459, -0.09801572561264038, -0.030440272763371468, -0.07007908076047897, -0.014531773515045643, -0.0986703485250473, 0.04750485718250275, 0.04856480658054352, 0.07473322004079819, -0.024481147527694702, 0.13240577280521393, -0.06254952400922775, 
-0.057958949357271194, 0.09859006851911545, -0.0046016438864171505, 0.09228397905826569, -0.07451282441616058, 0.02992694266140461, 0.03519132733345032, -0.03785409405827522, 0.05343378707766533, 0.02349858172237873, 0.019872665405273438, 0.012587178498506546, 0.012848667800426483, -0.056350525468587875, 0.024026580154895782, 0.012647523544728756, 0.10020891577005386, 0.12474212050437927, 0.055063143372535706, -0.05588321387767792, -0.0035901174414902925, 0.20544171333312988, -0.04272237420082092, -0.08797644823789597, -0.16410891711711884, 0.18829749524593353, 0.0611579604446888, 0.02291269227862358, 0.02851749025285244, -0.10056980699300766, 0.02582019567489624, 0.16839836537837982, 0.13315822184085846, -0.025175269693136215, -0.019048795104026794, 0.03889527544379234, 0.00119010207708925, 0.06261610984802246, 0.05513085424900055, 0.0681394413113594, 0.17386850714683533, -0.11568423360586166, 0.04824818670749664, -0.07492604106664658, -0.02685370482504368, 0.023664509877562523, 0.15557798743247986, 0.028484605252742767, -0.022519927471876144, -0.055682893842458725, 0.1108589619398117, -0.015470581129193306, -0.20902986824512482, 0.08448196202516556, -0.08235867321491241, -0.12868806719779968, -0.021043792366981506, -0.03210974484682083, 0.002964065643027425, 0.04584214836359024, -0.004288864322006702, -0.04738697409629822, 0.10330662876367569, 0.0364849790930748, -0.066539466381073, -0.12293090671300888, 0.06096541881561279, -0.01957952044904232, 0.18404154479503632, 0.005537415388971567, 0.12312346696853638, 0.08743193000555038, -0.01505789440125227, -0.09370027482509613, 0.05140184611082077, 0.052875712513923645, 0.03690510615706444, 0.041743043810129166, 0.12447291612625122, -0.010772736743092537, 0.08305348455905914, 0.018299171701073647, -0.09439894556999207, 0.04586150124669075, -0.1442636102437973, -0.06302487850189209, -0.16436533629894257, 0.0656033307313919, -0.06182778999209404, 0.15129724144935608, 0.22445733845233917, -0.039102714508771896, 0.024421071633696556, -0.07073996961116791, 0.0414096862077713, -0.016934601590037346, 0.12133253365755081, 0.021446816623210907, -0.157684326171875, 0.044131502509117126, -0.07318251579999924, 0.04978684335947037, -0.247380793094635, -0.031173456460237503, 0.019402937963604927, -0.060315947979688644, -0.02330399677157402, 0.11913852393627167, 0.04835691675543785, 0.06145152077078819, -0.05022755637764931, -0.028247514739632607, -0.02491389960050583, 0.1404993236064911, -0.09293416887521744, -0.07781756669282913 ]
null
null
transformers
# legal_t5_small_cls_it model Model for classification of legal text written in Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpus from jrc-acquis. ## Model description legal_t5_small_cls_it is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in Italian. ### How to use Here is how to use this model to classify legal text written in Italian in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_it"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_cls_it", do_lower_case=False, skip_special_tokens=True), device=0 ) it_text = "Regolamento (CE) n. 435/2005 della Commissione del 17 marzo 2005 relativo all'applicazione di un coefficiente di riduzione ai certificati di restituzione per le merci non comprese nell'allegato I del trattato come statuito all'articolo 8, paragrafo 5, del regolamento (CE) n. 1520/2000 LA COMMISSIONE DELLE COMUNITÀ EUROPEE, visto il trattato che istituisce la Comunità europea, visto il regolamento (CE) n. 3448/93 del Consiglio, del 6 dicembre 1993, sul regime di scambi per talune merci ottenute dalla trasformazione di prodotti agricoli [1], visto il regolamento (CE) n. 1520/2000 della Commissione, del 13 luglio 2000, che stabilisce, per taluni prodotti agricoli esportati sotto forma di merci non comprese nell'allegato I del trattato, le modalità comuni di applicazione relative al versamento delle restituzioni all'esportazione e i criteri per stabilirne l'importo [2], in particolare l'articolo 8, paragrafo 5, considerando quanto segue: (1) Dalle comunicazioni degli Stati membri di cui all'articolo 8, paragrafo 2, del regolamento (CE) n. 1520/2000 si evince che l'importo totale delle domande ricevute ammonta a 178002906 EUR, mentre l'importo disponibile per la tranche di titoli di restituzione di cui all'articolo 8, paragrafo 4, del regolamento (CE) n. 1520/2000 ammonta a 68116869 EUR. (2) Un coefficiente di riduzione è calcolato sulla base dell'articolo 8, paragrafi 3 e 4, del regolamento (CE) n. 1520/2000. Siffatto coefficiente dovrebbe pertanto essere applicato agli importi richiesti sotto forma di certificati di restituzione per il periodo dal 1o aprile 2005 come stabilito all'articolo 8, paragrafo 6, del regolamento (CE) n. 1520/2000, HA ADOTTATO IL PRESENTE REGOLAMENTO: Articolo 1 Gli importi delle domande di certificati di restituzione per il periodo dal 1o aprile 2005 sono soggetti a un coefficiente di riduzione pari a 0,618. Articolo 2 Il presente regolamento entra in vigore il 18 marzo 2005. Il presente regolamento è obbligatorio in tutti i suoi elementi e direttamente applicabile in ciascuno degli Stati membri. Fatto a Bruxelles, il 17 marzo 2005. Per la Commissione Günter Verheugen Vicepresidente [1] GU L 318 del 20.12.1993, pag. 18. Regolamento modificato da ultimo dal regolamento (CE) n. 2580/2000 (GU L 298 del 25.11.2000, pag. 5). [2] GU L 177 del 15.7.2000, pag. 1. Regolamento modificato da ultimo dal regolamento (CE) n. 
886/2004 (GU L 168 del 1.5.2004, pag. 14). --------------------------------------------------" pipeline([it_text], max_length=512) ``` ## Training data The legal_t5_small_cls_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is used on the classification test dataset, it achieves the following results: Test results: | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_it | 0.6296 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
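The Preprocessing paragraph above describes training a unigram vocabulary model on 88M lines of the parallel corpus but shows no code. Here is a minimal sketch of that step with the sentencepiece library; the corpus path and the 32k vocabulary size are assumptions, since the card states neither:

```python
import sentencepiece as spm

# Train a unigram vocabulary model, as described in the Preprocessing section.
# "parallel_corpus.txt" and vocab_size=32000 are assumed values; the card only
# says the vocabulary was built from 88M lines of the parallel corpus.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_unigram",
    vocab_size=32000,
    model_type="unigram",
    character_coverage=1.0,  # full character coverage for multilingual legal text
)

# Load the trained model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Il presente regolamento entra in vigore.", out_type=str))
```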
{"language": "Italian", "tags": ["classification Italian model"], "datasets": ["jrc-acquis"], "widget": [{"text": "Regolamento (CE) n. 435/2005 della Commissione del 17 marzo 2005 relativo all'applicazione di un coefficiente di riduzione ai certificati di restituzione per le merci non comprese nell'allegato I del trattato come statuito all'articolo 8, paragrafo 5, del regolamento (CE) n. 1520/2000 LA COMMISSIONE DELLE COMUNIT\u00c0 EUROPEE, visto il trattato che istituisce la Comunit\u00e0 europea, visto il regolamento (CE) n. 3448/93 del Consiglio, del 6 dicembre 1993, sul regime di scambi per talune merci ottenute dalla trasformazione di prodotti agricoli [1], visto il regolamento (CE) n. 1520/2000 della Commissione, del 13 luglio 2000, che stabilisce, per taluni prodotti agricoli esportati sotto forma di merci non comprese nell'allegato I del trattato, le modalit\u00e0 comuni di applicazione relative al versamento delle restituzioni all'esportazione e i criteri per stabilirne l'importo [2], in particolare l'articolo 8, paragrafo 5, considerando quanto segue: (1) Dalle comunicazioni degli Stati membri di cui all'articolo 8, paragrafo 2, del regolamento (CE) n. 1520/2000 si evince che l'importo totale delle domande ricevute ammonta a 178002906 EUR, mentre l'importo disponibile per la tranche di titoli di restituzione di cui all'articolo 8, paragrafo 4, del regolamento (CE) n. 1520/2000 ammonta a 68116869 EUR. (2) Un coefficiente di riduzione \u00e8 calcolato sulla base dell'articolo 8, paragrafi 3 e 4, del regolamento (CE) n. 1520/2000. Siffatto coefficiente dovrebbe pertanto essere applicato agli importi richiesti sotto forma di certificati di restituzione per il periodo dal 1o aprile 2005 come stabilito all'articolo 8, paragrafo 6, del regolamento (CE) n. 1520/2000, HA ADOTTATO IL PRESENTE REGOLAMENTO: Articolo 1 Gli importi delle domande di certificati di restituzione per il periodo dal 1o aprile 2005 sono soggetti a un coefficiente di riduzione pari a 0,618. Articolo 2 Il presente regolamento entra in vigore il 18 marzo 2005. Il presente regolamento \u00e8 obbligatorio in tutti i suoi elementi e direttamente applicabile in ciascuno degli Stati membri. Fatto a Bruxelles, il 17 marzo 2005. Per la Commissione G\u00fcnter Verheugen Vicepresidente [1] GU L 318 del 20.12.1993, pag. 18. Regolamento modificato da ultimo dal regolamento (CE) n. 2580/2000 (GU L 298 del 25.11.2000, pag. 5). [2] GU L 177 del 15.7.2000, pag. 1. Regolamento modificato da ultimo dal regolamento (CE) n. 886/2004 (GU L 168 del 1.5.2004, pag. 14). --------------------------------------------------"}]}
text2text-generation
SEBIS/legal_t5_small_cls_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification Italian model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_it model =============================== Model for classification of legal text written in Italian. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis. Model description ----------------- legal\_t5\_small\_cls\_it is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for classification of legal texts written in Italian. ### How to use Here is how to use this model to classify legal text written in Italian in PyTorch: Training data ------------- The legal\_t5\_small\_cls\_it model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used on the classification test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
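The training-procedure paragraph names AdaFactor with an inverse square root learning-rate schedule but gives no configuration. Below is a minimal sketch of that optimizer setup with the transformers library, starting from the 't5-small' baseline the card cites; the relative-step and warmup settings are assumptions about how the schedule was realized:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-small")
tokenizer = AutoTokenizer.from_pretrained("t5-small")

# With relative_step=True, Adafactor applies an inverse square root decay
# internally, matching the schedule named in the card. warmup_init=True is
# an assumption; the card does not describe warmup behaviour.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)

# One illustrative optimization step on a dummy batch.
batch = tokenizer("classify: sample legal text", return_tensors="pt")
labels = tokenizer("sample label", return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
lr_scheduler.step()
optimizer.zero_grad()
```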
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_it model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_it model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 65, 152, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_it model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09832566231489182, 0.14190812408924103, -0.002641058759763837, 0.08530985563993454, 0.10511822253465652, 0.023980293422937393, 0.11073829233646393, 0.13182812929153442, -0.027125028893351555, 0.08101639896631241, 0.030593285337090492, 0.027621084824204445, 0.08860765397548676, 0.09480810165405273, 0.020242098718881607, -0.22904081642627716, 0.0215018130838871, -0.03389764204621315, 0.0025106172543019056, 0.1554252803325653, 0.13192370533943176, -0.09172877669334412, 0.044896144419908524, -0.023136593401432037, -0.13684536516666412, 0.009281960316002369, -0.06489544361829758, -0.048604466021060944, 0.06797794252634048, 0.037709906697273254, 0.11672987788915634, 0.01157293375581503, 0.08973436802625656, -0.18426908552646637, 0.005240933038294315, 0.09059800207614899, 0.02183333784341812, 0.044715095311403275, 0.11344806104898453, -0.03598184883594513, 0.16475877165794373, 0.003497100668027997, 0.05733564496040344, 0.03600844740867615, -0.11784785985946655, -0.09757331758737564, -0.053863536566495895, 0.10289150476455688, 0.1243012323975563, 0.14109954237937927, -0.053665805608034134, 0.11330755054950714, -0.11208544671535492, 0.05631318315863609, 0.06655604392290115, -0.2163248360157013, -0.07255549728870392, 0.02342251129448414, 0.038662537932395935, 0.0894617885351181, -0.06252466887235641, -0.05351854860782623, 0.03457186743617058, 0.017055772244930267, -0.03419489040970802, 0.00356487650424242, 0.035875312983989716, -0.00979171134531498, -0.17142623662948608, -0.0933089554309845, 0.17800593376159668, 0.010203192941844463, -0.08072765171527863, -0.07284960895776749, -0.005227986723184586, -0.13434021174907684, 0.03263205662369728, -0.048267871141433716, 0.016687778756022453, 0.0027264482341706753, 0.07501263171434402, 0.01318327896296978, -0.12468157708644867, -0.0961872786283493, 0.03150402382016182, 0.12939301133155823, 0.08561185002326965, -0.01734359934926033, 0.01464733574539423, 0.15002745389938354, -0.01856464333832264, -0.0751185342669487, -0.015308883972465992, 0.0021235442254692316, -0.10603514313697815, -0.061006225645542145, -0.035005465149879456, -0.11175289750099182, -0.01853564940392971, 0.1097555011510849, 0.006465821992605925, 0.03440169617533684, 0.029689276590943336, 0.05724942684173584, 0.019621683284640312, 0.13691700994968414, -0.08141418546438217, -0.002379546407610178, -0.05516402795910835, 0.05329600349068642, -0.07060439884662628, 0.011468559503555298, -0.038903962820768356, -0.008670462295413017, 0.04605312645435333, 0.0734669640660286, -0.040370482951402664, 0.006927045527845621, -0.07208433002233505, -0.054445426911115646, 0.010276810266077518, -0.13872583210468292, -0.003819421399384737, 0.01917831227183342, -0.10118734836578369, -0.04289327934384346, 0.06016412377357483, -0.01031079888343811, -0.12821416556835175, 0.08225707709789276, -0.03957951068878174, -0.010086948052048683, -0.13500185310840607, -0.08526625484228134, 0.0019211187027394772, -0.07697116583585739, -0.048440877348184586, -0.06989500671625137, -0.1214398443698883, -0.09830489009618759, 0.0641748458147049, -0.0701485127210617, -0.01641141064465046, -0.05036069080233574, 0.000265695241978392, -0.01567336730659008, -0.0010761565063148737, 0.07153671979904175, -0.027689356356859207, 0.08603125065565109, -0.009603285230696201, 0.03208846226334572, 0.08799204230308533, 0.06676805764436722, -0.11152426153421402, 0.025710970163345337, -0.07132881879806519, 0.11778723448514938, -0.03001801110804081, -0.014364739879965782, -0.1621728390455246, -0.08062643557786942, -0.055905889719724655, 
0.02496257983148098, 0.08757000416517258, 0.14449241757392883, -0.1728302240371704, 0.007620682939887047, 0.1648552268743515, -0.1165027767419815, -0.05928178131580353, 0.12961329519748688, -0.039791692048311234, 0.1380220502614975, 0.05842754617333412, 0.14917787909507751, 0.07887271791696548, -0.06916047632694244, -0.015304374508559704, -0.0522647425532341, -0.00529969297349453, 0.040774036198854446, 0.10596427321434021, -0.03336534649133682, -0.05005180835723877, -0.03382297232747078, -0.11904918402433395, -0.013297547586262226, -0.06797017902135849, -0.07660701125860214, 0.0014709869865328074, -0.05791005864739418, -0.027937009930610657, 0.054303668439388275, 0.01756957732141018, -0.03496811166405678, -0.13470137119293213, -0.04942559450864792, 0.11183609813451767, -0.07467665523290634, 0.01356363296508789, -0.09425587207078934, 0.025690831243991852, -0.05053833872079849, -0.029550954699516296, -0.1688491404056549, 0.04905826970934868, 0.05282668024301529, -0.03990430012345314, 0.06541377305984497, 0.03369985893368721, 0.0299097690731287, 0.05335241183638573, -0.012877443805336952, -0.052003566175699234, -0.05648384243249893, -0.017103755846619606, -0.11573189496994019, -0.15765409171581268, -0.04866040498018265, -0.013660505414009094, 0.12200794368982315, -0.19194751977920532, 0.040633171796798706, -0.008529165759682655, 0.07261496037244797, -0.025249116122722626, -0.04941560700535774, -0.006097091361880302, 0.009561702609062195, -0.008684776723384857, -0.06531614065170288, 0.04182877764105797, 0.022198930382728577, -0.00007676719542359933, 0.027179338037967682, -0.10134410858154297, -0.11456101387739182, 0.07346973568201065, 0.03708905354142189, -0.15722306072711945, -0.0326111763715744, -0.060236115008592606, -0.054150402545928955, -0.05025140941143036, 0.005285109858959913, 0.22047001123428345, -0.0005881276447325945, 0.1134459376335144, -0.1035015732049942, -0.041815079748630524, 0.0052881985902786255, -0.027074690908193588, -0.01786159910261631, 0.12947382032871246, 0.10259367525577545, -0.12943488359451294, 0.06430789828300476, 0.046213746070861816, -0.04852176457643509, 0.13744275271892548, 0.017692312598228455, -0.12416013330221176, -0.0023622422013431787, 0.09186148643493652, 0.012824906036257744, 0.07485134899616241, -0.16813939809799194, 0.031413108110427856, 0.01454430352896452, 0.02780335210263729, 0.060928553342819214, -0.1525377631187439, 0.016356339678168297, 0.04377258941531181, -0.026852386072278023, 0.005249604117125273, -0.04343944787979126, -0.05633207783102989, 0.10566085577011108, 0.034843359142541885, -0.042221952229738235, -0.0014166415203362703, -0.04462474212050438, -0.13690029084682465, 0.20300209522247314, -0.05333932489156723, -0.14450959861278534, -0.11996424198150635, 0.07472071051597595, 0.0341552272439003, 0.01700921729207039, 0.018462959676980972, -0.07195135951042175, -0.043868210166692734, -0.09859871119260788, 0.0704655796289444, -0.04744927957653999, -0.01689337193965912, -0.0864834189414978, 0.016637468710541725, 0.01126863807439804, -0.10859902203083038, 0.023909220471978188, -0.00777239678427577, -0.09537408500909805, -0.010938390158116817, -0.06997189670801163, 0.09572292119264603, 0.190548837184906, -0.04421016201376915, 0.038530267775058746, 0.028290478512644768, 0.1647188365459442, -0.1359732300043106, 0.026015562936663628, 0.09950194507837296, -0.024275997653603554, 0.0067877396941185, 0.09796142578125, 0.006460025440901518, -0.07936886698007584, 0.04893229529261589, 0.05220863223075867, -0.02057681791484356, -0.2850533127784729, 
-0.04740527272224426, -0.018635626882314682, -0.029663097113370895, 0.11415504664182663, 0.053608525544404984, 0.02332618646323681, 0.061024200171232224, -0.038673676550388336, -0.05342209339141846, 0.028160985559225082, 0.0679832473397255, -0.05173289030790329, 0.011996777728199959, 0.058536045253276825, -0.04799535870552063, -0.02803766541182995, 0.06857748329639435, 0.03348807245492935, 0.2236347645521164, -0.03223910927772522, 0.1180916279554367, 0.10421305149793625, 0.10250163078308105, 0.016020985320210457, 0.05794733017683029, -0.020590584725141525, 0.009813361801207066, 0.01159896794706583, -0.04852335527539253, -0.03697768226265907, 0.055831920355558395, -0.008344175294041634, 0.01869242452085018, -0.0988580659031868, -0.02702147141098976, 0.038657091557979584, 0.3141227662563324, 0.04630955308675766, -0.25636348128318787, -0.06207122653722763, -0.006241913419216871, -0.05174219608306885, -0.10012362152338028, 0.028758833184838295, 0.07988786697387695, -0.1242474913597107, -0.023723434656858444, -0.03720331937074661, 0.10083530843257904, -0.11159063875675201, -0.04041168466210365, 0.09488595277070999, 0.048239052295684814, -0.02913837507367134, 0.1286376416683197, -0.24721795320510864, 0.1739116907119751, -0.008088941685855389, 0.10971248894929886, -0.06302333623170853, 0.014968003146350384, -0.04217223450541496, 0.059838008135557175, 0.16199547052383423, 0.004379387944936752, 0.07016700506210327, -0.11081380397081375, -0.10639992356300354, -0.006155374459922314, 0.08479631692171097, -0.08943691849708557, 0.09331851452589035, -0.005413906183093786, 0.024917522445321083, 0.0052340226247906685, -0.06148017570376396, -0.09220258891582489, -0.1112687885761261, 0.03381626307964325, -0.06142421439290047, 0.04195832461118698, -0.04899580031633377, -0.06858284771442413, 0.007818171754479408, 0.19816741347312927, -0.15904931724071503, -0.038657866418361664, -0.10632462054491043, 0.045433711260557175, 0.08483757823705673, -0.034978337585926056, -0.013887598179280758, -0.0009333236375823617, 0.057430848479270935, 0.0007346141501329839, 0.017332972958683968, 0.09525088220834732, -0.0753379836678505, -0.15333260595798492, -0.0540444552898407, 0.12029891461133957, 0.13362924754619598, 0.05687130615115166, -0.0014171345392242074, 0.0052806418389081955, -0.019944388419389725, -0.08914213627576828, 0.005838505458086729, -0.0009286209824495018, 0.066119484603405, 0.06635161489248276, -0.07030060887336731, -0.005415210500359535, -0.10883687436580658, -0.049334898591041565, 0.0811346024274826, 0.10880220681428909, -0.052036918699741364, 0.02581985667347908, 0.11402858793735504, -0.12382031977176666, -0.170537531375885, 0.0699245035648346, 0.10093265026807785, 0.05186786502599716, -0.049367524683475494, -0.17577387392520905, 0.05428539589047432, 0.10558035224676132, -0.0010683045256882906, -0.009888073429465294, -0.3763144016265869, -0.11551173031330109, 0.0753360316157341, 0.0951441079378128, -0.06133614853024483, -0.0977916270494461, -0.013588063418865204, 0.027546396479010582, -0.012826260179281235, 0.08361288905143738, -0.015738165006041527, 0.08046557754278183, 0.015367587096989155, -0.0761130079627037, 0.045699138194322586, -0.05986495316028595, 0.13370700180530548, 0.08053557574748993, 0.06144535541534424, -0.05609893798828125, -0.00947677344083786, 0.01939322054386139, -0.007717098109424114, 0.134982168674469, 0.04154212400317192, 0.0258515365421772, -0.20643250644207, -0.0741148591041565, -0.07229835540056229, 0.019839106127619743, -0.07620832324028015, -0.0545760802924633, 
-0.021014900878071785, 0.10491728782653809, 0.04518662765622139, 0.007222872693091631, -0.010646865703165531, -0.09031393378973007, 0.022349994629621506, 0.13678975403308868, 0.1381223201751709, 0.10993851721286774, -0.08427543193101883, 0.026139061897993088, 0.042367905378341675, 0.06551965326070786, -0.12098027765750885, 0.004873060155659914, 0.1284669190645218, -0.011549970135092735, 0.1606217324733734, 0.01259680837392807, -0.1513187736272812, -0.020096732303500175, 0.07166464626789093, -0.10828043520450592, -0.10446518659591675, -0.02443419024348259, 0.017410870641469955, -0.09123971313238144, -0.0189060065895319, 0.06090158596634865, -0.10675284266471863, -0.015259173698723316, -0.005624951794743538, 0.04505676403641701, -0.08215654641389847, 0.1984892189502716, 0.06348463147878647, 0.05310845747590065, -0.05549642816185951, 0.12985709309577942, 0.11589887738227844, -0.11653974652290344, 0.00683848699554801, 0.15762636065483093, -0.07292565703392029, -0.0608346089720726, 0.03418892249464989, 0.11456335335969925, -0.04259351268410683, -0.09059564769268036, -0.07533127069473267, -0.052557285875082016, 0.05851782485842705, 0.00862711202353239, 0.05766128748655319, 0.03821626678109169, -0.0295222457498312, -0.014191544614732265, -0.16447338461875916, 0.08589114248752594, 0.0929405465722084, 0.009739701636135578, -0.04040789231657982, 0.1643308848142624, 0.03239606320858002, 0.015028140507638454, -0.0040695546194911, -0.039462193846702576, -0.053703613579273224, 0.04976316913962364, -0.017234856262803078, -0.014239986427128315, -0.030902275815606117, 0.0002368356508668512, -0.017580673098564148, -0.02064293622970581, -0.006160329561680555, 0.020736945793032646, -0.07406055182218552, -0.027358733117580414, -0.02985856868326664, 0.04809877648949623, -0.07690808922052383, 0.009494938887655735, 0.011980858631432056, -0.06478939205408096, 0.09669077396392822, 0.006353973411023617, -0.009591643698513508, -0.027817197144031525, -0.07462301850318909, 0.09918524324893951, -0.03455156460404396, 0.0005176950362510979, -0.029046764597296715, -0.12356656789779663, 0.05726753547787666, -0.0037069625686854124, -0.05299833416938782, 0.005987241398543119, 0.056336626410484314, -0.11939657479524612, 0.0636085569858551, -0.018762582913041115, -0.018771667033433914, -0.07379782199859619, 0.12230056524276733, 0.03318791091442108, 0.06982932984828949, 0.08813826739788055, -0.04382126405835152, 0.08535799384117126, -0.1407237946987152, -0.028090091422200203, 0.03065863810479641, 0.02773434855043888, -0.04899906367063522, -0.04669127240777016, 0.059951961040496826, -0.043427255004644394, 0.06867212057113647, 0.07619600743055344, 0.048954032361507416, 0.040000054985284805, -0.07020893692970276, 0.013751311227679253, 0.04926316812634468, 0.060034822672605515, -0.041559599339962006, 0.008897834457457066, 0.00772156473249197, 0.019783638417720795, -0.04344112426042557, 0.06386785209178925, 0.11314112693071365, 0.21811041235923767, 0.05295317992568016, 0.04921958968043327, 0.014503194950520992, -0.0970635935664177, -0.09364479780197144, 0.12909317016601562, -0.0013011237606406212, 0.03921787440776825, -0.03430972620844841, 0.11087546497583389, 0.09634196758270264, -0.17686954140663147, 0.06110657751560211, -0.021480286493897438, -0.10529971122741699, -0.07413729280233383, -0.15396445989608765, -0.028106022626161575, -0.06506770849227905, -0.006754075177013874, -0.10778678953647614, 0.04285570979118347, 0.05522020906209946, 0.0533013641834259, -0.05952826887369156, 0.14511431753635406, -0.0262683667242527, 
-0.05387355759739876, 0.10480469465255737, -0.007432493846863508, 0.09807968139648438, -0.07907776534557343, -0.010693145915865898, 0.023611295968294144, -0.03193236514925957, 0.06430622190237045, 0.012733398005366325, 0.025623692199587822, 0.0009454337996430695, 0.009751013480126858, -0.05994197726249695, 0.02148512750864029, 0.0029502916149795055, 0.09646163880825043, 0.11647391319274902, 0.030890390276908875, -0.03255625069141388, -0.012975882738828659, 0.20207804441452026, -0.03474860638380051, -0.06096428632736206, -0.15554696321487427, 0.1771937608718872, 0.057071536779403687, 0.017708435654640198, 0.029793359339237213, -0.11540094763040543, 0.0317656546831131, 0.17828577756881714, 0.11094649136066437, -0.04590408504009247, -0.02760993130505085, 0.03859423100948334, 0.003618381917476654, 0.038183726370334625, 0.08019047230482101, 0.04751591011881828, 0.14604254066944122, -0.1067802906036377, 0.0795687809586525, -0.06811811029911041, -0.022722981870174408, 0.016489647328853607, 0.1522156298160553, 0.029997125267982483, -0.012586959637701511, -0.08000456541776657, 0.12395875155925751, 0.00710698077455163, -0.18749003112316132, 0.07521893829107285, -0.09670233726501465, -0.1509818732738495, -0.02100474387407303, -0.016455084085464478, -0.014898213557898998, 0.07289773970842361, -0.019767876714468002, -0.03414258733391762, 0.08906964957714081, 0.04264017194509506, -0.06593744456768036, -0.15133720636367798, 0.06468954682350159, -0.042810119688510895, 0.18253105878829956, -0.0035163990687578917, 0.1106613427400589, 0.08925124257802963, 0.0042775957845151424, -0.1076182872056961, 0.07004941254854202, 0.04283877834677696, 0.02049867995083332, 0.04599148407578468, 0.12009762972593307, -0.033373501151800156, 0.09472077339887619, 0.04586777091026306, -0.11826463788747787, 0.04702647402882576, -0.1563526690006256, -0.05328892916440964, -0.16612796485424042, 0.04922999441623688, -0.060788143426179886, 0.1636684536933899, 0.22387024760246277, -0.04697262495756149, -0.0012835129164159298, -0.06388978660106659, 0.05602236092090607, 0.001973890233784914, 0.15032680332660675, 0.021826565265655518, -0.19762127101421356, 0.01298477128148079, -0.07192785292863846, 0.05127348378300667, -0.20947133004665375, -0.03624176234006882, 0.014013928361237049, -0.07645805180072784, -0.040648844093084335, 0.11190389096736908, 0.06324325501918793, 0.06523369997739792, -0.04906443506479263, -0.061106130480766296, -0.011911936104297638, 0.140177920460701, -0.10705150663852692, -0.08164367079734802 ]
null
null
transformers
# legal_t5_small_cls_sv model Model for classification of legal text written in Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpus from jrc-acquis. ## Model description legal_t5_small_cls_sv is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for classification of legal texts written in Swedish. ### How to use Here is how to use this model to classify legal text written in Swedish in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_sv"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_cls_sv", do_lower_case=False, skip_special_tokens=True), device=0 ) sv_text = "Rådets förordning (EG) nr 1973/2002 av den 5 november 2002 om ändring av förordning (EG) nr 2026/97 om skydd mot subventionerad import från länder som inte är medlemmar i Europeiska gemenskapen EUROPEISKA UNIONENS RÅD HAR ANTAGIT DENNA FÖRORDNING med beaktande av Fördraget om upprättandet av Europeiska gemenskapen, särskilt artikel 133 i detta, med beaktande av kommissionens förslag, och av följande skäl: (1) Rådet antog genom förordning (EG) nr 2026/97(1) gemensamma regler för skydd mot subventionerad import från länder som inte är medlemmar i Europeiska gemenskapen. (2) I artikel 6 i förordning (EG) nr 2026/97 anges vissa riktlinjer för beräkning av förmånen för mottagaren, inbegripet det riktmärke för marknaden enligt vilket förmånens storlek beräknas. Det bör klargöras vilka bestämmelser som bör följas i de fall ett sådant riktmärke för marknaden inte finns i det berörda landet. I en sådan situation bör riktmärket fastställas genom anpassning av de villkor som råder i det berörda landet på grundval av de faktiska uppgifter som är tillgängliga där. Om detta inte är praktiskt genomförbart på grund av att det inte finns några uppgifter om sådana priser och kostnader eller på grund av att dessa är otillförlitliga, bör riktmärket fastställas med hjälp av de villkor som gäller på andra marknader. (3) I artikel 4 i förordning (EG) nr 2026/97 anges att vissa subventioner som rör miljö, forskning och regional utveckling inte är utjämningsbara. I artikel 10.5 och 10.6 i den förordningen anges vidare att undersökningar kan inledas för att avgöra om subventioner är icke-utjämningsbara och att de inte bör inledas om de rör vissa icke-utjämningsbara subventioner. Motsvarande bestämmelser i WTO-avtalet beträffande subventioner och utjämningsåtgärder var avsedda att löpa ut den 31 december 1999, såvida inte WTO-medlemsstaterna beslutade annat. Inget sådant beslut har fattats och de relevanta bestämmelserna är därför inte längre tillämpliga. Det är därför nödvändigt att fastställa huruvida bestämmelserna rörande icke-utjämningsbara subventioner i förordning (EG) nr 2026/97 bör fortsätta att gälla. Gemenskapens viktigaste handelspartner tillämpar inte längre dessa bestämmelser i sina utjämningsundersökningar. 
Av denna anledning och i syfte att upprätthålla balansen mellan rättigheter och skyldigheter enligt nämnda WTO-avtal bör de bestämmelser i förordning (EG) nr 2026/97 som rör icke-utjämningsbara subventioner upphöra att gälla. (4) I artikel 28.5 i förordning (EG) nr 2026/97 anges att om tillgängliga uppgifter används skall upplysningarna kontrolleras genom att jämföras med uppgifter från flera källor. Det bör specificeras att dessa källor också kan utgöras av uppgifter om världsmarknaden eller andra representativa marknader. (5) Ur rättssäkerhetssynpunkt är det lämpligt att dessa ändringar tillämpas så snart som möjligt i samband med alla nya undersökningar. HÄRIGENOM FÖRESKRIVS FÖLJANDE. Artikel 1 Förordning (EG) nr 2026/97 ändras enligt följande: 1. I artikel 6 d skall följande text läggas till: %quot%Om det inte finns några sådana rådande marknadsvillkor för produkterna eller tjänsterna i fråga i det land som tillhandahåller eller köper dem, som kan användas som lämpliga riktmärken, skall en av följande bestämmelser tillämpas: i) De villkor som råder i landet i fråga skall justeras på grundval av de faktiska kostnader, priser och andra faktorer som är tillgängliga i det landet med hjälp av ett lämpligt belopp som avspeglar normala marknadsvillkor. ii) I tillämpliga fall skall de villkor användas som råder på marknaden i ett annat land eller på världsmarknaden och som är tillgängliga för mottagaren.%quot% 2. Artikel 4 och artikel 10.5 och 10.6 skall utgå. 3. I artikel 28.5 skall följande mening läggas till: %quot%Sådana uppgifter kan, i tillämpliga fall, inbegripa relevanta upplysningar om världsmarknaden eller andra representativa marknader.%quot% Artikel 2 Denna förordning träder i kraft dagen efter det att den har offentliggjorts i Europeiska gemenskapernas officiella tidning. Den skall tillämpas i samband med alla undersökningar som inleds i enlighet med förordning (EG) nr 2026/97 efter dagen för ikraftträdandet av denna förordning. Denna förordning är till alla delar bindande och direkt tillämplig i alla medlemsstater. Utfärdad i Bryssel den 5 november 2002. På rådets vägnar T. Pedersen Ordförande (1) EGT L 288, 21.10.1997, s. 1." pipeline([sv_text], max_length=512) ``` ## Training data The legal_t5_small_cls_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is used on the classification test dataset, it achieves the following results: Test results: | Model | F1 score | |:-----:|:-----:| | legal_t5_small_cls_sv | 0.6449 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
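The evaluation table above reports a single F1 score for the generated labels. Here is a minimal sketch of how such a score can be computed once the pipeline's text outputs are collected; the macro averaging and the example label names are assumptions, since the card does not state how F1 was aggregated:

```python
from sklearn.metrics import f1_score

# Hypothetical gold labels and pipeline outputs for a handful of test texts;
# the category names are illustrative, not taken from the dataset.
y_true = ["agriculture", "trade", "agriculture", "fisheries"]
y_pred = ["agriculture", "trade", "trade", "fisheries"]

# Macro averaging is an assumption; the card reports one unqualified F1 value.
print(f1_score(y_true, y_pred, average="macro"))
```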
{"language": "Swedish", "tags": ["classification Swedish model"], "datasets": ["jrc-acquis"], "widget": [{"text": "R\u00e5dets f\u00f6rordning (EG) nr 1973/2002 av den 5 november 2002 om \u00e4ndring av f\u00f6rordning (EG) nr 2026/97 om skydd mot subventionerad import fr\u00e5n l\u00e4nder som inte \u00e4r medlemmar i Europeiska gemenskapen EUROPEISKA UNIONENS R\u00c5D HAR ANTAGIT DENNA F\u00d6RORDNING med beaktande av F\u00f6rdraget om uppr\u00e4ttandet av Europeiska gemenskapen, s\u00e4rskilt artikel 133 i detta, med beaktande av kommissionens f\u00f6rslag, och av f\u00f6ljande sk\u00e4l: (1) R\u00e5det antog genom f\u00f6rordning (EG) nr 2026/97(1) gemensamma regler f\u00f6r skydd mot subventionerad import fr\u00e5n l\u00e4nder som inte \u00e4r medlemmar i Europeiska gemenskapen. (2) I artikel 6 i f\u00f6rordning (EG) nr 2026/97 anges vissa riktlinjer f\u00f6r ber\u00e4kning av f\u00f6rm\u00e5nen f\u00f6r mottagaren, inbegripet det riktm\u00e4rke f\u00f6r marknaden enligt vilket f\u00f6rm\u00e5nens storlek ber\u00e4knas. Det b\u00f6r klarg\u00f6ras vilka best\u00e4mmelser som b\u00f6r f\u00f6ljas i de fall ett s\u00e5dant riktm\u00e4rke f\u00f6r marknaden inte finns i det ber\u00f6rda landet. I en s\u00e5dan situation b\u00f6r riktm\u00e4rket fastst\u00e4llas genom anpassning av de villkor som r\u00e5der i det ber\u00f6rda landet p\u00e5 grundval av de faktiska uppgifter som \u00e4r tillg\u00e4ngliga d\u00e4r. Om detta inte \u00e4r praktiskt genomf\u00f6rbart p\u00e5 grund av att det inte finns n\u00e5gra uppgifter om s\u00e5dana priser och kostnader eller p\u00e5 grund av att dessa \u00e4r otillf\u00f6rlitliga, b\u00f6r riktm\u00e4rket fastst\u00e4llas med hj\u00e4lp av de villkor som g\u00e4ller p\u00e5 andra marknader. (3) I artikel 4 i f\u00f6rordning (EG) nr 2026/97 anges att vissa subventioner som r\u00f6r milj\u00f6, forskning och regional utveckling inte \u00e4r utj\u00e4mningsbara. I artikel 10.5 och 10.6 i den f\u00f6rordningen anges vidare att unders\u00f6kningar kan inledas f\u00f6r att avg\u00f6ra om subventioner \u00e4r icke-utj\u00e4mningsbara och att de inte b\u00f6r inledas om de r\u00f6r vissa icke-utj\u00e4mningsbara subventioner. Motsvarande best\u00e4mmelser i WTO-avtalet betr\u00e4ffande subventioner och utj\u00e4mnings\u00e5tg\u00e4rder var avsedda att l\u00f6pa ut den 31 december 1999, s\u00e5vida inte WTO-medlemsstaterna beslutade annat. Inget s\u00e5dant beslut har fattats och de relevanta best\u00e4mmelserna \u00e4r d\u00e4rf\u00f6r inte l\u00e4ngre till\u00e4mpliga. Det \u00e4r d\u00e4rf\u00f6r n\u00f6dv\u00e4ndigt att fastst\u00e4lla huruvida best\u00e4mmelserna r\u00f6rande icke-utj\u00e4mningsbara subventioner i f\u00f6rordning (EG) nr 2026/97 b\u00f6r forts\u00e4tta att g\u00e4lla. Gemenskapens viktigaste handelspartner till\u00e4mpar inte l\u00e4ngre dessa best\u00e4mmelser i sina utj\u00e4mningsunders\u00f6kningar. Av denna anledning och i syfte att uppr\u00e4tth\u00e5lla balansen mellan r\u00e4ttigheter och skyldigheter enligt n\u00e4mnda WTO-avtal b\u00f6r de best\u00e4mmelser i f\u00f6rordning (EG) nr 2026/97 som r\u00f6r icke-utj\u00e4mningsbara subventioner upph\u00f6ra att g\u00e4lla. (4) I artikel 28.5 i f\u00f6rordning (EG) nr 2026/97 anges att om tillg\u00e4ngliga uppgifter anv\u00e4nds skall upplysningarna kontrolleras genom att j\u00e4mf\u00f6ras med uppgifter fr\u00e5n flera k\u00e4llor. Det b\u00f6r specificeras att dessa k\u00e4llor ocks\u00e5 kan utg\u00f6ras av uppgifter om v\u00e4rldsmarknaden eller andra representativa marknader. 
(5) Ur r\u00e4ttss\u00e4kerhetssynpunkt \u00e4r det l\u00e4mpligt att dessa \u00e4ndringar till\u00e4mpas s\u00e5 snart som m\u00f6jligt i samband med alla nya unders\u00f6kningar. H\u00c4RIGENOM F\u00d6RESKRIVS F\u00d6LJANDE. Artikel 1 F\u00f6rordning (EG) nr 2026/97 \u00e4ndras enligt f\u00f6ljande: 1. I artikel 6 d skall f\u00f6ljande text l\u00e4ggas till: %quot%Om det inte finns n\u00e5gra s\u00e5dana r\u00e5dande marknadsvillkor f\u00f6r produkterna eller tj\u00e4nsterna i fr\u00e5ga i det land som tillhandah\u00e5ller eller k\u00f6per dem, som kan anv\u00e4ndas som l\u00e4mpliga riktm\u00e4rken, skall en av f\u00f6ljande best\u00e4mmelser till\u00e4mpas: i) De villkor som r\u00e5der i landet i fr\u00e5ga skall justeras p\u00e5 grundval av de faktiska kostnader, priser och andra faktorer som \u00e4r tillg\u00e4ngliga i det landet med hj\u00e4lp av ett l\u00e4mpligt belopp som avspeglar normala marknadsvillkor. ii) I till\u00e4mpliga fall skall de villkor anv\u00e4ndas som r\u00e5der p\u00e5 marknaden i ett annat land eller p\u00e5 v\u00e4rldsmarknaden och som \u00e4r tillg\u00e4ngliga f\u00f6r mottagaren.%quot% 2. Artikel 4 och artikel 10.5 och 10.6 skall utg\u00e5. 3. I artikel 28.5 skall f\u00f6ljande mening l\u00e4ggas till: %quot%S\u00e5dana uppgifter kan, i till\u00e4mpliga fall, inbegripa relevanta upplysningar om v\u00e4rldsmarknaden eller andra representativa marknader.%quot% Artikel 2 Denna f\u00f6rordning tr\u00e4der i kraft dagen efter det att den har offentliggjorts i Europeiska gemenskapernas officiella tidning. Den skall till\u00e4mpas i samband med alla unders\u00f6kningar som inleds i enlighet med f\u00f6rordning (EG) nr 2026/97 efter dagen f\u00f6r ikrafttr\u00e4dandet av denna f\u00f6rordning. Denna f\u00f6rordning \u00e4r till alla delar bindande och direkt till\u00e4mplig i alla medlemsstater. Utf\u00e4rdad i Bryssel den 5 november 2002. P\u00e5 r\u00e5dets v\u00e4gnar T. Pedersen Ordf\u00f6rande (1) EGT L 288, 21.10.1997, s. 1."}]}
text2text-generation
SEBIS/legal_t5_small_cls_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "classification Swedish model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #classification Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_cls\_sv model
===============================

Model for classification of legal text written in Swedish. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis.

Model description
-----------------

legal\_t5\_small\_cls\_sv is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for classification of legal texts written in Swedish.

### How to use

Here is how to use this model to classify legal text written in Swedish in PyTorch (the code block was lost in this stripped rendering; see the sketch following this card):

Training data
-------------

The legal\_t5\_small\_cls\_sv model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model, trained with 88M lines of text from the parallel corpus (of all possible language pairs), provides the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the classification test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
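The code block for the 'How to use' section above was lost in this stripped rendering. Here is a minimal sketch of the pipeline it describes, following the TranslationPipeline pattern shown in the full card for this model; the device setting and input text are illustrative:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_cls_sv"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_cls_sv", do_lower_case=False, skip_special_tokens=True),
    device=0,  # GPU 0; use device=-1 to run on CPU
)

sv_text = "Rådets förordning (EG) nr 1973/2002 ..."  # Swedish legal text to classify
print(pipeline([sv_text], max_length=512))  # the model generates the class label as text
```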
[ "### How to use\n\n\nHere is how to use this model to classify legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_sv model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to classify legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_sv model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 65, 152, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #classification Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to classify legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_cls\\_sv model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0724887028336525, 0.08739692717790604, -0.0025085736997425556, 0.08550391346216202, 0.09115134924650192, -0.014143074862658978, 0.0848684161901474, 0.10129000246524811, -0.05376921966671944, 0.06417066603899002, 0.06522238254547119, 0.030318306758999825, 0.09180647134780884, 0.07605006545782089, 0.03494473546743393, -0.25688302516937256, 0.01988609880208969, -0.048413075506687164, -0.003377343527972698, 0.14414918422698975, 0.13615824282169342, -0.0766497477889061, 0.032898835837841034, -0.01596156135201454, -0.0781060978770256, -0.005346925929188728, -0.06512568891048431, -0.040808651596307755, 0.061670154333114624, 0.029847484081983566, 0.10829077661037445, 0.02446805126965046, 0.09308744966983795, -0.1886221021413803, 0.00014782660582568496, 0.043329279869794846, 0.02436484396457672, 0.03652697429060936, 0.09289862960577011, -0.0013340187724679708, 0.18391096591949463, -0.02487967722117901, 0.06460145860910416, 0.023437930271029472, -0.08742115646600723, -0.1381509155035019, -0.055350109934806824, 0.09602942317724228, 0.13714703917503357, 0.13130739331245422, -0.06616129726171494, 0.0880269929766655, -0.1020834818482399, 0.07803177833557129, 0.05918262153863907, -0.22365285456180573, -0.06702258437871933, 0.09125877171754837, 0.0689605250954628, 0.09576620906591415, -0.07284431904554367, -0.03581211343407631, 0.04353756457567215, 0.04564426839351654, 0.022213535383343697, -0.02216659113764763, -0.008250894024968147, 0.0024989969097077847, -0.18038949370384216, -0.07132986932992935, 0.18618535995483398, 0.006736139301210642, -0.07149971276521683, -0.08927031606435776, 0.012937258929014206, -0.12531724572181702, 0.04588967561721802, -0.05151815339922905, 0.03027554228901863, -0.020856499671936035, 0.058306269347667694, -0.03836219012737274, -0.14232538640499115, -0.07346503436565399, 0.045831598341464996, 0.05531087517738342, 0.04736269637942314, -0.010122294537723064, 0.031151140108704567, 0.12092598527669907, -0.043221160769462585, -0.08566144108772278, -0.03789234906435013, -0.022729456424713135, -0.11809050291776657, -0.06265920400619507, -0.04044848307967186, -0.18034525215625763, -0.016244450584053993, 0.10295897722244263, -0.007941131480038166, 0.039583079516887665, 0.04172765836119652, 0.051447298377752304, 0.026599237695336342, 0.16210046410560608, -0.06766943633556366, -0.011681288480758667, -0.06330202519893646, 0.0261826291680336, -0.05490059405565262, -0.0009695079061202705, -0.03579823300242424, -0.0026783987414091825, 0.07996221631765366, 0.047577232122421265, -0.06941857188940048, 0.016886558383703232, -0.04496388882398605, -0.02949007973074913, -0.016911707818508148, -0.12952721118927002, -0.026529811322689056, 0.0007333648391067982, -0.07769189029932022, -0.08222051709890366, 0.09047417342662811, 0.011404931545257568, -0.12089736014604568, 0.10509102791547775, -0.012294960208237171, 0.004966709762811661, -0.09770183265209198, -0.10550416260957718, 0.007173680234700441, -0.08444811403751373, -0.03709600493311882, -0.061640284955501556, -0.12175142019987106, -0.11950843036174774, 0.0782947689294815, -0.05043552443385124, -0.04365788772702217, -0.07298243045806885, -0.03215158358216286, 0.00018961135356221348, -0.028360050171613693, 0.09854044765233994, -0.03408012166619301, 0.061929382383823395, -0.05534568056464195, 0.039742372930049896, 0.09668600559234619, 0.055455081164836884, -0.11960551142692566, 0.02026509679853916, -0.10516220331192017, 0.13566073775291443, -0.08533341437578201, -0.00786279421299696, -0.1503332257270813, -0.0925062969326973, 
-0.023987609893083572, 0.04001728817820549, 0.07114502042531967, 0.11691169440746307, -0.1818096935749054, 0.004167228005826473, 0.15601567924022675, -0.12422754615545273, -0.044997356832027435, 0.10610383749008179, -0.03189661353826523, 0.11728441715240479, 0.07157720625400543, 0.1845172792673111, 0.0788603201508522, -0.06028350442647934, -0.011737548746168613, -0.0086654769256711, -0.016133060678839684, 0.03182673081755638, 0.11389748752117157, -0.054678093641996384, -0.06614535301923752, -0.01713978312909603, -0.1038312092423439, -0.02854822389781475, -0.03673173114657402, -0.07336709648370743, 0.011995991691946983, -0.03927098214626312, -0.049417201429605484, 0.05321136862039566, 0.008371814154088497, -0.050580672919750214, -0.10389573872089386, 0.015792617574334145, 0.07357119023799896, -0.06493087112903595, 0.026782063767313957, -0.04293931648135185, -0.024442413821816444, -0.07682795077562332, -0.03192378953099251, -0.16592712700366974, 0.025006849318742752, 0.01572701521217823, -0.0338653139770031, 0.07196352630853653, 0.08801931142807007, 0.041074998676776886, 0.05607135221362114, -0.013440258800983429, -0.04125072434544563, -0.04810987785458565, -0.014966323040425777, -0.11711268872022629, -0.15778301656246185, -0.030321083962917328, -0.028637802228331566, 0.10176465660333633, -0.1835155189037323, 0.0171569362282753, -0.019107941538095474, 0.05453410744667053, -0.011788132600486279, -0.047883737832307816, 0.050893306732177734, 0.023317851126194, 0.012835189700126648, -0.05944601446390152, 0.05006033554673195, 0.013343947939574718, -0.049508921802043915, 0.0750063806772232, -0.06610538065433502, -0.14971737563610077, 0.06390020251274109, 0.0491192601621151, -0.15810365974903107, -0.01977565512061119, -0.0568743459880352, -0.05074679106473923, -0.060671478509902954, 0.017785698175430298, 0.22435861825942993, 0.013389491476118565, 0.11266015470027924, -0.10382035374641418, -0.057388339191675186, -0.004461301490664482, -0.0652826577425003, -0.008145136758685112, 0.16679972410202026, 0.06732557713985443, -0.15005239844322205, 0.059963759034872055, -0.008901258930563927, -0.04525669664144516, 0.2021297961473465, 0.009239242412149906, -0.10677303373813629, 0.0030486476607620716, 0.06822880357503891, -0.0014369062846526504, 0.10242593288421631, -0.12447094917297363, 0.03143947571516037, 0.028557665646076202, 0.028238244354724884, 0.04699823260307312, -0.1373995542526245, 0.006773711647838354, 0.04474617540836334, -0.04653064161539078, 0.022334443405270576, -0.01808653585612774, -0.05370495840907097, 0.06741035729646683, 0.019335836172103882, -0.02441413141787052, -0.00510235084220767, -0.04918178170919418, -0.15705586969852448, 0.21817229688167572, -0.06643164157867432, -0.15699541568756104, -0.14366815984249115, 0.10074679553508759, 0.009858987294137478, 0.01503788586705923, 0.05013960972428322, -0.08890494704246521, -0.06448279321193695, -0.125453919172287, 0.10019007325172424, -0.009265444241464138, -0.027542321011424065, -0.09786739200353622, 0.003959409426897764, 0.03570343926548958, -0.11585339158773422, 0.021575141698122025, -0.025455879047513008, -0.05795659124851227, 0.005811688955873251, -0.031490810215473175, 0.07459625601768494, 0.1437729150056839, -0.025341764092445374, 0.022348694503307343, 0.029679138213396072, 0.19548191130161285, -0.10174959152936935, 0.030999718233942986, 0.060110386461019516, -0.04570837691426277, 0.013479683548212051, 0.11843298375606537, 0.006627188995480537, -0.05918281152844429, 0.04692361503839493, 0.06568114459514618, -0.04440508037805557, 
-0.29401910305023193, -0.04123828932642937, -0.007322081830352545, -0.009602488949894905, 0.11090520769357681, 0.05979323759675026, -0.0172053724527359, 0.07018521428108215, -0.02339654602110386, -0.06176914647221565, 0.03126024827361107, 0.06436099112033844, -0.05931008979678154, 0.002353872172534466, 0.06661556661128998, -0.045615941286087036, -0.002470587147399783, 0.05690726265311241, -0.007228435017168522, 0.19203850626945496, -0.04291548952460289, 0.063620425760746, 0.11099238693714142, 0.07922876626253128, 0.008603260852396488, 0.0593525692820549, -0.03879871964454651, 0.009449662640690804, 0.028437048196792603, -0.052239950746297836, -0.061027657240629196, 0.058432940393686295, -0.012562746182084084, 0.029056129977107048, -0.07596536725759506, -0.004998140502721071, 0.030670519918203354, 0.29721906781196594, 0.06121298670768738, -0.2084539532661438, -0.10161831974983215, 0.01860189065337181, -0.0701960027217865, -0.09280180931091309, 0.026507828384637833, 0.09399731457233429, -0.14488103985786438, 0.02429560013115406, -0.057895127683877945, 0.10477935522794724, -0.07636892050504684, -0.030124692246317863, 0.08242945373058319, 0.04260252043604851, -0.02165619470179081, 0.12301388382911682, -0.2393878847360611, 0.1686561107635498, -0.01457996852695942, 0.09654203802347183, -0.05316721647977829, 0.019010918214917183, -0.05057595297694206, 0.046875618398189545, 0.18032969534397125, 0.031915083527565, 0.00007894670852692798, -0.08560242503881454, -0.08922575414180756, 0.00485604302957654, 0.04319191351532936, -0.10126103460788727, 0.09616052359342575, 0.013875112868845463, 0.040446605533361435, -0.024814018979668617, -0.09340279549360275, -0.088081493973732, -0.08619918674230576, 0.011388623155653477, -0.09909849613904953, 0.05416527017951012, -0.04938419163227081, -0.061238862574100494, -0.050530824810266495, 0.19523011147975922, -0.18120728433132172, -0.09977306425571442, -0.10831116884946823, 0.04500621557235718, 0.07761427015066147, -0.03560400754213333, -0.022375624626874924, 0.015470637939870358, 0.05524687096476555, -0.02762996405363083, 0.021180156618356705, 0.06966830790042877, -0.06665486842393875, -0.15724937617778778, -0.008118867874145508, 0.138946995139122, 0.1373135894536972, 0.05447414517402649, -0.004899289458990097, 0.035477571189403534, -0.0190765168517828, -0.11272954195737839, 0.01020242553204298, 0.02520616166293621, 0.07591046392917633, 0.03948765620589256, -0.028904899954795837, 0.0013265243032947183, -0.08768550306558609, -0.05521567538380623, 0.10694849491119385, 0.1132807657122612, -0.050254981964826584, 0.09647220373153687, 0.1296694129705429, -0.11639730632305145, -0.20245875418186188, 0.031191688030958176, 0.12442652881145477, 0.06779325008392334, -0.0047147138975560665, -0.15348288416862488, 0.08374551683664322, 0.07794184237718582, 0.0036056942772120237, -0.0629579946398735, -0.32249096035957336, -0.1401943862438202, 0.08032345771789551, 0.08431405574083328, -0.06336062401533127, -0.06558914482593536, -0.031469717621803284, 0.026195146143436432, -0.010471220128238201, 0.08990249037742615, -0.037692878395318985, 0.09761861711740494, 0.022312568500638008, -0.032068781554698944, 0.05455571040511131, -0.06598032265901566, 0.14757683873176575, 0.059659648686647415, 0.04898706078529358, -0.09146770089864731, 0.04040254279971123, 0.018973028287291527, -0.030472660437226295, 0.17984847724437714, 0.01142042689025402, 0.030646149069070816, -0.21585367619991302, -0.07137864083051682, -0.08167073130607605, 0.03979954123497009, -0.07510878890752792, 
-0.07386311143636703, -0.04249389097094536, 0.1054668202996254, 0.07141368091106415, -0.023338889703154564, 0.05115988105535507, -0.11895471811294556, 0.02269279770553112, 0.13514645397663116, 0.16317254304885864, 0.11071950942277908, -0.08755817264318466, 0.037071071565151215, 0.03297880291938782, 0.07184988260269165, -0.14706069231033325, -0.0012900313595309854, 0.1321767270565033, -0.013895346783101559, 0.15509456396102905, -0.02403705194592476, -0.1470823436975479, -0.014653696678578854, 0.054251689463853836, -0.13477082550525665, -0.1017572209239006, -0.02643134817481041, -0.050773557275533676, -0.07805608212947845, -0.04473550617694855, 0.0867566168308258, -0.12968122959136963, 0.0006561392219737172, 0.0009002229780890048, 0.05791419371962547, -0.08253394067287445, 0.18730762600898743, 0.07492003589868546, 0.0775703489780426, -0.06544893980026245, 0.12769892811775208, 0.08540095388889313, -0.09221143275499344, 0.04107445478439331, 0.1507289558649063, -0.08454341441392899, -0.05762679874897003, 0.041234422475099564, 0.1544008105993271, -0.048158250749111176, -0.07987266778945923, -0.047501787543296814, -0.07802187651395798, 0.05379082262516022, 0.018047688528895378, 0.06295522302389145, 0.02355055883526802, -0.011342864483594894, -0.018072770908474922, -0.1283424347639084, 0.10400596261024475, 0.06511236727237701, -0.0008643405744805932, -0.04924796149134636, 0.15660032629966736, 0.005663677584379911, 0.04473471641540527, -0.016224607825279236, -0.004059555474668741, -0.05026469752192497, 0.03138865530490875, -0.03667011111974716, -0.001328331883996725, -0.04456106945872307, 0.017499729990959167, -0.014206605032086372, -0.01805960200726986, 0.005886920727789402, 0.029835907742381096, -0.06086771562695503, -0.02047961950302124, -0.049334894865751266, 0.04322478175163269, -0.08065972477197647, -0.016389723867177963, -0.004418315831571817, -0.03563510254025459, 0.08149073272943497, -0.0052607301622629166, -0.014503181912004948, 0.01002286747097969, -0.05192592367529869, 0.11244072765111923, -0.009100239723920822, -0.012807552702724934, -0.018787788227200508, -0.10335793346166611, 0.003282916499301791, -0.022692617028951645, -0.04946192353963852, 0.01357103418558836, 0.049988046288490295, -0.12164416909217834, 0.02833021990954876, -0.011136780492961407, -0.007522982079535723, -0.07925067842006683, 0.12355294078588486, 0.020596714690327644, 0.08243643492460251, 0.1429060697555542, -0.07243248075246811, 0.09143313765525818, -0.14622798562049866, -0.029042404145002365, 0.034744516015052795, 0.017550595104694366, -0.05523709952831268, -0.047961387783288956, 0.047303538769483566, -0.07800322026014328, 0.08256640285253525, 0.08214650303125381, 0.04592178016901016, 0.047862786799669266, -0.0698925331234932, 0.026052076369524002, 0.0519399531185627, 0.06274774670600891, -0.05696279928088188, 0.011146584525704384, -0.0023460742086172104, 0.019793257117271423, -0.036490559577941895, 0.02658494934439659, 0.13998176157474518, 0.17523285746574402, 0.05583455041050911, 0.0596482940018177, 0.008244996890425682, -0.057675231248140335, -0.09775315970182419, 0.14237311482429504, 0.031234851107001305, 0.04781825840473175, -0.007166758179664612, 0.08197228610515594, 0.11366378515958786, -0.19177323579788208, 0.0811859592795372, -0.012114416807889938, -0.10813000798225403, -0.09526421874761581, -0.18627019226551056, -0.055900998413562775, -0.036962900310754776, 0.0025261600967496634, -0.12100925296545029, 0.026489585638046265, 0.0606195330619812, 0.06644920259714127, -0.02867167256772518, 0.14224238693714142, 
-0.039979882538318634, -0.0606897696852684, 0.08948829025030136, -0.00874610897153616, 0.07097621262073517, -0.06095724180340767, 0.010736028663814068, 0.054046887904405594, -0.0054718004539608955, 0.028671732172369957, 0.005312175489962101, 0.027615180239081383, -0.014821527525782585, 0.007951038889586926, -0.07195686548948288, 0.02582094445824623, 0.0300156120210886, 0.11273134499788284, 0.07302536070346832, 0.057573720812797546, -0.04391095042228699, -0.02636071853339672, 0.21859672665596008, -0.03775012865662575, -0.09249068796634674, -0.16673173010349274, 0.17105798423290253, 0.03424089401960373, 0.04219719395041466, 0.03202603757381439, -0.11295861005783081, 0.04820026829838753, 0.16106562316417694, 0.12853273749351501, -0.004628783091902733, -0.014308961108326912, -0.00577596016228199, -0.0038637302350252867, 0.029675472527742386, 0.07816943526268005, 0.02125662751495838, 0.12256079912185669, -0.09659507125616074, 0.07228495925664902, -0.08010867238044739, -0.038253165781497955, 0.010834441520273685, 0.14074218273162842, 0.03748909384012222, -0.019123926758766174, -0.06142682209610939, 0.13401184976100922, -0.03543322905898094, -0.19858793914318085, 0.06934112310409546, -0.05651768669486046, -0.16283093392848969, -0.02378537505865097, 0.027828825637698174, 0.0028704730793833733, 0.04064984619617462, 0.003599179908633232, -0.006112989038228989, 0.06228111311793327, 0.04389337822794914, -0.05848638713359833, -0.10571853071451187, 0.053944267332553864, -0.0040619694627821445, 0.13813698291778564, 0.022308487445116043, 0.09305333346128464, 0.07927040755748749, -0.005318327806890011, -0.09013834595680237, 0.08445509523153305, 0.055314719676971436, 0.005657196510583162, 0.05456741154193878, 0.1622798591852188, -0.0178823359310627, 0.1090819239616394, 0.041631922125816345, -0.14467903971672058, 0.012492375448346138, -0.10251199454069138, -0.062027961015701294, -0.12554329633712769, 0.07354102283716202, -0.06849849969148636, 0.15701325237751007, 0.23667976260185242, -0.034797798842191696, -0.004247426986694336, -0.08462356775999069, 0.06236612796783447, -0.02983011119067669, 0.11518537998199463, 0.036685530096292496, -0.1674284189939499, 0.003525912296026945, -0.05436194688081741, 0.03147082030773163, -0.19612672924995422, -0.034775376319885254, 0.011203613132238388, -0.06631318479776382, -0.015260248444974422, 0.11828041076660156, 0.04704953357577324, 0.04813261702656746, -0.035783860832452774, -0.0012497386196628213, 0.014796644449234009, 0.13784515857696533, -0.12103363126516342, -0.08799201250076294 ]
null
null
transformers
# legal_t5_small_multitask_cs_de model

Model for translating legal text from Czech to German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No separate pretraining is involved in the case of the legal_t5_small_multitask_cs_de model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to German.

### How to use

Here is how to use this model to translate legal text from Czech to German in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_de"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_de", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

cs_text = "Postavení žen v ozbrojených konfliktech a jejich úloha při obnově zemí po ukončení konfliktu a v demokratickém procesu v těchto zemích"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_de model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model, trained with 88M lines of text from the parallel corpus (of all possible language pairs), provides the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_cs_de | 43.145 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
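The BLEU figure reported above could be checked with a corpus-level scorer along these lines. This is a sketch assuming the sacrebleu package, which is not part of the original card; the hypothesis and reference strings are illustrative:

```python
import sacrebleu  # assumption: installed via `pip install sacrebleu`

# Model outputs and gold translations (illustrative German sentences).
hypotheses = ["Die Stellung von Frauen in bewaffneten Konflikten."]
references = [["Die Rolle von Frauen in bewaffneten Konflikten."]]  # one inner list per reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(bleu.score)  # corpus-level BLEU, comparable in kind to the table above
```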
{"language": "Cszech Deustch", "tags": ["translation Cszech Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Postaven\u00ed \u017een v ozbrojen\u00fdch konfliktech a jejich \u00faloha p\u0159i obnov\u011b zem\u00ed po ukon\u010den\u00ed konfliktu a v demokratick\u00e9m procesu v t\u011bchto zem\u00edch"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_de model
=========================================

Model for translating legal text from Czech to German. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_de model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Czech to German.

### How to use

Here is how to use this model to translate legal text from Czech to German in PyTorch (the code block was lost in this stripped rendering; see the sketch following this card):

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_de model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model, trained with 88M lines of text from the parallel corpus (of all possible language pairs), provides the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
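For readers who prefer not to use the pipeline wrapper, the 'How to use' step above is roughly equivalent to driving the tokenizer and model directly. This sketch uses standard transformers generate() semantics; any generation setting beyond max_length would be an assumption:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_cs_de", do_lower_case=False)
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_de")

cs_text = "Postavení žen v ozbrojených konfliktech a jejich úloha při obnově zemí po ukončení konfliktu"
inputs = tokenizer(cs_text, return_tensors="pt")       # tokenize the Czech source sentence
output_ids = model.generate(**inputs, max_length=512)  # greedy decoding by default
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # German translation
```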
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 61, 202, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08151086419820786, 0.10203595459461212, -0.0038114723283797503, 0.09238925576210022, 0.06262967735528946, 0.024218827486038208, 0.0685139149427414, 0.11192533373832703, -0.03264607861638069, 0.08692367374897003, 0.051928527653217316, -0.02921624854207039, 0.07047312706708908, 0.045758266001939774, 0.035492923110723495, -0.23370470106601715, -0.005591028369963169, -0.005548038985580206, -0.056675128638744354, 0.09425792098045349, 0.10512112826108932, -0.05444002524018288, 0.04601646959781647, -0.049887578934431076, -0.07170860469341278, 0.016867469996213913, -0.06780599057674408, -0.045609649270772934, 0.09928447008132935, 0.08220690488815308, 0.10187589377164841, -0.010097071528434753, 0.07120233029127121, -0.15367203950881958, -0.0010889834957197309, 0.07139135897159576, -0.006733065936714411, 0.054572634398937225, 0.09234505891799927, 0.02164469100534916, 0.194769948720932, -0.07960256934165955, 0.03676276281476021, 0.036957889795303345, -0.12138860672712326, -0.12340684980154037, -0.06418897211551666, 0.04198339954018593, 0.10802643746137619, 0.12044482678174973, -0.04059956595301628, 0.03737330809235573, -0.033521682024002075, 0.07507770508527756, 0.07818509638309479, -0.2331829071044922, -0.03919971361756325, 0.07173218578100204, 0.04944717511534691, 0.08501119911670685, -0.04891408607363701, -0.02087235078215599, 0.05191900581121445, 0.0661715567111969, 0.06225241720676422, -0.051940083503723145, 0.002696864539757371, -0.01723564974963665, -0.11804241687059402, -0.05836934223771095, 0.12930451333522797, 0.0351036936044693, -0.03861634060740471, -0.10140667110681534, -0.0750264972448349, -0.0907297432422638, -0.00025996274780482054, -0.030233440920710564, 0.021135978400707245, -0.0014819903299212456, 0.04360219091176987, -0.03448944166302681, -0.11136073619127274, -0.06911999732255936, -0.07336430996656418, 0.02147548645734787, 0.03529103472828865, 0.011248641647398472, 0.0049070254899561405, 0.08448799699544907, -0.1387850046157837, -0.09089134633541107, 0.0014787917025387287, 0.012308353558182716, -0.10717091709375381, 0.014580964110791683, -0.009715273976325989, -0.16647696495056152, -0.007835221476852894, 0.020114203914999962, -0.05988897755742073, 0.03160342574119568, 0.04534906521439552, 0.03594941273331642, 0.035588547587394714, 0.14127016067504883, -0.09159655123949051, -0.1263740211725235, -0.021421751007437706, 0.015058128163218498, 0.007464870344847441, 0.013234236277639866, -0.07046233862638474, -0.048559822142124176, 0.020777389407157898, 0.047347668558359146, 0.0006774246576242149, 0.00046681068488396704, -0.01258446741849184, -0.016851624473929405, 0.1086563915014267, -0.0986505001783371, 0.023934032768011093, 0.0066089872270822525, -0.07645106315612793, 0.011946442537009716, 0.05304798111319542, -0.03756086155772209, -0.09650016576051712, 0.07468093186616898, -0.038270723074674606, -0.02776406519114971, -0.11062987148761749, -0.17244459688663483, 0.012100000865757465, -0.01948084495961666, -0.04854877293109894, -0.10151523351669312, -0.13717731833457947, -0.07956947386264801, 0.023725271224975586, -0.0699247494339943, -0.012728553265333176, -0.0754755288362503, 0.0022886190563440323, 0.02520141378045082, -0.00937211886048317, 0.06688529253005981, -0.02930130437016487, 0.046283017843961716, 0.03189539164304733, 0.08018666505813599, -0.004470350220799446, 0.033079516142606735, -0.08927799761295319, 0.057059504091739655, -0.10338137298822403, 0.14248381555080414, -0.0060288747772574425, -0.007192052435129881, -0.12719815969467163, -0.06174785643815994, 
-0.07699094712734222, 0.04433800280094147, 0.08463700115680695, 0.13353204727172852, -0.21795302629470825, -0.019342374056577682, 0.2324320524930954, -0.07594810426235199, -0.05984498932957649, 0.13124604523181915, -0.026676416397094727, 0.030271543189883232, 0.08704401552677155, 0.10349883139133453, 0.06873922795057297, -0.02825932763516903, -0.052325181663036346, 0.0025151309091597795, 0.02090841345489025, 0.03770557418465614, 0.08955015242099762, -0.05650317296385765, 0.08950307965278625, -0.005043859127908945, 0.05025576427578926, 0.020057437941432, -0.03167586401104927, -0.03685761243104935, -0.007583999074995518, -0.03939766809344292, -0.012024031020700932, 0.013744926080107689, 0.017001476138830185, -0.0670277401804924, -0.08440647274255753, 0.008251214399933815, 0.10571040213108063, -0.07127272337675095, 0.018513988703489304, 0.003357896115630865, -0.03852155804634094, -0.09765712916851044, 0.01891331560909748, -0.14828890562057495, -0.018975619226694107, 0.031910490244627, -0.06460762023925781, 0.0972789078950882, 0.06756076216697693, 0.06167382374405861, 0.08338732272386551, -0.05835888907313347, -0.017444103956222534, -0.023246651515364647, -0.02815057896077633, -0.1147502139210701, -0.0894969254732132, -0.03842953220009804, -0.018754547461867332, 0.03753121942281723, -0.1766711324453354, 0.02367660589516163, -0.04234330728650093, 0.07976210862398148, 0.01295255497097969, -0.014526532031595707, 0.020350107923150063, 0.05654045566916466, -0.02868487499654293, -0.05150742828845978, 0.01989968679845333, -0.01055861171334982, -0.04825552925467491, 0.07963280379772186, -0.13371685147285461, -0.10391522943973541, 0.09163441509008408, 0.06353552639484406, -0.11080080270767212, -0.013923351652920246, -0.008073253557085991, -0.05967898666858673, -0.05146053433418274, -0.08870349079370499, 0.19463075697422028, 0.042342040687799454, 0.14370416104793549, -0.09901496767997742, -0.0379268042743206, 0.011082759127020836, -0.012665007263422012, -0.0018706998089328408, 0.14683249592781067, 0.06320428103208542, -0.15652461349964142, 0.0939229354262352, 0.04898982122540474, -0.04186830297112465, 0.11886546015739441, 0.03339501470327377, -0.11311880499124527, -0.017167482525110245, 0.03067958354949951, 0.005467064678668976, 0.06408455222845078, -0.08378387242555618, -0.011266648769378662, 0.030124623328447342, 0.07022498548030853, 0.053938791155815125, -0.09761600196361542, 0.06393991410732269, 0.06990980356931686, -0.04238886013627052, 0.021879004314541817, -0.029478665441274643, -0.029347624629735947, 0.10372365266084671, 0.018734421581029892, -0.037276580929756165, -0.030639614909887314, -0.03242635726928711, -0.10472731292247772, 0.19077400863170624, -0.0709836483001709, -0.19563083350658417, -0.13858073949813843, 0.061399683356285095, -0.016791781410574913, 0.026708271354436874, 0.036472614854574203, -0.03685266897082329, -0.03890697658061981, -0.09933312982320786, 0.0958651453256607, -0.0811927393078804, -0.035834040492773056, -0.09466198831796646, 0.032223738729953766, -0.00649630930274725, -0.1396726667881012, 0.01724880002439022, 0.010615486651659012, -0.02900109998881817, 0.012703156098723412, -0.047271110117435455, 0.09336746484041214, 0.15619415044784546, -0.03241009637713432, -0.03627771884202957, -0.006937364581972361, 0.13325358927249908, -0.0827711820602417, 0.07379312813282013, 0.07323100417852402, 0.047977302223443985, 0.03500506654381752, 0.12407294660806656, 0.030734671279788017, -0.051074087619781494, 0.02907862327992916, 0.04874357581138611, -0.03454343602061272, 
-0.2511996626853943, -0.06668514758348465, -0.07733741402626038, 0.03669614717364311, 0.09625052660703659, 0.02963840588927269, -0.03244432806968689, 0.02766982652246952, -0.03909480944275856, 0.06474240124225616, 0.0001726241025608033, 0.059635091572999954, 0.060557201504707336, -0.008237768895924091, 0.08465174585580826, -0.06848759949207306, -0.03603733703494072, 0.09925995767116547, 0.03182392567396164, 0.19473639130592346, -0.04717377573251724, 0.20543886721134186, 0.029905028641223907, 0.027174044400453568, 0.018331443890929222, 0.09405415505170822, -0.04413343220949173, 0.028369182720780373, -0.03341473266482353, -0.05624505877494812, -0.012418683618307114, 0.057929232716560364, 0.018949976190924644, -0.0010643430287018418, -0.06356219947338104, -0.0422067753970623, 0.04919438809156418, 0.21631395816802979, 0.10833548754453659, -0.21328000724315643, -0.07060252875089645, 0.008656708523631096, -0.0685100182890892, -0.08211275190114975, 0.009174675680696964, 0.15521155297756195, -0.10088706761598587, -0.0051343850791454315, 0.009516085498034954, 0.1256970465183258, -0.11427756398916245, -0.019353194162249565, 0.010995088145136833, 0.03892261162400246, -0.02297382988035679, 0.1105668768286705, -0.25601571798324585, 0.1566876322031021, 0.013528781943023205, 0.05156988650560379, -0.03881310671567917, 0.021321041509509087, -0.03300365433096886, -0.0109397042542696, 0.07940433919429779, 0.022369815036654472, -0.05601220205426216, -0.10315034538507462, -0.10709154605865479, -0.008571363985538483, 0.05434378236532211, -0.05370862036943436, 0.0986577570438385, 0.04969925060868263, 0.0026576330419629812, -0.003844884689897299, 0.031067991629242897, -0.021944040432572365, -0.16151976585388184, -0.011070962063968182, -7.743860805931035e-7, -0.028602218255400658, -0.0199955552816391, -0.03252968564629555, -0.009176049381494522, 0.1809307038784027, -0.11092545837163925, -0.08677887916564941, -0.07664267718791962, 0.038421906530857086, 0.130964457988739, -0.0807512104511261, 0.02450014278292656, 0.007214247714728117, 0.047394417226314545, -0.03150760382413864, -0.0366995744407177, 0.0856848806142807, -0.05862567573785782, -0.0893743485212326, -0.06377995014190674, 0.1438077986240387, 0.03750830516219139, 0.057039059698581696, -0.012398861348628998, 0.030808517709374428, -0.0036706835962831974, -0.09588420391082764, -0.014136198908090591, 0.03350694477558136, 0.11536885052919388, 0.0565154068171978, -0.052459169179201126, -0.03684511408209801, -0.05654123052954674, -0.059477273374795914, 0.12922902405261993, 0.16609647870063782, -0.05670678988099098, 0.0159542765468359, 0.14759984612464905, -0.09130172431468964, -0.1543785035610199, -0.0364605113863945, 0.07587765902280807, 0.09213678538799286, -0.029462462291121483, -0.17736370861530304, 0.014712715521454811, 0.11791033297777176, 0.0049217138439416885, 0.07786647230386734, -0.36330646276474, -0.12957428395748138, 0.0341917984187603, 0.0163935087621212, -0.010876210406422615, -0.13700048625469208, -0.0512845516204834, -0.05194990709424019, -0.08782880008220673, 0.10952088236808777, -0.05982301011681557, 0.09732458740472794, -0.008849876001477242, 0.04118083417415619, 0.04935523122549057, -0.03773989528417587, 0.1363295614719391, 0.017790548503398895, 0.03827035427093506, -0.06130307540297508, 0.0287273358553648, 0.03258255496621132, -0.03115282952785492, 0.15402239561080933, -0.07954155653715134, 0.05136910825967789, -0.16489796340465546, -0.0566520132124424, -0.05579311400651932, 0.004478350281715393, -0.04642597958445549, -0.07122816145420074, 
-0.06227705255150795, 0.023330090567469597, 0.029071545228362083, -0.01172537263482809, -0.001020363182760775, -0.04172634705901146, 0.012008083052933216, 0.16155637800693512, 0.08041360229253769, 0.05262952670454979, -0.11541543900966644, 0.005581161472946405, 0.00652889721095562, 0.0692116990685463, -0.14860866963863373, 0.0066885026171803474, 0.137905552983284, 0.02151203155517578, 0.12469320744276047, -0.003544633509591222, -0.12728089094161987, 0.015945978462696075, 0.060569893568754196, -0.06677540391683578, -0.12036178261041641, -0.003880684496834874, 0.002428903477266431, -0.08894486725330353, -0.019086096435785294, 0.11481280624866486, -0.0818299651145935, -0.021089889109134674, 0.0026837754994630814, 0.0327349454164505, -0.058510489761829376, 0.21429415047168732, 0.03114321455359459, 0.05889740213751793, -0.04958092421293259, 0.08456321060657501, 0.1346873641014099, -0.13718634843826294, 0.026972969993948936, 0.20872583985328674, -0.06478112190961838, -0.05094485357403755, 0.021052438765764236, 0.11206334084272385, -0.05689391866326332, -0.04759278893470764, -0.018942121416330338, -0.05664737522602081, 0.029548900201916695, 0.006328115239739418, 0.03423800691962242, 0.04855697229504585, -0.012039827182888985, -0.02224918082356453, -0.10473260283470154, 0.08518952876329422, 0.07083506882190704, 0.03749193251132965, -0.031592488288879395, 0.12090786546468735, 0.00240802438929677, 0.006489320658147335, -0.005236743483692408, 0.010462217964231968, -0.06806343793869019, -0.00016429087554570287, -0.1033199205994606, 0.005489246919751167, -0.041523393243551254, -0.01796644926071167, -0.028346514329314232, 0.007368326652795076, -0.0041882009245455265, -0.005129522643983364, -0.03605244681239128, -0.0556446872651577, -0.06628042459487915, 0.03316386416554451, -0.09249169379472733, -0.03437318652868271, 0.005611632950603962, -0.04559723287820816, 0.07197690010070801, 0.01859080232679844, 0.008262034505605698, 0.010159383527934551, -0.0006879348075017333, 0.06354612112045288, 0.007635212503373623, 0.044656384736299515, 0.008039167150855064, -0.064706951379776, 0.023177476599812508, 0.010300417430698872, 0.00946264062076807, -0.0030872547067701817, -0.0007020687335170805, -0.1316918581724167, -0.007978241890668869, -0.04999619722366333, -0.039904531091451645, -0.07134145498275757, 0.07830137014389038, 0.03574172034859657, 0.07100078463554382, 0.10001828521490097, -0.06849220395088196, 0.08735182881355286, -0.1878116875886917, -0.01005349401384592, 0.01411850843578577, -0.006388674955815077, 0.013466965407133102, -0.024697043001651764, 0.06028372794389725, -0.059391558170318604, 0.1128329485654831, 0.02317955531179905, 0.04206492379307747, 0.01950564607977867, -0.08664226531982422, 0.020529016852378845, 0.02315625548362732, 0.12198970466852188, -0.013458885252475739, -0.04132922366261482, -0.0835205465555191, 0.074540875852108, 0.008338304236531258, 0.11223626881837845, 0.07900125533342361, 0.14041019976139069, 0.07969103008508682, 0.05343947559595108, 0.014013244770467281, -0.08549445867538452, -0.07901397347450256, 0.029669931158423424, -0.0009358960669487715, 0.05462585762143135, -0.02269531413912773, 0.13934749364852905, 0.13491879403591156, -0.16480396687984467, 0.10727627575397491, -0.01895458810031414, -0.08934425562620163, -0.04611508920788765, -0.10509973764419556, -0.05352948606014252, -0.03942878544330597, -0.04076576605439186, -0.10669912397861481, 0.014657193794846535, 0.05332249775528908, 0.04788476228713989, -0.03824117034673691, 0.11007405817508698, -0.0032528084702789783, 
-0.07880085706710815, 0.06762803345918655, 0.029818005859851837, 0.06897434592247009, 0.032917097210884094, 0.01791817508637905, 0.037226151674985886, -0.0018424117006361485, 0.05293003097176552, 0.04176286607980728, 0.0006396353128366172, 0.006159692537039518, 0.0014438346261158586, -0.05371854826807976, -0.04081195220351219, 0.025461925193667412, 0.06556840240955353, 0.17482808232307434, 0.057592831552028656, -0.07311908155679703, -0.02571762353181839, 0.1779874861240387, -0.06681181490421295, -0.10332486778497696, -0.11724262684583664, 0.20946577191352844, 0.020446037873625755, 0.024145223200321198, -0.0055323936976492405, -0.11384060233831406, -0.009493633173406124, 0.12239499390125275, 0.19213376939296722, -0.039070695638656616, -0.03467966616153717, 0.017014650627970695, -0.008428085595369339, 0.011303928680717945, 0.05902445688843727, 0.06043919175863266, 0.2778712213039398, -0.06684854626655579, 0.07900800555944443, -0.04001467674970627, 0.005423614289611578, -0.030219247564673424, 0.1583760380744934, -0.011978000402450562, 0.010220934636890888, -0.043177854269742966, 0.08014422655105591, 0.02691204845905304, -0.1605624556541443, -0.00025990084395743906, -0.11624637246131897, -0.11682770401239395, 0.010328233242034912, -0.040871184319257736, 0.03370063006877899, 0.07763013988733292, -0.0035757524892687798, 0.031146910041570663, 0.05631058290600777, 0.0028138654306530952, -0.10856441408395767, -0.0970432236790657, 0.019533727318048477, -0.02771228738129139, 0.09609340131282806, 0.008977586403489113, 0.12844376266002655, 0.10961925983428955, 0.017085041850805283, -0.08468733727931976, 0.089256651699543, 0.03308834135532379, 0.034988485276699066, 0.06774764508008957, 0.09893722087144852, -0.020451659336686134, 0.017457587644457817, 0.04588758945465088, -0.06967530399560928, 0.012212119996547699, -0.039955370128154755, -0.022028066217899323, -0.14446140825748444, 0.06866425275802612, -0.031105274334549904, 0.13880418241024017, 0.1946122944355011, -0.02824477292597294, -0.011827923357486725, -0.03632879629731178, 0.009302993305027485, 0.003026994876563549, 0.08892358839511871, -0.020563410595059395, -0.16113701462745667, 0.02165149338543415, -0.0463688038289547, 0.03930797055363655, -0.2715018093585968, -0.04005502536892891, 0.00605674646794796, -0.04504948854446411, -0.03982891887426376, 0.09669383615255356, 0.06009300798177719, 0.029615234583616257, -0.03960784897208214, -0.10205183178186417, -0.024792708456516266, 0.11610814183950424, -0.10972816497087479, -0.11251820623874664 ]
null
null
transformers
# legal_t5_small_multitask_cs_en model

Model for translating legal text from Czech to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No separate pretraining is involved in the case of the legal_t5_small_multitask_cs_en model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to English.

### How to use

Here is how to use this model to translate legal text from Czech to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_en"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_en", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

cs_text = "Komise musí vypracovat zprávu o hodnotících zprávách týkajících se uplatňování této směrnice v členských státech."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model, trained with 88M lines of text from the parallel corpus (of all possible language pairs), provides the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_cs_en | 37.136 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
{"language": "Cszech English", "tags": ["translation Cszech English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Komise mus\u00ed vypracovat zpr\u00e1vu o hodnot\u00edc\u00edch zpr\u00e1v\u00e1ch t\u00fdkaj\u00edc\u00edch se uplat\u0148ov\u00e1n\u00ed t\u00e9to sm\u011brnice v \u010dlensk\u00fdch st\u00e1tech."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_en model
=========================================

Model for translating legal text from Czech to English. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_en model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Czech to English.

### How to use

Here is how to use this model to translate legal text from Czech to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model, trained with 88M lines of text from the parallel corpus (of all possible language pairs), provides the vocabulary (with byte-pair encoding) used with this model (see the tokenizer sketch following this card).

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
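The Preprocessing note above says the vocabulary comes from a unigram model trained on 88M parallel lines. A quick way to inspect the resulting subword segmentation is sketched below, assuming the checkpoint ships its SentencePiece vocabulary; the Czech input is illustrative:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_cs_en", do_lower_case=False)

# Inspect how the shared vocabulary segments a Czech legal sentence into subword pieces.
print(tokenizer.tokenize("Komise musí vypracovat zprávu o hodnotících zprávách."))
print(tokenizer.vocab_size)  # size of the shared subword vocabulary
```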
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07372314482927322, 0.13635998964309692, -0.0033720044884830713, 0.07970546931028366, 0.07114332169294357, 0.012231037020683289, 0.01164090633392334, 0.11199681460857391, -0.04072355851531029, 0.07418445497751236, 0.05507197603583336, 0.013170589692890644, 0.050598401576280594, 0.032907772809267044, 0.045411184430122375, -0.19389572739601135, -0.010286039672791958, -0.024094589054584503, -0.03543533757328987, 0.10954438149929047, 0.0826958566904068, -0.05822596326470375, 0.0510343499481678, -0.03481645882129669, -0.08166296035051346, 0.01928112469613552, -0.08175580203533173, -0.04127795994281769, 0.10123990476131439, 0.07168159633874893, 0.07666222006082535, -0.017294535413384438, 0.062204569578170776, -0.1958889663219452, 0.0001521568192401901, 0.06154486909508705, -0.011134032160043716, 0.052523933351039886, 0.0923575833439827, -0.004203335847705603, 0.19396892189979553, -0.05735030025243759, 0.030610648915171623, 0.05633164942264557, -0.10835971683263779, -0.10323020070791245, -0.06925396621227264, 0.03780299052596092, 0.09564752876758575, 0.13685938715934753, -0.03204980492591858, 0.03414929658174515, -0.00857734028249979, 0.08046137541532516, 0.11364088207483292, -0.2577257454395294, -0.020446889102458954, 0.04501102864742279, 0.05185891315340996, 0.0852673128247261, -0.05090470612049103, 0.008144482038915157, 0.054326627403497696, 0.06515131890773773, 0.07136409729719162, -0.04520730674266815, 0.0053764814510941505, -0.006281137932091951, -0.12679505348205566, -0.07011234015226364, 0.16251938045024872, 0.03266338258981705, -0.02625858597457409, -0.10175318270921707, -0.06112920492887497, -0.07996819168329239, 0.0017694573616608977, -0.018761111423373222, 0.005778503138571978, -0.004024052992463112, 0.029760140925645828, -0.06850621849298477, -0.11688971519470215, -0.06962879002094269, -0.07164797931909561, 0.06767130643129349, 0.03076246567070484, 0.020130444318056107, 0.03255832940340042, 0.07609090954065323, -0.11160404980182648, -0.07482534646987915, 0.007874363102018833, 0.017554063349962234, -0.09153331071138382, 0.018405985087156296, -0.0033480357378721237, -0.17669349908828735, -0.007208041846752167, -0.02912311442196369, -0.07004888355731964, 0.025300724431872368, 0.045292943716049194, 0.0353773832321167, 0.051107488572597504, 0.11968279629945755, -0.09999926388263702, -0.10701040178537369, -0.025465410202741623, -0.00703760702162981, -0.007210114039480686, 0.013033227995038033, -0.06469787657260895, -0.02690846286714077, 0.01297422032803297, 0.05993129312992096, -0.0003118508611805737, 0.009932396933436394, -0.005915473215281963, -0.028742127120494843, 0.1252897083759308, -0.09608771651983261, 0.008851288817822933, 0.007340768817812204, -0.0891648381948471, -0.02820003777742386, 0.06579534709453583, -0.03303888440132141, -0.10067671537399292, 0.048609402030706406, -0.04652508720755577, -0.006212734151631594, -0.09881415218114853, -0.17565348744392395, 0.004716519732028246, -0.005100400652736425, -0.060875944793224335, -0.09442703425884247, -0.134812593460083, -0.07093704491853714, 0.017127549275755882, -0.04941388964653015, 0.007716254331171513, -0.08982788026332855, 0.0026944405399262905, 0.03200715035200119, -0.012699775397777557, 0.0671313926577568, -0.04405678063631058, 0.04812345281243324, 0.007298625539988279, 0.05799786001443863, 0.007866115309298038, 0.03262288495898247, -0.07626122236251831, 0.04274532571434975, -0.06918293237686157, 0.14188998937606812, -0.00789804756641388, 0.011758364737033844, -0.13791050016880035, -0.04987581819295883, 
-0.07913430035114288, 0.04147206246852875, 0.07263442128896713, 0.12965470552444458, -0.19826248288154602, -0.03457179665565491, 0.20661719143390656, -0.0622953325510025, -0.07207401096820831, 0.11168772727251053, -0.028233474120497704, 0.03864167630672455, 0.08050528168678284, 0.08510027080774307, 0.02680642157793045, -0.04294075816869736, -0.06368452310562134, -0.010813347063958645, 0.014602718874812126, 0.015638019889593124, 0.09592138230800629, -0.07123798877000809, 0.08765581250190735, 0.008148170076310635, 0.0561162531375885, 0.020489996299147606, -0.04532438516616821, -0.036644428968429565, -0.003795563941821456, -0.04397047683596611, -0.03031170926988125, 0.008953527547419071, 0.020374149084091187, -0.06396526843309402, -0.07982753962278366, 0.03669853135943413, 0.1023862361907959, -0.06573736667633057, 0.03246530145406723, 0.010588893666863441, -0.0508599579334259, -0.12108114361763, 0.019412048161029816, -0.16180521249771118, 0.013029496185481548, 0.026951633393764496, -0.021618224680423737, 0.10429350286722183, 0.0240909643471241, 0.053979918360710144, 0.08430871367454529, -0.05119013413786888, -0.0204598531126976, -0.012270300649106503, -0.018235718831419945, -0.11376448720693588, -0.09744111448526382, -0.037807218730449677, -0.01207532174885273, 0.022228041663765907, -0.15702266991138458, 0.016787678003311157, -0.03374621644616127, 0.08335889130830765, 0.010043254122138023, -0.03606957197189331, 0.03329553082585335, 0.05588372424244881, -0.029771186411380768, -0.029518265277147293, 0.03292137011885643, -0.0071648843586444855, -0.04831956326961517, 0.09948436915874481, -0.12790870666503906, -0.10277686268091202, 0.10628154873847961, 0.025315364822745323, -0.08646500110626221, -0.007186048664152622, -0.009290320798754692, -0.05104639753699303, -0.04979102313518524, -0.07418197393417358, 0.1986580342054367, 0.05340835452079773, 0.15950077772140503, -0.10476600378751755, -0.05869592726230621, 0.01498955488204956, -0.02006208896636963, -0.012381594628095627, 0.15707869827747345, 0.0344046950340271, -0.1764078438282013, 0.08563872426748276, 0.03350033983588219, -0.013634715229272842, 0.129886195063591, 0.05749715119600296, -0.1077590137720108, -0.01968608796596527, 0.03058168664574623, -0.0018516839481890202, 0.043016623705625534, -0.089578777551651, -0.03303654491901398, 0.02976127155125141, 0.06994079798460007, 0.06798550486564636, -0.10027766972780228, 0.06893245875835419, 0.07243350148200989, -0.0365108884871006, 0.06445300579071045, -0.03949742019176483, -0.04308022931218147, 0.1039455235004425, 0.012702389620244503, -0.011062667705118656, -0.044355664402246475, -0.038559138774871826, -0.11239439994096756, 0.18687181174755096, -0.09771896153688431, -0.2290440946817398, -0.1411478966474533, 0.010472182184457779, -0.03767798840999603, 0.013730224221944809, 0.03657858446240425, -0.0493549145758152, -0.041328445076942444, -0.10945510864257812, 0.08782445639371872, -0.1099245697259903, -0.04673982039093971, -0.08314112573862076, 0.04745980352163315, -0.0165473110973835, -0.14341282844543457, 0.02215397171676159, 0.003786322893574834, -0.04477250948548317, 0.0023181189317256212, -0.04628463089466095, 0.10985331982374191, 0.1562909036874771, -0.024028576910495758, -0.022156905382871628, 0.0002533863880671561, 0.12375092506408691, -0.05941520258784294, 0.04271915927529335, 0.05238393321633339, 0.05238968878984451, 0.017527639865875244, 0.10865626484155655, 0.04192113131284714, -0.049118030816316605, 0.042844921350479126, 0.05196208879351616, -0.02688107267022133, -0.2652376890182495, 
-0.09713338315486908, -0.06272847950458527, -0.014904373325407505, 0.09302902221679688, 0.05290648713707924, -0.03421946242451668, 0.0050707021728158, -0.041403818875551224, 0.023971790447831154, 0.003686384065076709, 0.06362102180719376, 0.05042530596256256, -0.0199546217918396, 0.078936368227005, -0.06171801686286926, -0.02288028784096241, 0.08813638240098953, 0.03809434920549393, 0.1849890649318695, -0.03165587782859802, 0.2455194741487503, 0.05319361761212349, 0.03796463459730148, 0.011339414864778519, 0.07811208069324493, -0.04671014845371246, 0.029021743685007095, -0.023801418021321297, -0.061296459287405014, -0.009352980181574821, 0.06633637100458145, 0.015798643231391907, 0.023129427805542946, -0.05874679982662201, -0.06167744845151901, 0.07178857177495956, 0.21934159100055695, 0.0742735043168068, -0.17831118404865265, -0.06253372132778168, 0.0022203088738024235, -0.08598149567842484, -0.0707968920469284, 0.012641765177249908, 0.14722475409507751, -0.0863620936870575, -0.014076747000217438, 0.025010449811816216, 0.1348849982023239, -0.12101767957210541, -0.027375100180506706, 0.002520577749237418, 0.01241385843604803, -0.02656661532819271, 0.1127910166978836, -0.24257396161556244, 0.18037138879299164, 0.03098103776574135, 0.05761682614684105, -0.03997258469462395, 0.007683538366109133, -0.040528129786252975, -0.02357671596109867, 0.10186026990413666, 0.021933944895863533, -0.03677866607904434, -0.11041731387376785, -0.10918157547712326, -0.021354807540774345, 0.06436768174171448, -0.0601564459502697, 0.10175182670354843, 0.061298273503780365, -0.006022674962878227, -0.011989363469183445, 0.06607881933450699, -0.02169405110180378, -0.16353827714920044, -0.012862155213952065, -0.0034285627771168947, -0.03962899371981621, -0.012862453237175941, -0.04436161741614342, -0.036036666482686996, 0.22119909524917603, -0.12118247896432877, -0.07964633405208588, -0.08034072816371918, 0.02159617282450199, 0.12395676970481873, -0.07630539685487747, 0.03166938200592995, 0.010322213172912598, 0.04036731645464897, -0.04854121804237366, -0.03835625573992729, 0.08121577650308609, -0.05096116289496422, -0.074180006980896, -0.05493585392832756, 0.1647912859916687, 0.04555134475231171, 0.05907076224684715, -0.030051903799176216, 0.04960812255740166, -0.0023208328057080507, -0.09607940167188644, -0.007365670520812273, 0.05633963271975517, 0.13500402867794037, 0.06056941673159599, -0.0654854029417038, -0.06685331463813782, -0.07343214750289917, -0.08113644272089005, 0.16606469452381134, 0.17383641004562378, -0.05094573274254799, 0.03183020278811455, 0.1690116822719574, -0.11719860881567001, -0.15629230439662933, -0.050987228751182556, 0.06311076134443283, 0.0755012184381485, -0.03700528293848038, -0.17535927891731262, 0.020149612799286842, 0.11946693062782288, -0.0009240441722795367, 0.05730640888214111, -0.368686705827713, -0.14379450678825378, 0.029823062941432, 0.03932711482048035, -0.005912662483751774, -0.13271835446357727, -0.056385014206171036, -0.05994204431772232, -0.10977265983819962, 0.11771313101053238, -0.03755246847867966, 0.105274997651577, -0.008981654420495033, 0.028858676552772522, 0.03856365382671356, -0.041220709681510925, 0.11783427745103836, 0.0127302510663867, 0.030708955600857735, -0.04733665660023689, 0.028859570622444153, 0.012780836783349514, -0.025286750867962837, 0.15822625160217285, -0.06004159525036812, 0.05820569396018982, -0.16498149931430817, -0.05746941268444061, -0.06743160635232925, 0.0096824262291193, -0.03655257076025009, -0.06753925234079361, -0.052878811955451965, 
0.02186196856200695, 0.03631303831934929, -0.01210394874215126, 0.02727329358458519, -0.06235920265316963, 0.05304418504238129, 0.17887729406356812, 0.07028854638338089, 0.01168421097099781, -0.0992044061422348, 0.001773372758179903, -0.007778073661029339, 0.06176908686757088, -0.15658950805664062, -0.0027176374569535255, 0.14136749505996704, 0.0462348572909832, 0.12697815895080566, -0.013856730423867702, -0.13066473603248596, 0.010924038477241993, 0.05777611956000328, -0.09065156430006027, -0.11085213720798492, -0.014101804234087467, 0.047648970037698746, -0.0676887184381485, -0.015355782583355904, 0.11521556973457336, -0.077609583735466, -0.03134675696492195, -0.006086042616516352, 0.025566356256604195, -0.06214499846100807, 0.217389315366745, 0.043507084250450134, 0.05655482038855553, -0.05697592347860336, 0.10583125799894333, 0.12940393388271332, -0.10993169248104095, 0.03281281515955925, 0.20496419072151184, -0.06946716457605362, -0.05196619778871536, 0.010915241204202175, 0.10617715865373611, -0.04070884734392166, -0.0448802150785923, -0.012362536042928696, -0.02583364211022854, 0.026512933894991875, 0.03243580460548401, 0.0390142947435379, 0.04653028026223183, -0.01463715173304081, -0.037431828677654266, -0.08744523674249649, 0.09005023539066315, 0.05037570744752884, 0.004031355492770672, -0.020895756781101227, 0.10674864798784256, 0.01566513068974018, 0.010555469430983067, -0.012671031057834625, -0.006995719391852617, -0.050928983837366104, 0.020261354744434357, -0.07351813465356827, 0.00012169175170129165, -0.06692077219486237, -0.023893197998404503, -0.03369536250829697, 0.006156729534268379, -0.016550421714782715, 0.000562189903575927, -0.043672557920217514, -0.05310601741075516, -0.05672841519117355, 0.03225882351398468, -0.09300220757722855, -0.040175847709178925, 0.004620198626071215, -0.02980807051062584, 0.06164996698498726, 0.03617057949304581, 0.0020260056480765343, 0.03479450196027756, -0.009015766903758049, 0.0496903732419014, 0.021612711250782013, 0.04624319076538086, 0.01007807906717062, -0.05879215896129608, 0.016416184604167938, 0.025764234364032745, 0.001134976977482438, -0.0022486960515379906, 0.005065248813480139, -0.1328645795583725, -0.0578751303255558, -0.05710408464074135, -0.020979508757591248, -0.06775989383459091, 0.08006792515516281, 0.07010576128959656, 0.06625540554523468, 0.08837633579969406, -0.061231475323438644, 0.07319673895835876, -0.17690692842006683, -0.0026321655604988337, 0.006896111182868481, -0.02675994485616684, -0.024602925404906273, -0.002333342330530286, 0.048629455268383026, -0.06832126528024673, 0.13948892056941986, -0.0034244440030306578, 0.04299914091825485, 0.026340829208493233, -0.06187622994184494, 0.002700118813663721, 0.01387032214552164, 0.1269354671239853, -0.007157322484999895, -0.031615130603313446, -0.051512520760297775, 0.0834282785654068, -0.004810909740626812, 0.11465774476528168, 0.0513681061565876, 0.12377531081438065, 0.13793934881687164, 0.0461006835103035, 0.01114560104906559, -0.08357805013656616, -0.0478118360042572, 0.0443410649895668, -0.010082568041980267, 0.05938413366675377, -0.026335274800658226, 0.11432085186243057, 0.16469986736774445, -0.16358305513858795, 0.11339981108903885, 0.005660343449562788, -0.07795771211385727, -0.05169636756181717, -0.10644938051700592, -0.056410644203424454, -0.0379563644528389, -0.03381197154521942, -0.11586268246173859, 0.005537005141377449, 0.04517277702689171, 0.05455659702420235, -0.023356597870588303, 0.09701526165008545, 0.0242591742426157, -0.10116517543792725, 
0.064789779484272, 0.0025636046193540096, 0.07243608683347702, -0.01680062711238861, 0.04166562110185623, 0.06113841012120247, -0.018052298575639725, 0.05338361859321594, 0.058434538543224335, -0.02012772299349308, 0.016217749565839767, 0.00000824589551484678, -0.06907939910888672, -0.04157811775803566, 0.029453840106725693, 0.07506899535655975, 0.19320723414421082, 0.06165098771452904, -0.08589616417884827, -0.02189014106988907, 0.18197505176067352, -0.05288887023925781, -0.0851680338382721, -0.0940038189291954, 0.21096213161945343, 0.039535801857709885, 0.021384689956903458, -0.0005093515501357615, -0.0975821241736412, -0.008776284754276276, 0.12433003634214401, 0.19496490061283112, -0.0347876101732254, -0.029836531728506088, 0.022913111373782158, -0.009448069147765636, 0.007852704264223576, 0.02267845906317234, 0.057755157351493835, 0.27783310413360596, -0.07811107486486435, 0.05547665059566498, -0.0612218864262104, 0.0033505475148558617, 0.0007079325150698423, 0.1351744383573532, -0.008901195600628853, 0.007235507946461439, -0.03954558074474335, 0.08006417751312256, -0.007538963109254837, -0.1612417846918106, -0.016183331608772278, -0.1085568517446518, -0.10747703909873962, 0.013774373568594456, -0.035693295300006866, 0.04274989664554596, 0.0663820430636406, 0.00906338356435299, 0.03824280574917793, 0.02732797898352146, 0.012297598645091057, -0.12270037829875946, -0.12146849185228348, 0.023545781150460243, -0.024920133873820305, 0.08630521595478058, 0.002131934044882655, 0.11766441911458969, 0.09608234465122223, 0.02098325453698635, -0.08295319974422455, 0.10419850796461105, 0.02717391401529312, 0.021542252972722054, 0.07262653857469559, 0.11131830513477325, -0.001560563687235117, 0.04722141474485397, 0.04138704761862755, -0.04962097853422165, 0.037226736545562744, -0.05564941093325615, -0.01583218015730381, -0.13817469775676727, 0.07193588465452194, -0.02344258315861225, 0.14166851341724396, 0.16600441932678223, -0.021933192387223244, -0.0060154469683766365, -0.045764513313770294, 0.008092946372926235, -0.007384204771369696, 0.08395024389028549, -0.020250294357538223, -0.20096561312675476, 0.029898129403591156, -0.05394526943564415, 0.03034818172454834, -0.26559239625930786, -0.04225846379995346, 0.028316637501120567, -0.05318954214453697, -0.03211537003517151, 0.086424820125103, 0.078302763402462, 0.035560742020606995, -0.035258699208498, -0.07946007698774338, -0.0013971555745229125, 0.10846589505672455, -0.12687528133392334, -0.11403930187225342 ]
null
null
transformers
# legal_t5_small_multitask_cs_es model

Model for translating legal text from Czech to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model is trained on masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_cs_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to Spanish.

### How to use

Here is how to use this model to translate legal text from Czech to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_es"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_es", do_lower_case=False, skip_special_tokens=True),
    device=0
)

cs_text = "Antonio Tajani (místopředseda Komise) ."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model (a sketch of building such a vocabulary follows this card).

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_cs_es | 48.559|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
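The preprocessing step above describes training a unigram vocabulary model on 88M lines of the parallel corpus. The sketch below shows how such a vocabulary could be built with the SentencePiece library; the input file name, model prefix, and vocabulary size are illustrative assumptions, not values taken from the card.

```python
import sentencepiece as spm

# corpus.txt is a hypothetical file with one sentence per line drawn from the
# parallel corpus of all language pairs; vocab_size=32_000 is an assumed value.
spm.SentencePieceTrainer.train(
    input="corpus.txt",
    model_prefix="legal_t5_vocab",
    model_type="unigram",
    vocab_size=32_000,
)

# Load the resulting model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Antonio Tajani (místopředseda Komise).", out_type=str))
```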
{"language": "Cszech Spanish", "tags": ["translation Cszech Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Antonio Tajani (m\u00edstop\u0159edseda Komise) ."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_es model
=========================================

Model for translating legal text from Czech to Spanish. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model is trained on masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Czech to Spanish.

### How to use

Here is how to use this model to translate legal text from Czech to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08698368817567825, 0.13893529772758484, -0.0037617075722664595, 0.08687391132116318, 0.07587919384241104, 0.006045408081263304, -0.0012170859845355153, 0.11112712323665619, -0.042562372982501984, 0.08379930257797241, 0.04565439745783806, 0.033406466245651245, 0.04758290573954582, 0.016317732632160187, 0.03192216530442238, -0.1913832128047943, -0.007716230116784573, -0.029027195647358894, -0.041574712842702866, 0.10211937129497528, 0.07456964999437332, -0.051192231476306915, 0.04651790112257004, -0.04172172397375107, -0.07964000850915909, 0.030634429305791855, -0.07151547074317932, -0.0623912513256073, 0.10499163717031479, 0.07526491582393646, 0.08304120600223541, -0.010139289312064648, 0.05871379375457764, -0.17453429102897644, -0.003445451380684972, 0.05537773668766022, -0.018423497676849365, 0.05439941957592964, 0.10806697607040405, -0.01358992513269186, 0.19546206295490265, -0.0665794163942337, 0.029033169150352478, 0.05529855564236641, -0.12895329296588898, -0.10856864601373672, -0.07569243758916855, 0.013946114107966423, 0.09316541254520416, 0.12972533702850342, -0.02646866999566555, 0.02787124365568161, -0.009642642922699451, 0.06299041956663132, 0.11346732079982758, -0.24819540977478027, -0.026869816705584526, 0.03881695121526718, 0.05410907790064812, 0.09199915826320648, -0.03361928462982178, 0.018595697358250618, 0.06313391774892807, 0.05544348433613777, 0.06205570697784424, -0.05517442524433136, -0.012717767618596554, -0.00806165486574173, -0.11987447738647461, -0.08200962096452713, 0.1468699425458908, 0.02248808555305004, -0.02055167593061924, -0.09815960377454758, -0.06728930026292801, -0.0686221495270729, 0.005793600808829069, -0.020230792462825775, 0.011908198706805706, -0.011637969873845577, 0.03989838808774948, -0.056283801794052124, -0.11065690964460373, -0.06836125254631042, -0.0730726569890976, 0.06771762669086456, 0.024812115356326103, 0.01037548016756773, 0.04532987251877785, 0.08135347813367844, -0.0971359983086586, -0.08171892166137695, 0.01768115535378456, 0.029934290796518326, -0.08670631051063538, 0.023958323523402214, -0.0054156845435500145, -0.1829213798046112, 0.00136688316706568, -0.04699685424566269, -0.0949651226401329, 0.018726831302046776, 0.04405217617750168, 0.033703647553920746, 0.04929036647081375, 0.11470556259155273, -0.0940699353814125, -0.10175395756959915, -0.0320790559053421, -0.01984378509223461, -0.0044222306460142136, 0.01071455143392086, -0.08115412294864655, -0.0398324616253376, 0.004626357462257147, 0.07497139275074005, 0.014483083970844746, -0.0008252679253928363, -0.01234403345733881, -0.03823063150048256, 0.12467450648546219, -0.09175508469343185, 0.01722448505461216, 0.005949354264885187, -0.08972208201885223, -0.02327323704957962, 0.049748439341783524, -0.040637437254190445, -0.10090348869562149, 0.03660942614078522, -0.045362625271081924, -0.017122194170951843, -0.10875449329614639, -0.17563089728355408, 0.015764368698000908, -0.013245387934148312, -0.05896001309156418, -0.09810979664325714, -0.12013813108205795, -0.08754786103963852, 0.01719336211681366, -0.0712466761469841, 0.01793360337615013, -0.09980548918247223, 0.015052435919642448, 0.03279431164264679, -0.017620619386434555, 0.06877883523702621, -0.04493312910199165, 0.055000726133584976, 0.01904190331697464, 0.06322096288204193, 0.006529165431857109, 0.030539091676473618, -0.07175309211015701, 0.051796503365039825, -0.0855388194322586, 0.15672029554843903, -0.007251454517245293, 0.009800027124583721, -0.14520889520645142, -0.047011848539114, -0.08034336566925049, 
0.04983760789036751, 0.07575055956840515, 0.14168600738048553, -0.19385872781276703, -0.0399702787399292, 0.21578308939933777, -0.05264246463775635, -0.06455167382955551, 0.1013159230351448, -0.023507412523031235, 0.05038418993353844, 0.08193379640579224, 0.09473961591720581, 0.018783602863550186, -0.046240273863077164, -0.05132690444588661, -0.014167308807373047, 0.022934120148420334, 0.0023308368399739265, 0.10394361615180969, -0.08355014026165009, 0.09087637811899185, 0.010276601649820805, 0.034313641488552094, 0.015426872298121452, -0.039588604122400284, -0.033781133592128754, 0.008600649423897266, -0.028176473453640938, -0.032962460070848465, 0.009797233156859875, 0.02507561817765236, -0.057682301849126816, -0.0715259313583374, 0.03998789191246033, 0.08632873743772507, -0.06020523980259895, 0.028143491595983505, 0.012164956890046597, -0.03533236309885979, -0.13840356469154358, 0.01688782498240471, -0.16368307173252106, 0.015579845756292343, 0.02336084470152855, -0.009657982736825943, 0.09796898066997528, 0.020271651446819305, 0.04976705089211464, 0.08062652498483658, -0.0483659952878952, -0.0168591421097517, -0.018231069669127464, -0.021259944885969162, -0.10281465947628021, -0.10195799916982651, -0.03590798377990723, -0.011121448129415512, 0.017495740205049515, -0.15701864659786224, 0.01149146631360054, -0.028547149151563644, 0.06880684196949005, 0.0017275240970775485, -0.02745143510401249, 0.02357267402112484, 0.07040237635374069, -0.029595263302326202, -0.03868987038731575, 0.04199234023690224, -0.0013379935408011079, -0.03580624610185623, 0.09470771253108978, -0.13293631374835968, -0.10175871849060059, 0.10107815265655518, 0.02310863509774208, -0.08925033360719681, -0.024558713659644127, -0.002106452826410532, -0.04246408864855766, -0.05311796814203262, -0.07093130797147751, 0.20550444722175598, 0.05006333068013191, 0.1662863790988922, -0.12470898032188416, -0.05293113738298416, 0.02028503268957138, -0.024757537990808487, -0.02143193781375885, 0.16394856572151184, 0.041080497205257416, -0.1739504635334015, 0.09398221224546432, 0.019598284736275673, -0.006820184178650379, 0.13388100266456604, 0.06254169344902039, -0.10868888348340988, -0.019890841096639633, 0.04210498183965683, 0.016031403094530106, 0.04561559855937958, -0.08862487971782684, -0.030359527096152306, 0.02618981897830963, 0.07007916271686554, 0.0784783586859703, -0.10388410836458206, 0.06486458331346512, 0.0709071010351181, -0.03592409938573837, 0.060626912862062454, -0.03700723126530647, -0.0454997718334198, 0.10697326809167862, 0.027490176260471344, -0.018996192142367363, -0.048802606761455536, -0.036532849073410034, -0.11210503429174423, 0.1877337098121643, -0.09553104639053345, -0.24200446903705597, -0.1514468938112259, 0.013557774014770985, -0.033359479159116745, 0.035128384828567505, 0.039950963109731674, -0.05372112989425659, -0.027202915400266647, -0.08413299918174744, 0.09980972111225128, -0.09512293338775635, -0.05758359655737877, -0.0861775204539299, 0.056873682886362076, -0.028010370209813118, -0.13992880284786224, 0.02211502194404602, 0.007845742627978325, -0.049576420336961746, -0.00806716550141573, -0.06102844700217247, 0.12803930044174194, 0.16478729248046875, -0.01571914181113243, -0.02572101727128029, -0.0007844209321774542, 0.11221399158239365, -0.06672517210245132, 0.032643117010593414, 0.06538847088813782, 0.06618016958236694, 0.010113843716681004, 0.10784978419542313, 0.04267939552664757, -0.05439814552664757, 0.0265862587839365, 0.04699370637536049, -0.03321235254406929, -0.26752588152885437, 
-0.10581240057945251, -0.06366267800331116, -0.031209707260131836, 0.10280197113752365, 0.04737146198749542, -0.027614818885922432, 0.02205079048871994, -0.037134770303964615, 0.02301100641489029, -0.0032156729139387608, 0.06530487537384033, 0.060471419245004654, -0.020819690078496933, 0.0658702403306961, -0.0623079389333725, -0.03723772615194321, 0.0957692563533783, 0.057911183685064316, 0.17984913289546967, -0.03224847838282585, 0.25413161516189575, 0.05594976246356964, 0.03678221255540848, -0.0010632871417328715, 0.07610724866390228, -0.04182320088148117, 0.02868649736046791, -0.03375072404742241, -0.0647207498550415, -0.017188187688589096, 0.049449093639850616, 0.005258642137050629, 0.01871560700237751, -0.07870812714099884, -0.07974196970462799, 0.07108046114444733, 0.20930969715118408, 0.06101559102535248, -0.18729449808597565, -0.05834023281931877, 0.00042754923924803734, -0.06938882172107697, -0.07606630772352219, 0.0023702564649283886, 0.1523870974779129, -0.09332053363323212, -0.009585734456777573, 0.01821746863424778, 0.13867948949337006, -0.12242408841848373, -0.02387048304080963, -0.0015368122840300202, 0.014380025677382946, -0.018726058304309845, 0.12039277702569962, -0.23152275383472443, 0.19422917068004608, 0.03328438848257065, 0.05556106939911842, -0.0418599471449852, 0.010184652172029018, -0.05594857782125473, -0.0283267293125391, 0.10426487028598785, 0.02098955772817135, -0.030926985666155815, -0.10612568259239197, -0.10278645902872086, -0.023834165185689926, 0.05142059177160263, -0.07006905972957611, 0.10335221886634827, 0.0642772987484932, -0.007916257716715336, -0.017455292865633965, 0.05862560123205185, -0.0019505820237100124, -0.1787227988243103, -0.01534467563033104, -0.017193710431456566, -0.03449141979217529, -0.010596910491585732, -0.04126671329140663, -0.035154737532138824, 0.21760714054107666, -0.11700252443552017, -0.07729177176952362, -0.08433544635772705, 0.02199605666100979, 0.12576031684875488, -0.07335899025201797, 0.034073356539011, 0.01063627004623413, 0.03376453369855881, -0.04772178456187248, -0.032141976058483124, 0.09888292849063873, -0.06103953346610069, -0.06673327833414078, -0.06066526845097542, 0.1603119820356369, 0.04086674004793167, 0.053534816950559616, -0.016529452055692673, 0.04596596583724022, 0.0072338697500526905, -0.0961422473192215, -0.02132909744977951, 0.04431116208434105, 0.1406518518924713, 0.0554928295314312, -0.07493487745523453, -0.07389665395021439, -0.06671958416700363, -0.08323919773101807, 0.15904465317726135, 0.1622907668352127, -0.05212559923529625, 0.03699541464447975, 0.171492800116539, -0.12402082234621048, -0.14744022488594055, -0.04837218299508095, 0.0789959654211998, 0.0741085410118103, -0.037598222494125366, -0.18454495072364807, -0.007916046306490898, 0.13345156610012054, 0.000958540418650955, 0.0488286167383194, -0.4027175009250641, -0.13237258791923523, 0.010012648068368435, 0.04001937434077263, 0.003297631861642003, -0.1290978640317917, -0.07193847000598907, -0.07141648977994919, -0.09627332538366318, 0.12038139253854752, -0.03510265424847603, 0.10215350985527039, -0.0042586675845086575, 0.020024340599775314, 0.0471021942794323, -0.035790685564279556, 0.1364126056432724, 0.0011140881106257439, 0.024320051074028015, -0.04237183555960655, 0.027665823698043823, 0.018767224624753, -0.021884223446249962, 0.14015746116638184, -0.0411207377910614, 0.04850335419178009, -0.1672021746635437, -0.061227358877658844, -0.06611791253089905, 0.011254626326262951, -0.04073917120695114, -0.05738110467791557, 
-0.042178407311439514, 0.011924811638891697, 0.044225871562957764, -0.012776969000697136, 0.025504745543003082, -0.06438466906547546, 0.06151498481631279, 0.18440745770931244, 0.06997537612915039, 0.019882788881659508, -0.10278943181037903, 0.0002498393296264112, 0.004567252937704325, 0.06401377171278, -0.13737009465694427, -0.005691625643521547, 0.14178311824798584, 0.04022027179598808, 0.10886571556329727, -0.014982827007770538, -0.12977439165115356, 0.017481781542301178, 0.07496250420808792, -0.0717114508152008, -0.104715496301651, -0.016484390944242477, 0.052166685461997986, -0.05452774465084076, -0.019666701555252075, 0.1162177100777626, -0.05949931591749191, -0.04328155517578125, -0.008583432994782925, 0.015884432941675186, -0.05791778117418289, 0.22485125064849854, 0.0393797941505909, 0.05380629003047943, -0.05915781483054161, 0.10018401592969894, 0.1320405751466751, -0.12531842291355133, 0.029427658766508102, 0.2026100605726242, -0.06376224756240845, -0.04760729894042015, 0.02512875571846962, 0.11398543417453766, -0.0860591009259224, -0.05528518185019493, -0.03227557986974716, -0.02592533640563488, 0.0227374080568552, 0.0399216003715992, 0.0353061705827713, 0.04067600518465042, -0.008515174500644207, -0.03833143413066864, -0.07475139945745468, 0.07118780165910721, 0.049844082444906235, 0.0028607638087123632, -0.02167041040956974, 0.10530156642198563, 0.01569681614637375, -0.0008633179822936654, -0.01346327643841505, -0.0030970023944973946, -0.06835835427045822, 0.02472231164574623, -0.07072585076093674, 0.012209846638143063, -0.06424838304519653, -0.025276092812418938, -0.03967420011758804, 0.015372165478765965, -0.012250117026269436, -0.007399176247417927, -0.04318100959062576, -0.04542335495352745, -0.06286022067070007, 0.037496235221624374, -0.08421928435564041, -0.029459817335009575, -0.006482182070612907, -0.02816978096961975, 0.04962689429521561, 0.022986363619565964, 0.004472424276173115, 0.03812967240810394, -0.034442462027072906, 0.05850837007164955, 0.029022106900811195, 0.048920486122369766, 0.020793186500668526, -0.05583430081605911, 0.029424352571368217, 0.035860612988471985, 0.0007994977058842778, -0.0013967507984489202, 0.00528831547126174, -0.13694950938224792, -0.05008191242814064, -0.047917548567056656, -0.03733840212225914, -0.06048978865146637, 0.10228736698627472, 0.07705356180667877, 0.05437060445547104, 0.07731541991233826, -0.07114176452159882, 0.06873338669538498, -0.17075565457344055, -0.010184815153479576, 0.003448717761784792, -0.016334859654307365, -0.02254396863281727, 0.002958511933684349, 0.05153581500053406, -0.050968486815690994, 0.13488641381263733, 0.020127320662140846, 0.07523615658283234, 0.016597943380475044, -0.05761435627937317, 0.010964510031044483, 0.011896911077201366, 0.127768412232399, -0.0006858676206320524, -0.018799275159835815, -0.06101500242948532, 0.1046610027551651, -0.0038848482072353363, 0.11784042418003082, 0.035516589879989624, 0.12932781875133514, 0.1454671025276184, 0.04783279076218605, 0.021693846210837364, -0.08514150977134705, -0.040941886603832245, 0.04886047914624214, -0.000650803791359067, 0.05348736420273781, -0.024061664938926697, 0.08462360501289368, 0.15657998621463776, -0.16698449850082397, 0.12201397120952606, 0.011666763573884964, -0.06965480744838715, -0.05033797398209572, -0.11430362612009048, -0.04809257760643959, -0.048631370067596436, -0.04162382706999779, -0.1122252568602562, 0.011448864825069904, 0.034033313393592834, 0.044694993644952774, -0.029370253905653954, 0.09562129527330399, 0.020864617079496384, 
-0.11962630599737167, 0.06413638591766357, 0.014697320759296417, 0.1034000813961029, -0.024509701877832413, 0.05166468024253845, 0.05554773285984993, 0.004028866998851299, 0.05078314617276192, 0.06835182011127472, -0.007987339049577713, 0.010408198460936546, 0.003947047516703606, -0.05838654935359955, -0.03805685043334961, 0.030042732134461403, 0.08163293451070786, 0.20820286870002747, 0.05382388085126877, -0.0896265059709549, -0.02091093361377716, 0.1995181441307068, -0.051606185734272, -0.0773186981678009, -0.09264864772558212, 0.2145339548587799, 0.03697822615504265, 0.03443831205368042, -0.005748099181801081, -0.10253239423036575, -0.010594029910862446, 0.1228870078921318, 0.18791401386260986, -0.02908477559685707, -0.0360933318734169, 0.018561754375696182, -0.004080141894519329, 0.022041432559490204, 0.031728845089673996, 0.04414042830467224, 0.3074914216995239, -0.08109768480062485, 0.0486692376434803, -0.05859397351741791, 0.022735916078090668, -0.007126369513571262, 0.1465429961681366, -0.019193874672055244, 0.00270666042342782, -0.030740806832909584, 0.09616676717996597, -0.002045330824330449, -0.17770962417125702, 0.0012297052890062332, -0.1152203232049942, -0.1160355657339096, 0.0018426831811666489, -0.02586817741394043, 0.04414182901382446, 0.07793496549129486, 0.01494012400507927, 0.030661869794130325, 0.01899930089712143, 0.021637285128235817, -0.11988554149866104, -0.1346447765827179, 0.020923443138599396, -0.01913151703774929, 0.09606079757213593, -0.0047011105343699455, 0.10888984799385071, 0.10034950077533722, 0.018497101962566376, -0.08609244227409363, 0.10024727880954742, 0.02724250964820385, 0.02180658094584942, 0.09823962301015854, 0.07559803128242493, -0.001195235992781818, 0.036220651119947433, 0.036561865359544754, -0.05999521166086197, 0.04332638159394264, -0.0383852943778038, -0.002069465583190322, -0.13949479162693024, 0.07747656106948853, -0.029149619862437248, 0.1298401802778244, 0.16584166884422302, -0.021144147962331772, 0.0022778038401156664, -0.04782365262508392, 0.023666994646191597, -0.00047055119648575783, 0.1039942130446434, -0.01969177834689617, -0.21351304650306702, 0.021959539502859116, -0.05249670892953873, 0.0257259551435709, -0.26308056712150574, -0.029147358611226082, 0.033680353313684464, -0.057267412543296814, -0.032886162400245667, 0.0745527371764183, 0.06234011426568031, 0.042100902646780014, -0.02920033410191536, -0.08277638256549835, -0.007021188270300627, 0.10807452350854874, -0.11371917277574539, -0.10548191517591476 ]
null
null
transformers
# legal_t5_small_multitask_cs_fr model

Model for translating legal text from Czech to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model is trained on masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_cs_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to French.

### How to use

Here is how to use this model to translate legal text from Czech to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_fr"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

cs_text = "Agentura USA pro ochranu životního prostředí ve své hodnotící studii v roce 2002 zjistila možnou systémovou toxicitu a karcinogenitu a údaje získané z krevních testů nasvědčují rozsáhlé expozici obyvatelstva."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results (a sketch of computing corpus-level BLEU follows this card):

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_cs_fr | 47.588|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
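The BLEU figure in the table above is a corpus-level score on a held-out test set. The sketch below shows how such a score could be computed with the sacrebleu library; the hypothesis and reference sentences are placeholders, since the test set itself is not distributed with the card, and sacrebleu's default settings may differ from those used to obtain the reported 47.588.

```python
import sacrebleu

# Placeholder data: in practice, `hypotheses` would hold the model's French
# translations of the test set and `references` the corresponding gold French
# texts (one reference stream, aligned with the hypotheses).
hypotheses = ["L'Agence a constaté une possible toxicité systémique."]
references = [["L'Agence a constaté une toxicité systémique possible."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```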
{"language": "Cszech French", "tags": ["translation Cszech French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Agentura USA pro ochranu \u017eivotn\u00edho prost\u0159ed\u00ed ve sv\u00e9 hodnot\u00edc\u00ed studii v roce 2002 zjistila mo\u017enou syst\u00e9movou toxicitu a karcinogenitu a \u00fadaje z\u00edskan\u00e9 z krevn\u00edch test\u016f nasv\u011bd\u010duj\u00ed rozs\u00e1hl\u00e9 expozici obyvatelstva."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_fr
[ "transformers", "pytorch", "t5", "text2text-generation", "translation Cszech French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech French" ]
TAGS #transformers #pytorch #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_fr model
=========================================

Model for translating legal text from Czech to French. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model is trained on masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Czech to French.

### How to use

Here is how to use this model to translate legal text from Czech to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 56, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_cs\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
null
null
transformers
# legal_t5_small_multitask_cs_it model

Model for translating legal text from Czech to Italian. It was first released in
[this repository](https://github.com/agemagician/LegalTrans). The model is trained jointly on 42 language pairs drawn from the three parallel corpora
jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No separate pretraining is involved in the case of the legal_t5_small_multitask_cs_it model; instead, the unsupervised task is added to all of the translation tasks
to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Czech to Italian.

### How to use

Here is how to use this model to translate legal text from Czech to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_it", do_lower_case=False,
                                            skip_special_tokens=True),
    device=0
)

cs_text = "Příprava Evropské rady (29.-30. října 2009)"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. The supervised task used only the corresponding language pair, while the unsupervised task had the data of all language pairs available.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_multitask_cs_it | 45.297|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
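The training procedure above names AdaFactor with an inverse square root learning rate schedule but gives no further settings. Below is a minimal sketch of how that combination could be set up with `transformers` and PyTorch; the base learning rate and warmup length are illustrative assumptions, not values stated in the card:

```python
import torch
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_it")

# Adafactor with an externally managed learning rate (relative_step=False),
# so the inverse square root schedule below controls the decay.
optimizer = Adafactor(
    model.parameters(),
    lr=1e-3,                 # assumed base learning rate; not stated in the card
    scale_parameter=False,
    relative_step=False,
    warmup_init=False,
)

warmup_steps = 10_000        # assumed warmup length; not stated in the card

def inverse_sqrt(step: int) -> float:
    # Linear warmup, then decay proportional to 1/sqrt(step).
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return (warmup_steps / step) ** 0.5

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=inverse_sqrt)
```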
{"language": "Cszech Italian", "tags": ["translation Cszech Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "P\u0159\u00edprava Evropsk\u00e9 rady (29.-30. \u0159\u00edjna 2009)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_it model
=========================================

Model for translating legal text from Czech to Italian. It was first released in this repository. The model is trained jointly on 42 language pairs drawn from the three parallel corpora jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_it model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Czech to Italian.

### How to use

Here is how to use this model to translate legal text from Czech to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_it model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. The supervised task used only the corresponding language pair, while the unsupervised task had the data of all language pairs available.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
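The BLEU score reported for this model (45.297 in the table above) comes from scoring model outputs against reference translations on the held-out test set. A minimal sketch with `sacrebleu` is shown below; the two example strings are hypothetical stand-ins for real system outputs and gold references:

```python
import sacrebleu

# Hypothetical system outputs and references; in practice these would be the
# model's translations of the full test set and the gold Italian sentences.
hypotheses = ["Preparazione del Consiglio europeo (29-30 ottobre 2009)"]
references = [["Preparazione del Consiglio europeo (29-30 ottobre 2009)"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```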
null
null
transformers
# legal_t5_small_multitask_cs_sv model

Model for translating legal text from Czech to Swedish. It was first released in
[this repository](https://github.com/agemagician/LegalTrans). The model is trained jointly on 42 language pairs drawn from the three parallel corpora
jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No separate pretraining is involved in the case of the legal_t5_small_multitask_cs_sv model; instead, the unsupervised task is added to all of the translation tasks
to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Czech to Swedish.

### How to use

Here is how to use this model to translate legal text from Czech to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_cs_sv", do_lower_case=False,
                                            skip_special_tokens=True),
    device=0
)

cs_text = "Hračky určené pro častý kontakt s kůží obsahující alergenní látky jiné než vonné, které jsou známé vyvoláváním vážných nebo dokonce osudných účinků na zdraví dětí (například látky, které mohou vyvolat anafylaktický šok), musí být v souladu s ustanoveními týkajícími se označování uvedenými ve směrnici Komise 2006/125/ES ze dne 5. prosince 2006 o obilných a ostatních příkrmech pro kojence a malé děti."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_cs_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. The supervised task used only the corresponding language pair, while the unsupervised task had the data of all language pairs available.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_multitask_cs_sv | 35.871|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
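The preprocessing section above describes training a unigram vocabulary model on 88M lines of the pooled parallel corpus. A minimal sketch with the `sentencepiece` library follows; the input path and vocabulary size are assumptions for illustration, since the card does not state them:

```python
import sentencepiece as spm

# Train a unigram subword model over the pooled parallel corpus.
# "parallel_corpus_all_pairs.txt" is a hypothetical path; vocab_size is assumed.
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",
    model_prefix="legal_t5_small_vocab",
    model_type="unigram",
    vocab_size=32000,
    byte_fallback=True,  # fall back to byte pieces for characters outside the vocab
)

# The resulting legal_t5_small_vocab.model file can then back a tokenizer.
sp = spm.SentencePieceProcessor(model_file="legal_t5_small_vocab.model")
print(sp.encode("Hračky určené pro častý kontakt s kůží", out_type=str))
```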
{"language": "Cszech Swedish", "tags": ["translation Cszech Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Hra\u010dky ur\u010den\u00e9 pro \u010dast\u00fd kontakt s k\u016f\u017e\u00ed obsahuj\u00edc\u00ed alergenn\u00ed l\u00e1tky jin\u00e9 ne\u017e vonn\u00e9, kter\u00e9 jsou zn\u00e1m\u00e9 vyvol\u00e1v\u00e1n\u00edm v\u00e1\u017en\u00fdch nebo dokonce osudn\u00fdch \u00fa\u010dink\u016f na zdrav\u00ed d\u011bt\u00ed (nap\u0159\u00edklad l\u00e1tky, kter\u00e9 mohou vyvolat anafylaktick\u00fd \u0161ok), mus\u00ed b\u00fdt v souladu s ustanoven\u00edmi t\u00fdkaj\u00edc\u00edmi se ozna\u010dov\u00e1n\u00ed uveden\u00fdmi ve sm\u011brnici Komise 2006/125/ES ze dne 5. prosince 2006 o obiln\u00fdch a ostatn\u00edch p\u0159\u00edkrmech pro kojence a mal\u00e9 d\u011bti."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_cs_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_cs\_sv model
=========================================

Model for translating legal text from Czech to Swedish. It was first released in this repository. The model is trained jointly on 42 language pairs drawn from the three parallel corpora jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved in the case of the legal\_t5\_small\_multitask\_cs\_sv model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Czech to Swedish.

### How to use

Here is how to use this model to translate legal text from Czech to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_cs\_sv model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. The supervised task used only the corresponding language pair, while the unsupervised task had the data of all language pairs available.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
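As a quick sanity check on the parameter count quoted above (approximately 220M), the model size can be inspected directly once the weights are loaded; a minimal sketch, assuming the checkpoint downloads as in the usage example:

```python
from transformers import AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_cs_sv")

# Sum parameter counts across all weight tensors.
n_params = sum(p.numel() for p in model.parameters())
print(f"total parameters: {n_params / 1e6:.1f}M")
```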
-0.0810527354478836, -0.05801188200712204, 0.03632229566574097, 0.05918029695749283, -0.02136530913412571, 0.05742746591567993, -0.0977248027920723, 0.04719589650630951, 0.1876530647277832, 0.08147529512643814, 0.010356930084526539, -0.09234212338924408, 0.013901986181735992, -0.008509669452905655, 0.058971792459487915, -0.13210467994213104, -0.0005487330490723252, 0.1475309282541275, 0.033913563936948776, 0.10872228443622589, -0.039126671850681305, -0.12458261102437973, 0.004302458371967077, 0.05689728260040283, -0.10956607013940811, -0.10149858146905899, -0.017760634422302246, -0.01136672031134367, -0.049902211874723434, -0.01447497121989727, 0.12921157479286194, -0.09691482782363892, -0.013726427219808102, -0.009145161136984825, 0.03699435666203499, -0.06910894066095352, 0.218866765499115, 0.04392705857753754, 0.06176270544528961, -0.06637071818113327, 0.12004221230745316, 0.09643743187189102, -0.10012701153755188, 0.05110510066151619, 0.20060256123542786, -0.07741843909025192, -0.05089542642235756, 0.03134342283010483, 0.12379465997219086, -0.06239226460456848, -0.05285072699189186, -0.014299753122031689, -0.040317703038454056, 0.015684502199292183, 0.019258126616477966, 0.05106698349118233, 0.029193034395575523, -0.0068694246001541615, -0.05070759356021881, -0.07726671546697617, 0.10156809538602829, 0.04727231711149216, -0.007189728785306215, -0.022925248369574547, 0.09740497171878815, -0.005008149892091751, 0.007383269257843494, -0.023891879245638847, 0.017872324213385582, -0.04275912046432495, 0.007176165468990803, -0.06645115464925766, -0.0014330397825688124, -0.06651704758405685, -0.007723587565124035, -0.04482964798808098, 0.00571994436904788, -0.01538253016769886, 0.00481824716553092, -0.04915209859609604, -0.041944850236177444, -0.07717806845903397, 0.015228291042149067, -0.0955272689461708, -0.04928024858236313, -0.012510553002357483, -0.011256539262831211, 0.053555771708488464, 0.027843335643410683, -0.0022474259603768587, 0.05777332931756973, -0.009858943521976471, 0.06766670942306519, 0.02501950040459633, 0.037254393100738525, 0.018443603068590164, -0.02655484899878502, -0.009885230101644993, 0.033371590077877045, -0.0037883431650698185, 0.003391105215996504, 0.0021202610805630684, -0.1254674345254898, -0.062329065054655075, -0.049991168081760406, -0.026983629912137985, -0.0658373311161995, 0.10957960039377213, 0.06950070708990097, 0.07777184993028641, 0.10887850821018219, -0.0761362612247467, 0.07817678898572922, -0.1630251407623291, -0.0010980135994032025, 0.02236616238951683, -0.04607311636209488, -0.023831717669963837, -0.00012068326759617776, 0.04201063886284828, -0.08435361832380295, 0.13548491895198822, 0.025742916390299797, 0.05287015065550804, 0.027608191594481468, -0.05747870355844498, 0.021596474573016167, 0.017783600836992264, 0.12338221818208694, -0.02545262686908245, -0.019558627158403397, -0.05985686555504799, 0.08559132367372513, -0.011061614379286766, 0.10267671942710876, 0.06427038460969925, 0.10260943323373795, 0.13377724587917328, 0.05849127098917961, 0.022800035774707794, -0.061149731278419495, -0.0642317682504654, 0.05890985205769539, 0.0027157545555382967, 0.07080347836017609, -0.014493150636553764, 0.08520842343568802, 0.16878175735473633, -0.18004943430423737, 0.11618126183748245, 0.010662347078323364, -0.08692995458841324, -0.06746621429920197, -0.15814939141273499, -0.07572713494300842, -0.023080702871084213, -0.02425539493560791, -0.12981192767620087, 0.009538576006889343, 0.04400867596268654, 0.06705782562494278, -0.01726173236966133, 
0.1053028479218483, -0.00702197989448905, -0.10636408627033234, 0.0679171085357666, 0.003003127872943878, 0.06905736029148102, -0.008648902177810669, 0.032125525176525116, 0.06689972430467606, 0.0022223941050469875, 0.02781759388744831, 0.04966678097844124, -0.0008550463826395571, -0.011697797104716301, -0.0009851281065493822, -0.06958625465631485, -0.03847777843475342, 0.035819608718156815, 0.0921894833445549, 0.16307643055915833, 0.06217312067747116, -0.09766149520874023, -0.03287170082330704, 0.1889497935771942, -0.050050683319568634, -0.0921337753534317, -0.1055355966091156, 0.19575420022010803, 0.017139073461294174, 0.05079038813710213, -0.008851932361721992, -0.09224041551351547, 0.0045186858624219894, 0.10830166935920715, 0.20187729597091675, -0.011122539639472961, -0.023389142006635666, -0.007197631988674402, -0.014025743119418621, 0.008834851905703545, 0.02908947318792343, 0.022039873525500298, 0.251450777053833, -0.07524176687002182, 0.0686119869351387, -0.0688745304942131, 0.0035885146353393793, -0.009327671490609646, 0.13644835352897644, -0.0001957176864380017, 0.004843146074563265, -0.049473606050014496, 0.10208679735660553, -0.013395830988883972, -0.15977858006954193, -0.008966987021267414, -0.09003943204879761, -0.1324308067560196, 0.01079612784087658, -0.0107215391471982, 0.04718037694692612, 0.06507978588342667, 0.016247831284999847, 0.05534292384982109, -0.004353062249720097, 0.020961526781320572, -0.1116626113653183, -0.11963126808404922, 0.008615934289991856, 0.0006494682165794075, 0.0742306262254715, 0.015660548582673073, 0.10318727791309357, 0.09121479839086533, 0.016788683831691742, -0.06764326244592667, 0.10465989261865616, 0.036808013916015625, -0.000042043007852043957, 0.07827206701040268, 0.12861046195030212, -0.01602320559322834, 0.06039823591709137, 0.03656839206814766, -0.08116968721151352, 0.021173834800720215, -0.035603269934654236, -0.022201374173164368, -0.1040201261639595, 0.09489842504262924, -0.03707512095570564, 0.14022372663021088, 0.18677952885627747, -0.006168429274111986, -0.01299580279737711, -0.06749272346496582, 0.027313124388456345, -0.017587309703230858, 0.06598887592554092, -0.005437537562102079, -0.2040606141090393, 0.015204951167106628, -0.04337691143155098, 0.017560012638568878, -0.22070527076721191, -0.037375666201114655, 0.01922345906496048, -0.04202929511666298, -0.01853080280125141, 0.07980021834373474, 0.06664577126502991, 0.01739940419793129, -0.023546498268842697, -0.06324704736471176, 0.023550698533654213, 0.09856972843408585, -0.11845482140779495, -0.11488717794418335 ]
null
null
transformers
# legal_t5_small_multitask_de_en model

Model for translating legal text from German to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_de_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from German to English.

### How to use

Here is how to use this model to translate legal text from German to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_en"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_de_en",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

de_text = "Der zuständige Ausschuss wacht darüber, dass alle Angaben, die die Ausübung des Mandats eines Mitglieds bzw. die Rangfolge der Stellvertreter beeinflussen können, dem Parlament unverzüglich von den Behörden der Mitgliedstaaten und der Union - unter Angabe deren Wirksamwerdens im Falle einer Benennung - übermittelt werden."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_de_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_de_en | 42.437 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
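The training procedure above names AdaFactor with an inverse square root learning rate schedule, but the card publishes neither the warmup length nor the peak rate. Below is a minimal sketch of that pairing in PyTorch; `WARMUP_STEPS` and `PEAK_LR` are illustrative assumptions, not values taken from the card.

```python
import torch
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

# Illustrative values only: the card publishes neither the warmup length nor the peak rate.
WARMUP_STEPS = 10_000
PEAK_LR = 1e-3

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_en")

# Adafactor with an externally supplied learning rate, so the lambda below controls the decay.
optimizer = Adafactor(
    model.parameters(),
    lr=PEAK_LR,
    relative_step=False,
    scale_parameter=False,
    warmup_init=False,
)

def inverse_sqrt_factor(step: int) -> float:
    # Hold the peak rate through warmup, then decay proportionally to 1/sqrt(step).
    return min(1.0, (WARMUP_STEPS / max(1, step)) ** 0.5)

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=inverse_sqrt_factor)
```

Recent `transformers` releases also ship a ready-made `get_inverse_sqrt_schedule` helper with the same decay shape, which avoids hand-rolling the lambda.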
{"language": "Deustch English", "tags": ["translation Deustch English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Der zust\u00e4ndige Ausschuss wacht dar\u00fcber, dass alle Angaben, die die Aus\u00fcbung des Mandats eines Mitglieds bzw. die Rangfolge der Stellvertreter beeinflussen k\u00f6nnen, dem Parlament unverz\u00fcglich von den Beh\u00f6rden der Mitgliedstaaten und der Union - unter Angabe deren Wirksamwerdens im Falle einer Benennung - \u00fcbermittelt werden."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_de_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_de\_en model
=========================================

Model for translating legal text from German to English. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_de\_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from German to English.

### How to use

Here is how to use this model to translate legal text from German to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_de\_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
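This processed copy drops the code snippet that originally followed "How to use". A minimal, self-contained equivalent using the plain `generate` API is sketched below; the checkpoint ID comes from this record, the input sentence is abridged from the card's widget example, and the decoding settings are common defaults rather than published values.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/legal_t5_small_multitask_de_en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

de_text = "Der zuständige Ausschuss wacht darüber, dass alle Angaben dem Parlament unverzüglich übermittelt werden."
inputs = tokenizer(de_text, return_tensors="pt", truncation=True, max_length=512)

# num_beams=4 is a common translation default, not a setting published for this model.
output_ids = model.generate(**inputs, max_length=512, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```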
[ "### How to use\n\n\nHere is how to use this model to translate legal text from German to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.", "### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from German to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.", "### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from German to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_en model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0721229761838913, 0.13504454493522644, -0.0036881512496620417, 0.08543930947780609, 0.07166869938373566, 0.012466796673834324, 0.013773634098470211, 0.10925532132387161, -0.05115944892168045, 0.06841405481100082, 0.05629798769950867, 0.01786436140537262, 0.05159115791320801, 0.03194212168455124, 0.046988725662231445, -0.201224684715271, -0.00931849330663681, -0.02248368225991726, -0.04197169095277786, 0.0965552106499672, 0.0890137106180191, -0.054769471287727356, 0.050677914172410965, -0.038399673998355865, -0.06995213776826859, 0.03299520164728165, -0.08723313361406326, -0.03985506668686867, 0.1075541079044342, 0.07395906746387482, 0.0771920457482338, -0.016603002324700356, 0.0648643970489502, -0.19451618194580078, -0.001150654279626906, 0.06771877408027649, -0.008335692808032036, 0.04327748343348503, 0.09236562997102737, -0.0014089515898376703, 0.18758337199687958, -0.06258665025234222, 0.036641355603933334, 0.05270991101861, -0.11208108067512512, -0.12185557186603546, -0.0695318877696991, 0.0278027206659317, 0.0938386470079422, 0.1390828937292099, -0.03444722294807434, 0.020029591396450996, -0.0019357376731932163, 0.0758446529507637, 0.09803915023803711, -0.24099649488925934, -0.017596038058400154, 0.04129191115498543, 0.056302886456251144, 0.08170784264802933, -0.04412558674812317, 0.0036194452550262213, 0.0587909072637558, 0.0748821273446083, 0.06796601414680481, -0.04372367635369301, 0.02205680124461651, -0.00829513929784298, -0.12354627251625061, -0.06250878423452377, 0.15242615342140198, 0.0291688721626997, -0.03156781941652298, -0.10129835456609726, -0.05987858027219772, -0.07613059878349304, 0.006184873171150684, -0.03954197093844414, 0.011609402485191822, 0.00004933666787110269, 0.03941543027758598, -0.0687207579612732, -0.1101134866476059, -0.06881364434957504, -0.07937324792146683, 0.0604049451649189, 0.0370657779276371, 0.013751029968261719, 0.022091612219810486, 0.07850430905818939, -0.11716693639755249, -0.0789891704916954, 0.006017887499183416, 0.013886893168091774, -0.08025488257408142, 0.019360259175300598, -0.00764604052528739, -0.1803063154220581, -0.001667668460868299, -0.01174057088792324, -0.08038810640573502, 0.023752186447381973, 0.03685630112886429, 0.027075888589024544, 0.05722251534461975, 0.1175307035446167, -0.10257375240325928, -0.11986790597438812, -0.019954893738031387, -0.005071077961474657, -0.0015150730032473803, 0.019627699628472328, -0.06278138607740402, -0.02956763468682766, 0.016695700585842133, 0.06014886870980263, 0.004036926198750734, 0.004686030559241772, -0.007595902308821678, -0.029249191284179688, 0.13349615037441254, -0.09684841334819794, 0.006474354304373264, 0.0019866337534040213, -0.09097561240196228, -0.02254750020802021, 0.06153016537427902, -0.029820909723639488, -0.09842126816511154, 0.050531648099422455, -0.04002620279788971, -0.019064513966441154, -0.10053040087223053, -0.1834334284067154, 0.002684324514120817, -0.004413793794810772, -0.04957576468586922, -0.1010076180100441, -0.14956116676330566, -0.07314436882734299, 0.016029255464673042, -0.054360561072826385, 0.00904955342411995, -0.09240871667861938, 0.00650562159717083, 0.030811427161097527, -0.013280073180794716, 0.06020013988018036, -0.04322493448853493, 0.04595140367746353, 0.016966497525572777, 0.06067696213722229, 0.0017135139787569642, 0.04154908284544945, -0.07283630222082138, 0.04266147315502167, -0.07047518342733383, 0.15196815133094788, -0.005678355228155851, 0.012158130295574665, -0.13984136283397675, -0.049859847873449326, -0.08324858546257019, 
0.04858379438519478, 0.08242852240800858, 0.12823113799095154, -0.20956376194953918, -0.032681796699762344, 0.2001841515302658, -0.06467357277870178, -0.06422886252403259, 0.11358154565095901, -0.03319862484931946, 0.04127797111868858, 0.08899158984422684, 0.08815810084342957, 0.030573075637221336, -0.04074953868985176, -0.05913239344954491, -0.01642073132097721, 0.008755100890994072, 0.03305807337164879, 0.0898025780916214, -0.0723980963230133, 0.09197577834129333, 0.006865785922855139, 0.06028756871819496, 0.02147163264453411, -0.03805241361260414, -0.031675226986408234, 0.004643267020583153, -0.044373638927936554, -0.030515460297465324, 0.007856468670070171, 0.01643165573477745, -0.06556543707847595, -0.08126787096261978, 0.053834158927202225, 0.10145898163318634, -0.06407570838928223, 0.027530403807759285, 0.01630416512489319, -0.06333755701780319, -0.11656982451677322, 0.022424789145588875, -0.15470385551452637, 0.010388071648776531, 0.02343614399433136, -0.04089011996984482, 0.10300295054912567, 0.031010283157229424, 0.05378050357103348, 0.08370057493448257, -0.05389336869120598, -0.021584978327155113, -0.02335496060550213, -0.012906181626021862, -0.105084128677845, -0.09682302176952362, -0.027209913358092308, -0.01714237965643406, 0.0033797624055296183, -0.14989207684993744, 0.017280805855989456, -0.04131157696247101, 0.08315734565258026, 0.010900934226810932, -0.03052455373108387, 0.02683931030333042, 0.05890350416302681, -0.03292619064450264, -0.03525492548942566, 0.025442494079470634, -0.012511133216321468, -0.041158854961395264, 0.09407304972410202, -0.12584324181079865, -0.09963618963956833, 0.10873539745807648, 0.027763070538640022, -0.09269721806049347, 0.00924170296639204, -0.0057681770995259285, -0.05635010823607445, -0.04706094413995743, -0.08578252047300339, 0.20734019577503204, 0.04966586455702782, 0.15559911727905273, -0.09991403669118881, -0.0568360835313797, 0.010486540384590626, -0.009646514430642128, -0.018656624481081963, 0.15283703804016113, 0.03413264825940132, -0.17537155747413635, 0.0893036499619484, 0.03643236309289932, -0.011523116379976273, 0.12201452255249023, 0.06036502122879028, -0.10721901804208755, -0.017270611599087715, 0.02517414651811123, -0.0035119177773594856, 0.04093936085700989, -0.08911959081888199, -0.02944711036980152, 0.023197371512651443, 0.07803640514612198, 0.07171067595481873, -0.09329121559858322, 0.06895089894533157, 0.07526465505361557, -0.03666777163743973, 0.0590364895761013, -0.03716146573424339, -0.04523907229304314, 0.1019403412938118, 0.019931025803089142, -0.014137393794953823, -0.039429839700460434, -0.03480148687958717, -0.10536465793848038, 0.18477541208267212, -0.10311347246170044, -0.2353053092956543, -0.13434015214443207, 0.012903260998427868, -0.04173607751727104, 0.018026651814579964, 0.04519502446055412, -0.05321504548192024, -0.04455441236495972, -0.10090938210487366, 0.11005599796772003, -0.1174837201833725, -0.04933471232652664, -0.08387962728738785, 0.04705098643898964, -0.013256764970719814, -0.1521453857421875, 0.02059163898229599, 0.00807451456785202, -0.03411652520298958, -0.00031272784690372646, -0.04046974331140518, 0.1063983216881752, 0.14321225881576538, -0.02093043364584446, -0.02976161427795887, 0.00027039911947213113, 0.1294509917497635, -0.06262991577386856, 0.046742167323827744, 0.058721527457237244, 0.054766349494457245, 0.018170630559325218, 0.11772454530000687, 0.04034547135233879, -0.059603385627269745, 0.04761480912566185, 0.06069806590676308, -0.022778065875172615, -0.26875656843185425, 
-0.08848623186349869, -0.05833427980542183, -0.014826643280684948, 0.09111414849758148, 0.053468551486730576, -0.029005276039242744, -0.0017457003705203533, -0.03875599429011345, 0.040255896747112274, 0.008632844313979149, 0.06204015389084816, 0.07399222254753113, -0.0237809456884861, 0.08066113293170929, -0.060895875096321106, -0.03444918990135193, 0.09163466095924377, 0.04586087912321091, 0.17994092404842377, -0.028140248730778694, 0.2256479412317276, 0.05285751819610596, 0.016177913174033165, 0.0016805093036964536, 0.07536106556653976, -0.044608261436223984, 0.03141242638230324, -0.030655721202492714, -0.05763871595263481, 0.0005410881130956113, 0.0705394595861435, 0.008934134617447853, 0.01593795418739319, -0.0533602349460125, -0.05857430398464203, 0.06832855939865112, 0.20811061561107635, 0.07965251803398132, -0.1852341741323471, -0.06214144453406334, 0.005344506353139877, -0.08541153371334076, -0.07773910462856293, 0.017026379704475403, 0.12789227068424225, -0.07693234831094742, -0.004591033328324556, 0.02989315241575241, 0.1329815685749054, -0.12602245807647705, -0.02089923433959484, 0.006984105333685875, 0.02252178080379963, -0.02493087761104107, 0.10704981535673141, -0.25970134139060974, 0.16442306339740753, 0.02814268134534359, 0.061751484870910645, -0.033129461109638214, 0.013691400177776814, -0.038413580507040024, -0.026100125163793564, 0.09869851171970367, 0.01940053701400757, -0.03333697095513344, -0.1012200340628624, -0.10176234692335129, -0.020963918417692184, 0.058751028031110764, -0.05636269599199295, 0.0943794995546341, 0.06455359607934952, 0.00031846793717704713, -0.00854931678622961, 0.06698931753635406, -0.03678765520453453, -0.16294659674167633, -0.013687945902347565, -0.014427870512008667, -0.02723967656493187, -0.011136944405734539, -0.0433177687227726, -0.041365835815668106, 0.21621417999267578, -0.1184939295053482, -0.08800995349884033, -0.07806125283241272, 0.03027026541531086, 0.12097953259944916, -0.0737258791923523, 0.032886575907468796, 0.015452037565410137, 0.03197123482823372, -0.050359975546598434, -0.0446377657353878, 0.0845242291688919, -0.048863790929317474, -0.06680960953235626, -0.06030886620283127, 0.15229760110378265, 0.05357484519481659, 0.05142064020037651, -0.029548341408371925, 0.046532049775123596, -0.0015375104267150164, -0.09308139234781265, -0.0026587131433188915, 0.060661040246486664, 0.13100998103618622, 0.050674982368946075, -0.0608799047768116, -0.06749694049358368, -0.07654877007007599, -0.08344234526157379, 0.1541023999452591, 0.16940756142139435, -0.04756896197795868, 0.0271316971629858, 0.1682409793138504, -0.1167168840765953, -0.15985852479934692, -0.047866277396678925, 0.06390858441591263, 0.07686464488506317, -0.03301837295293808, -0.18548612296581268, 0.016612233594059944, 0.12481296807527542, 0.002349963178858161, 0.06913433969020844, -0.3649134933948517, -0.14581531286239624, 0.03887362778186798, 0.03462112694978714, -0.004347262438386679, -0.13267643749713898, -0.05250014364719391, -0.06882848590612411, -0.10555727779865265, 0.10599910467863083, -0.03226868808269501, 0.09613174200057983, -0.009171362034976482, 0.03524655103683472, 0.036818891763687134, -0.035731058567762375, 0.12265871465206146, 0.014422676526010036, 0.03744801506400108, -0.050320133566856384, 0.03566643223166466, 0.008357623592019081, -0.025332486256957054, 0.16229943931102753, -0.051516495645046234, 0.05866437777876854, -0.15543167293071747, -0.06451409310102463, -0.06033933535218239, 0.011863481253385544, -0.0363595150411129, -0.07731810212135315, 
-0.052000775933265686, 0.02272637002170086, 0.032099030911922455, -0.01757822372019291, 0.01581808179616928, -0.056907184422016144, 0.03914346173405647, 0.16289940476417542, 0.06544920802116394, 0.025439612567424774, -0.09613936394453049, 0.007757401559501886, -0.010087359696626663, 0.06489548832178116, -0.15612341463565826, 0.001648972393013537, 0.14456143975257874, 0.03634972497820854, 0.1292305737733841, -0.013076608069241047, -0.1351366490125656, 0.015118601731956005, 0.05624411255121231, -0.081291563808918, -0.1201857328414917, -0.013778182677924633, 0.031210020184516907, -0.05763932317495346, -0.008075862191617489, 0.11966288089752197, -0.07888209819793701, -0.03241109475493431, -0.005180174019187689, 0.030309002846479416, -0.06521551311016083, 0.21822968125343323, 0.03849031403660774, 0.048899441957473755, -0.05300838500261307, 0.11215490102767944, 0.12934495508670807, -0.12194596976041794, 0.04146870598196983, 0.199534073472023, -0.06504184752702713, -0.05351180210709572, -0.003131893230602145, 0.11242672055959702, -0.043691132217645645, -0.048677317798137665, -0.01245998777449131, -0.030395569279789925, 0.030627990141510963, 0.013833367265760899, 0.03335091099143028, 0.04953645169734955, -0.016193140298128128, -0.031528130173683167, -0.08686470985412598, 0.08985459059476852, 0.05238804221153259, 0.009551611728966236, -0.024781296029686928, 0.09038568288087845, 0.014146976172924042, 0.005914600100368261, -0.011874456889927387, -0.0010231041815131903, -0.05016189441084862, 0.016232511028647423, -0.09935911744832993, -0.004014377482235432, -0.06408362835645676, -0.018233293667435646, -0.03214874491095543, 0.007190376985818148, -0.012512666173279285, 0.0028477918822318316, -0.03620455414056778, -0.06013079732656479, -0.05295567214488983, 0.0315709263086319, -0.09558925777673721, -0.046741075813770294, -0.0013238912215456367, -0.03010411374270916, 0.0638098195195198, 0.03199091553688049, 0.007084894925355911, 0.02734106034040451, 0.009366828948259354, 0.054568901658058167, 0.02351916767656803, 0.04773386940360069, 0.00944000668823719, -0.06844905763864517, 0.012454664334654808, 0.02495516650378704, -0.0075496165081858635, -0.008370717987418175, 0.0028964264784008265, -0.1350056678056717, -0.05188775807619095, -0.056307923048734665, -0.021464407444000244, -0.06539414823055267, 0.07413893938064575, 0.07165698707103729, 0.07019263505935669, 0.0887971967458725, -0.06492162495851517, 0.07320835441350937, -0.17142389714717865, -0.004172701388597488, 0.006183313671499491, -0.02210325561463833, -0.01229766570031643, -0.0015215071616694331, 0.05025588721036911, -0.06843290477991104, 0.1458672732114792, -0.0014154339442029595, 0.0464906245470047, 0.02741648070514202, -0.08001083880662918, -0.0043453918769955635, 0.016982298344373703, 0.12556537985801697, -0.012581311166286469, -0.0312498789280653, -0.07537209987640381, 0.08441662043333054, 0.0009742999100126326, 0.12399516999721527, 0.04403579980134964, 0.1383328139781952, 0.13437865674495697, 0.04174331575632095, 0.01323004998266697, -0.08682823926210403, -0.06324810534715652, 0.04194262623786926, 0.0031328690238296986, 0.0539054311811924, -0.019280023872852325, 0.10061664879322052, 0.16211065649986267, -0.15907874703407288, 0.11360937356948853, 0.0010433091083541512, -0.07513459771871567, -0.050416626036167145, -0.09692958742380142, -0.05358211696147919, -0.038572005927562714, -0.03631088137626648, -0.11439244449138641, 0.002378974575549364, 0.030166994780302048, 0.056422434747219086, -0.030223015695810318, 0.10001176595687866, 
0.025682533159852028, -0.10559479147195816, 0.06103638932108879, 0.005579331424087286, 0.06815547496080399, -0.002061824779957533, 0.044023070484399796, 0.057454463094472885, -0.008061016909778118, 0.04739546403288841, 0.05470621585845947, -0.02705392800271511, 0.015061940997838974, -0.00028244315763004124, -0.05973973497748375, -0.042900145053863525, 0.03571893647313118, 0.0602903813123703, 0.2032625377178192, 0.06590573489665985, -0.0891299620270729, -0.015708424150943756, 0.16810356080532074, -0.04952983930706978, -0.09290807694196701, -0.09272751957178116, 0.22373874485492706, 0.05078229680657387, 0.032963745296001434, -0.0018696157494559884, -0.10509169101715088, -0.012421296909451485, 0.11105415970087051, 0.20658576488494873, -0.03209354355931282, -0.03440631926059723, 0.024831358343362808, -0.008447090163826942, 0.006930381990969181, 0.018758999183773994, 0.0658114030957222, 0.2778850197792053, -0.07354553788900375, 0.0558280348777771, -0.06095924973487854, 0.010431217961013317, 0.002971889218315482, 0.13893212378025055, -0.004396074917167425, 0.008584321476519108, -0.036475807428359985, 0.08464506268501282, -0.0018027926562353969, -0.17426005005836487, -0.01449570432305336, -0.10524727404117584, -0.10714340209960938, 0.014408952556550503, -0.0434475801885128, 0.037458594888448715, 0.06504028290510178, 0.012676484882831573, 0.04219938442111015, 0.038857582956552505, 0.009053162299096584, -0.12010245770215988, -0.11066639423370361, 0.019854053854942322, -0.002854498103260994, 0.0742836520075798, 0.001027095946483314, 0.11463265866041183, 0.09900371730327606, 0.028473449870944023, -0.08526260405778885, 0.10863582789897919, 0.02439187467098236, 0.0365125834941864, 0.07040177285671234, 0.09945850819349289, -0.00249371281825006, 0.037086836993694305, 0.04380273073911667, -0.04922722280025482, 0.03139482066035271, -0.045792356133461, -0.02733675390481949, -0.14067451655864716, 0.08152002841234207, -0.02541935257613659, 0.13739310204982758, 0.16273842751979828, -0.017622334882616997, -0.001124094007536769, -0.04575632885098457, 0.010610109195113182, -0.0015473937382921576, 0.08938513696193695, -0.020758314058184624, -0.19371484220027924, 0.03171990439295769, -0.049702513962984085, 0.03177986294031143, -0.26893162727355957, -0.043228428810834885, 0.03228938579559326, -0.04976065084338188, -0.02871514856815338, 0.08437588065862656, 0.07212753593921661, 0.03604578599333763, -0.03952020779252052, -0.07925988733768463, -0.01087815873324871, 0.1092619076371193, -0.12317264825105667, -0.11552689224481583 ]
null
null
transformers
# legal_t5_small_multitask_de_es model

Model for translating legal text from German to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_de_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_es"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_de_es",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

de_text = "Kugelförmige, eiförmige oder ellipsenförmige Verpackungen dürfen keine Abmessungen aufweisen, die durch eine Einklemmung im Mund oder Rachen eine Blockierung der internen Atemwege verursachen können."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_de_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_de_es | 36.458 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
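The evaluation table above reports a corpus BLEU score, but the card does not say which BLEU implementation produced it. The sketch below shows how such a score is commonly computed today with the `sacrebleu` package; the hypothesis and reference strings are placeholders, since the card's test set is not included in this record.

```python
import sacrebleu

# Placeholder strings; the card's actual test set is not part of this record.
hypotheses = ["Spherical packaging must not have dimensions that could block the internal airways."]
references = [["Spherical, egg-shaped or ellipsoidal packaging must not have dimensions liable to block the internal airways."]]

# corpus_bleu takes a list of hypothesis strings and a list of reference streams.
score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {score.score:.2f}")
```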
{"language": "Deustch Spanish", "tags": ["translation Deustch Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Kugelf\u00f6rmige, eif\u00f6rmige oder ellipsenf\u00f6rmige Verpackungen d\u00fcrfen keine Abmessungen aufweisen, die durch eine Einklemmung im Mund oder Rachen eine Blockierung der internen Atemwege verursachen k\u00f6nnen."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_de_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_de\_es model
=========================================

Model for translating legal text from German to Spanish. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_de\_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_de\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
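The preprocessing step names a unigram vocabulary model trained on 88M lines of the parallel corpus. A sketch of how such a vocabulary is typically built with the `sentencepiece` library follows; the corpus path, output prefix, and vocabulary size are assumptions, since the card publishes none of them.

```python
import sentencepiece as spm

# "corpus.txt", the model prefix, and vocab_size are placeholders; the card publishes none of them.
spm.SentencePieceTrainer.train(
    input="corpus.txt",               # one sentence per line, all language pairs mixed together
    model_prefix="legal_t5_unigram",  # hypothetical output name
    vocab_size=32000,
    model_type="unigram",             # matches the unigram model named in the card
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Kugelförmige Verpackungen", out_type=str))
```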
[ "### How to use\n\n\nHere is how to use this model to translate legal text from German to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.", "### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from German to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.", "### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from German to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.### Preprocessing\n\n\nA unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08563581854104996, 0.13812381029129028, -0.004081119317561388, 0.09306404739618301, 0.07711002230644226, 0.0066772522404789925, 0.001196470344439149, 0.10813146084547043, -0.051649391651153564, 0.07825488597154617, 0.04638289287686348, 0.03706137835979462, 0.047917384654283524, 0.014015479013323784, 0.03286974877119064, -0.1984093338251114, -0.0070124524645507336, -0.027175216004252434, -0.04778620973229408, 0.0889517143368721, 0.08052871376276016, -0.04748784750699997, 0.04640498012304306, -0.04534433037042618, -0.06844804435968399, 0.04403385892510414, -0.07648477703332901, -0.061052873730659485, 0.11138027161359787, 0.0779464989900589, 0.08373766392469406, -0.00916301365941763, 0.061389707028865814, -0.17280375957489014, -0.004689723718911409, 0.061343684792518616, -0.015940338373184204, 0.045871201902627945, 0.10815810412168503, -0.010210631415247917, 0.1901499629020691, -0.07140424847602844, 0.03437921777367592, 0.05193737894296646, -0.1326972395181656, -0.12565746903419495, -0.07583413273096085, 0.0026728122029453516, 0.0912223607301712, 0.13171280920505524, -0.028492355719208717, 0.014255505986511707, -0.0030295932665467262, 0.058385927230119705, 0.09873385727405548, -0.23264747858047485, -0.023862015455961227, 0.03462753817439079, 0.057966481894254684, 0.0891055017709732, -0.02706528641283512, 0.014307769946753979, 0.06819910556077957, 0.0651828795671463, 0.05853563919663429, -0.05374974384903908, 0.0033043655566871166, -0.009982765652239323, -0.1164165660738945, -0.07527241855859756, 0.13645115494728088, 0.019427116960287094, -0.025667937472462654, -0.09743296355009079, -0.06672917306423187, -0.06595922261476517, 0.010637667961418629, -0.04033415764570236, 0.0174394603818655, -0.007950462400913239, 0.04984471574425697, -0.05600457265973091, -0.10426918417215347, -0.06762044131755829, -0.08053632825613022, 0.05963626503944397, 0.030765080824494362, 0.00468109268695116, 0.03544033318758011, 0.08389995247125626, -0.10309229791164398, -0.08576717972755432, 0.016122637316584587, 0.027053195983171463, -0.0754394605755806, 0.025256717577576637, -0.009395522065460682, -0.1869637668132782, 0.00715473759919405, -0.030743056908249855, -0.10611691325902939, 0.01672344282269478, 0.03646145761013031, 0.026018453761935234, 0.055151764303445816, 0.11248183995485306, -0.09634833037853241, -0.11420901864767075, -0.026961369439959526, -0.018546150997281075, 0.0008267265511676669, 0.016860822215676308, -0.07974663376808167, -0.043013568967580795, 0.008343200199306011, 0.07462777197360992, 0.01847708411514759, -0.006593337748199701, -0.014016040600836277, -0.03909752890467644, 0.13318049907684326, -0.09188268333673477, 0.014683086425065994, 0.0004799991729669273, -0.09151162207126617, -0.017069164663553238, 0.04571811482310295, -0.03731314092874527, -0.09868866950273514, 0.03807475045323372, -0.038928110152482986, -0.029576756060123444, -0.11086269468069077, -0.18330630660057068, 0.014030885882675648, -0.012809705920517445, -0.04792070761322975, -0.10494469106197357, -0.13366273045539856, -0.08977700769901276, 0.016011152416467667, -0.07646875828504562, 0.018705258145928383, -0.10253140330314636, 0.019613685086369514, 0.031474389135837555, -0.018509887158870697, 0.06088974326848984, -0.04445440694689751, 0.05310695618391037, 0.029301604256033897, 0.06597048789262772, -0.00042983866296708584, 0.038838356733322144, -0.06793335825204849, 0.05187045782804489, -0.0851609855890274, 0.1660812646150589, -0.004205561242997646, 0.010662444867193699, -0.14689505100250244, -0.04700012132525444, 
-0.08417272567749023, 0.05723245069384575, 0.08533524721860886, 0.14067715406417847, -0.2040431797504425, -0.038105398416519165, 0.2093910276889801, -0.05479995533823967, -0.05738362297415733, 0.1041826456785202, -0.027612945064902306, 0.052398256957530975, 0.08979929238557816, 0.09683196991682053, 0.02282320149242878, -0.044082432985305786, -0.04637747257947922, -0.019545910879969597, 0.017900705337524414, 0.0177211482077837, 0.09809552133083344, -0.08430979400873184, 0.09575636684894562, 0.00930606946349144, 0.039236243814229965, 0.01639449968934059, -0.032943107187747955, -0.028805311769247055, 0.016433976590633392, -0.029029536992311478, -0.034067533910274506, 0.008827686309814453, 0.021649271249771118, -0.05884290859103203, -0.07301435619592667, 0.055883780121803284, 0.08526992797851562, -0.05867871269583702, 0.023132488131523132, 0.01746342144906521, -0.04779472202062607, -0.13438789546489716, 0.019473835825920105, -0.15641598403453827, 0.013116840273141861, 0.019898515194654465, -0.027072643861174583, 0.09596972167491913, 0.028300125151872635, 0.0495789609849453, 0.08017019927501678, -0.05110544711351395, -0.017396962270140648, -0.028829624876379967, -0.016607113182544708, -0.09474687278270721, -0.10056664049625397, -0.025572963058948517, -0.015680436044931412, -0.0013692115899175406, -0.1513943076133728, 0.011969194747507572, -0.035789623856544495, 0.06807982921600342, 0.002384774386882782, -0.021734263747930527, 0.017050104215741158, 0.07340493053197861, -0.03333238884806633, -0.04459158331155777, 0.03516193479299545, -0.006284461822360754, -0.02855057828128338, 0.08909187465906143, -0.13049814105033875, -0.09721788018941879, 0.10258261859416962, 0.02542739175260067, -0.0947856679558754, -0.00933090504258871, 0.001639663241803646, -0.04753510281443596, -0.05055350437760353, -0.0819196030497551, 0.21330992877483368, 0.04654484987258911, 0.16326221823692322, -0.1197429671883583, -0.049595221877098083, 0.016371412202715874, -0.015012377873063087, -0.027411188930273056, 0.15972813963890076, 0.04136969521641731, -0.17149505019187927, 0.09769326448440552, 0.020785056054592133, -0.005658356938511133, 0.12592121958732605, 0.06545977294445038, -0.1080116257071495, -0.01714693196117878, 0.037284500896930695, 0.015215807594358921, 0.0434526652097702, -0.08800506591796875, -0.026792442426085472, 0.019989194348454475, 0.07778345793485641, 0.08235448598861694, -0.09734927117824554, 0.06471091508865356, 0.07389403134584427, -0.03600933030247688, 0.05576084554195404, -0.034541916102170944, -0.04776831343770027, 0.10502339899539948, 0.03403265401721001, -0.022754009813070297, -0.04404839128255844, -0.032507654279470444, -0.10562893748283386, 0.18533308804035187, -0.09982587397098541, -0.24786998331546783, -0.14586777985095978, 0.016058316454291344, -0.0369737334549427, 0.03949643298983574, 0.04834498465061188, -0.057402320206165314, -0.029993902891874313, -0.07577143609523773, 0.12075868248939514, -0.10194171220064163, -0.05966944247484207, -0.08638156205415726, 0.056778114289045334, -0.024429673328995705, -0.14818613231182098, 0.020510995760560036, 0.012319155968725681, -0.039808351546525955, -0.009650318883359432, -0.05632830038666725, 0.12549996376037598, 0.15285131335258484, -0.012577095068991184, -0.03294806554913521, -0.0004141312383580953, 0.11741306632757187, -0.0694466233253479, 0.03623713552951813, 0.07197239249944687, 0.06869512796401978, 0.010778948664665222, 0.11612726002931595, 0.040904343128204346, -0.06453661620616913, 0.03136612847447395, 0.055319905281066895, 
-0.02955172397196293, -0.27109161019325256, -0.09720470756292343, -0.05920163542032242, -0.03110567480325699, 0.1012658029794693, 0.047935184091329575, -0.0224133413285017, 0.015806470066308975, -0.034596990793943405, 0.03959719091653824, 0.0002651755348779261, 0.06337081640958786, 0.08280707895755768, -0.024178313091397285, 0.06716523319482803, -0.06171146035194397, -0.04827045649290085, 0.09924986213445663, 0.0654277354478836, 0.17510777711868286, -0.02905556932091713, 0.23622514307498932, 0.05542410537600517, 0.01614830642938614, -0.010121362283825874, 0.07349316775798798, -0.03985234722495079, 0.030862506479024887, -0.04050324484705925, -0.06100974231958389, -0.008120972663164139, 0.05370684713125229, -0.002027490409091115, 0.011895954608917236, -0.0744137316942215, -0.07557452470064163, 0.06716258078813553, 0.1988862007856369, 0.06639689952135086, -0.1943521946668625, -0.058147553354501724, 0.0031279316172003746, -0.06825815886259079, -0.08304382860660553, 0.006913397926837206, 0.13486754894256592, -0.08400014787912369, -0.0010246767196804285, 0.022987687960267067, 0.13740427792072296, -0.1278955638408661, -0.01726943626999855, 0.00332517153583467, 0.023924648761749268, -0.01725400798022747, 0.11506441980600357, -0.24947825074195862, 0.1787053346633911, 0.030530523508787155, 0.05905533209443092, -0.035205256193876266, 0.015678077936172485, -0.055090032517910004, -0.03048437088727951, 0.10077415406703949, 0.018667180091142654, -0.026997001841664314, -0.09719204157590866, -0.09529240429401398, -0.02330821007490158, 0.04525864124298096, -0.06622931361198425, 0.09641967713832855, 0.06800583750009537, -0.00144330901093781, -0.013771780766546726, 0.05988235026597977, -0.016087958589196205, -0.17854182422161102, -0.016135167330503464, -0.0272659994661808, -0.022941770032048225, -0.008922494016587734, -0.04002488777041435, -0.039849597960710526, 0.21272273361682892, -0.11425051838159561, -0.08486203104257584, -0.08211224526166916, 0.030849847942590714, 0.12227004766464233, -0.07099118083715439, 0.03522264212369919, 0.015961306169629097, 0.02542956732213497, -0.049483489245176315, -0.03722130134701729, 0.10240206867456436, -0.05889423191547394, -0.059179116040468216, -0.06602680683135986, 0.14743375778198242, 0.04860946908593178, 0.04603991284966469, -0.016045331954956055, 0.04230494424700737, 0.008388265036046505, -0.09320897608995438, -0.016981257125735283, 0.048434797674417496, 0.1362362504005432, 0.04511178284883499, -0.07033374905586243, -0.07317500561475754, -0.06948892772197723, -0.08617755025625229, 0.14747239649295807, 0.15819630026817322, -0.04851168394088745, 0.03176869824528694, 0.17002087831497192, -0.12352275848388672, -0.1501672863960266, -0.04592902213335037, 0.07993808388710022, 0.0759962797164917, -0.03273122385144234, -0.19417884945869446, -0.011910423636436462, 0.13823536038398743, 0.003918142057955265, 0.06107282266020775, -0.3991827368736267, -0.13393180072307587, 0.01867971383035183, 0.03492555394768715, 0.007321973796933889, -0.12958917021751404, -0.06853615492582321, -0.08037180453538895, -0.09373992681503296, 0.10904666781425476, -0.02916775830090046, 0.09353214502334595, -0.004941576160490513, 0.026753976941108704, 0.04585568234324455, -0.030450396239757538, 0.1415318101644516, 0.002524661598727107, 0.030820557847619057, -0.04535217955708504, 0.03375782445073128, 0.013423001393675804, -0.02229873649775982, 0.14394593238830566, -0.03373921290040016, 0.04865380376577377, -0.1585530936717987, -0.06850847601890564, -0.05915847793221474, 0.013383219949901104, 
-0.04034481942653656, -0.06648501753807068, -0.04076135531067848, 0.012011394836008549, 0.03986094146966934, -0.01788152940571308, 0.015654979273676872, -0.058791205286979675, 0.047796815633773804, 0.16977311670780182, 0.06469622254371643, 0.03205449879169464, -0.09966834634542465, 0.0052719078958034515, 0.0032101545948535204, 0.06736849248409271, -0.13766424357891083, -0.0012297083158046007, 0.14521265029907227, 0.030515024438500404, 0.11103055626153946, -0.014715217985212803, -0.13442684710025787, 0.021697144955396652, 0.07374943792819977, -0.06230172887444496, -0.11347077041864395, -0.01660851389169693, 0.036763716489076614, -0.04479992389678955, -0.012838969938457012, 0.12031953781843185, -0.060997478663921356, -0.04397750273346901, -0.007825120352208614, 0.01982569508254528, -0.060862310230731964, 0.2262442708015442, 0.03480267524719238, 0.04652651399374008, -0.055092476308345795, 0.10605956614017487, 0.1319241225719452, -0.13676215708255768, 0.03756491094827652, 0.19672176241874695, -0.05884642153978348, -0.049032580107450485, 0.01213106233626604, 0.11914314329624176, -0.08996261656284332, -0.05950320139527321, -0.03343509882688522, -0.029869407415390015, 0.026250381022691727, 0.01998038962483406, 0.03000047616660595, 0.04370392858982086, -0.009869737550616264, -0.03306113928556442, -0.07427922636270523, 0.06998557597398758, 0.05194319412112236, 0.00830028485506773, -0.026177730411291122, 0.08927425742149353, 0.014048448763787746, -0.0051969969645142555, -0.012375674210488796, 0.0029513684567064047, -0.06750307977199554, 0.020847665145993233, -0.09650251269340515, 0.008518257178366184, -0.060641005635261536, -0.01987999863922596, -0.03833948075771332, 0.016587266698479652, -0.00846842397004366, -0.004989852663129568, -0.03596040979027748, -0.05188500136137009, -0.059255488216876984, 0.03649305924773216, -0.08561664074659348, -0.035946231335401535, -0.012685198336839676, -0.02837887778878212, 0.05152500420808792, 0.0182162094861269, 0.008892365731298923, 0.03115825727581978, -0.017408380284905434, 0.063729427754879, 0.03176373988389969, 0.050414830446243286, 0.02060076780617237, -0.06377020478248596, 0.025942830368876457, 0.03527599573135376, -0.0069849505089223385, -0.00799444131553173, 0.0024668348487466574, -0.1390582174062729, -0.044958632439374924, -0.046544674783945084, -0.038129501044750214, -0.05788273736834526, 0.0963514894247055, 0.07802215963602066, 0.05812852084636688, 0.07819753885269165, -0.07454165816307068, 0.06881251931190491, -0.16571468114852905, -0.01177047099918127, 0.002935777185484767, -0.01117265596985817, -0.010353142395615578, 0.0039912122301757336, 0.05308176204562187, -0.0508735254406929, 0.14074891805648804, 0.02258162386715412, 0.07923365384340286, 0.01685044728219509, -0.07554412633180618, 0.0036250560078769922, 0.014509424567222595, 0.1268480122089386, -0.006103139370679855, -0.017908575013279915, -0.0842762142419815, 0.10579801350831985, 0.0022642884869128466, 0.12815245985984802, 0.028421444818377495, 0.14287620782852173, 0.1412876695394516, 0.043982427567243576, 0.02318033203482628, -0.08797605335712433, -0.055435553193092346, 0.04687262699007988, 0.012469331733882427, 0.048232994973659515, -0.017033260315656662, 0.07062221318483353, 0.1544416844844818, -0.16221070289611816, 0.12198685854673386, 0.007290552370250225, -0.06687234342098236, -0.04955551028251648, -0.1047249361872673, -0.04547679051756859, -0.050096262246370316, -0.043850257992744446, -0.11055862158536911, 0.008667637594044209, 0.020620077848434448, 0.04644215479493141, 
-0.036428771913051605, 0.09791652858257294, 0.022919759154319763, -0.12369614839553833, 0.06038181856274605, 0.01778929866850376, 0.0994044691324234, -0.010420827195048332, 0.054307155311107635, 0.05228574946522713, 0.014160579070448875, 0.044947270303964615, 0.06450692564249039, -0.0141444755718112, 0.009899401105940342, 0.003937719389796257, -0.04954181984066963, -0.039847929030656815, 0.03620285168290138, 0.0669320747256279, 0.21734164655208588, 0.05763952434062958, -0.09268976747989655, -0.014890458434820175, 0.18648479878902435, -0.048059817403554916, -0.08479239791631699, -0.09132976084947586, 0.2266121655702591, 0.04777722805738449, 0.04618309810757637, -0.00752032408490777, -0.10953821241855621, -0.013757695443928242, 0.10940901190042496, 0.19809585809707642, -0.02718180976808071, -0.04088467359542847, 0.019818438217043877, -0.0029570437036454678, 0.02126188948750496, 0.028035035356879234, 0.05162860453128815, 0.308091938495636, -0.0764036476612091, 0.048443540930747986, -0.058302395045757294, 0.03029874712228775, -0.004666808992624283, 0.15034939348697662, -0.014699568971991539, 0.005106749478727579, -0.028032006695866585, 0.10039632022380829, 0.003968364093452692, -0.18946141004562378, 0.0026578803081065416, -0.11183319240808487, -0.1156129315495491, 0.0026688205543905497, -0.03351885452866554, 0.03926144540309906, 0.07684323936700821, 0.017972644418478012, 0.03444598242640495, 0.030295254662632942, 0.018153652548789978, -0.117745041847229, -0.12347974628210068, 0.016603080555796623, 0.0020150188356637955, 0.08435487002134323, -0.0057248747907578945, 0.10645832866430283, 0.1028280258178711, 0.02602604776620865, -0.0882694348692894, 0.10446126013994217, 0.023709498345851898, 0.03635694831609726, 0.09637651592493057, 0.06286982446908951, -0.002072796691209078, 0.02547440491616726, 0.03871433064341545, -0.059733133763074875, 0.03723355382680893, -0.02817796729505062, -0.012558349408209324, -0.14183339476585388, 0.08619774878025055, -0.030829712748527527, 0.1250607967376709, 0.16238069534301758, -0.01739949733018875, 0.0073685068637132645, -0.047627002000808716, 0.026866719126701355, 0.005176508333534002, 0.10975891351699829, -0.020211737602949142, -0.20664937794208527, 0.02385893650352955, -0.048213135451078415, 0.026617322117090225, -0.2666902542114258, -0.029610086232423782, 0.03708411753177643, -0.05403962358832359, -0.029624268412590027, 0.07234137505292892, 0.056685835123062134, 0.04301043599843979, -0.03350415453314781, -0.08258204907178879, -0.01654084213078022, 0.10942135006189346, -0.11007161438465118, -0.1075664833188057 ]
null
null
transformers
# legal_t5_small_multitask_de_fr model

Model for translating legal text from Deutsch (German) to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs, from JRC-Acquis, Europarl and DCEP, along with an unsupervised masked-language-modelling prediction task.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_de_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_fr"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_multitask_de_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "Wegen einer in Ausübung ihres Amtes erfolgten Äußerung oder Abstimmung dürfen Mitglieder des Europäischen Parlaments weder in ein Ermittlungsverfahren verwickelt noch festgenommen oder verfolgt werden."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_de_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_de_fr | 41.003|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
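The snippet above uses `AutoModelWithLMHead`, which is deprecated in recent versions of Transformers. A minimal sketch of the same translation call with the current seq2seq auto class (assuming a recent `transformers` release; `device=-1` runs on CPU):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, TranslationPipeline

# Same checkpoint as above, loaded through the non-deprecated seq2seq auto class.
tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_de_fr", do_lower_case=False)
model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_multitask_de_fr")

pipeline = TranslationPipeline(model=model, tokenizer=tokenizer, device=-1)  # -1 = CPU

de_text = (
    "Wegen einer in Ausübung ihres Amtes erfolgten Äußerung oder Abstimmung "
    "dürfen Mitglieder des Europäischen Parlaments weder in ein "
    "Ermittlungsverfahren verwickelt noch festgenommen oder verfolgt werden."
)
print(pipeline([de_text], max_length=512))
```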
{"language": "Deustch French", "tags": ["translation Deustch French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Wegen einer in Aus\u00fcbung ihres Amtes erfolgten \u00c4u\u00dferung oder Abstimmung d\u00fcrfen Mitglieder des Europ\u00e4ischen Parlaments weder in ein Ermittlungsverfahren verwickelt noch festgenommen oder verfolgt werden."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_de_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deutsch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_de\_fr model
=========================================

Model for translating legal text from Deutsch (German) to French. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs, from JRC-Acquis, Europarl and DCEP, along with an unsupervised masked-language-modelling prediction task.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_de\_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_de\_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
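The preprocessing paragraph above describes a unigram vocabulary model trained on the parallel corpus. A minimal sketch of how such a vocabulary could be built with the `sentencepiece` library; the input path `parallel_corpus.txt` and the 32k vocabulary size are illustrative assumptions, as the card does not state them:

```python
import sentencepiece as spm

# Train a unigram subword model on a (hypothetical) text dump of the parallel corpus.
# The file name and vocab_size below are assumptions for illustration only.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_small",
    model_type="unigram",
    vocab_size=32000,
    character_coverage=1.0,  # legal text mixes many scripts and diacritics
)

# Load the resulting model and tokenize a German sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="legal_t5_small.model")
print(sp.encode("Mitglieder des Europäischen Parlaments", out_type=str))
```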
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07341083139181137, 0.14453622698783875, -0.003838655771687627, 0.0960025042295456, 0.059084076434373856, -0.005504985339939594, 0.019609615206718445, 0.09392544627189636, -0.04994562640786171, 0.0676213949918747, 0.06384923309087753, -0.0031319374684244394, 0.037820469588041306, 0.034417636692523956, 0.058412592858076096, -0.1874968707561493, -0.003583210753276944, -0.02582632191479206, -0.056955624371767044, 0.09602098912000656, 0.09484558552503586, -0.03858093172311783, 0.05926814302802086, -0.021884236484766006, -0.06499677896499634, 0.05184545740485191, -0.08500004559755325, -0.03854558989405632, 0.1151508167386055, 0.08014208823442459, 0.07430287450551987, -0.012912407517433167, 0.06786864995956421, -0.1888158768415451, -0.003772496012970805, 0.06730904430150986, -0.024933205917477608, 0.04148721694946289, 0.10292812436819077, -0.013318994082510471, 0.18246370553970337, -0.0712406188249588, 0.03236955776810646, 0.05211015045642853, -0.1022484228014946, -0.11747746914625168, -0.07690940797328949, 0.018133776262402534, 0.06854910403490067, 0.14263349771499634, -0.025136476382613182, 0.010753171518445015, -0.02297092229127884, 0.06728196144104004, 0.10582142323255539, -0.23912948369979858, -0.020941536873579025, 0.027607711032032967, 0.06138371676206589, 0.07387422770261765, -0.04963402822613716, 0.015325933694839478, 0.0526597797870636, 0.0665157213807106, 0.0725894570350647, -0.052149176597595215, 0.025003965944051743, -0.013659385032951832, -0.11993444710969925, -0.0648910328745842, 0.15080927312374115, 0.03224238008260727, -0.039242424070835114, -0.09965889155864716, -0.05817493051290512, -0.07797142118215561, -0.014469020068645477, -0.05158698931336403, 0.030754780396819115, -0.004902943968772888, 0.04531672224402428, -0.061844781041145325, -0.09691871702671051, -0.07305967807769775, -0.07854853570461273, 0.044728342443704605, 0.030397512018680573, 0.01015708688646555, 0.03945285826921463, 0.08078481256961823, -0.1486211121082306, -0.062178462743759155, 0.007050450891256332, 0.01476059015840292, -0.08415697515010834, 0.021174786612391472, -0.002949811751022935, -0.15473783016204834, 0.009234718978404999, 0.010918540880084038, -0.10711546987295151, 0.021430613473057747, 0.028657682240009308, 0.027132997289299965, 0.046418167650699615, 0.11860515922307968, -0.1052294373512268, -0.13518574833869934, -0.029166586697101593, -0.003176394384354353, -0.020195247605443, 0.034647293388843536, -0.05894307419657707, -0.045238710939884186, 0.008056399412453175, 0.05432603508234024, 0.00485222227871418, -0.014802057296037674, -0.0019148505525663495, -0.024610944092273712, 0.14905937016010284, -0.09396061301231384, 0.018039477989077568, -0.0015402109129354358, -0.09821180254220963, -0.012184007093310356, 0.053846437484025955, -0.02571963332593441, -0.1047421544790268, 0.045422304421663284, -0.039054885506629944, -0.020639361813664436, -0.10493598133325577, -0.1824120730161667, -0.0045483168214559555, 0.01979672722518444, -0.06304370611906052, -0.09376294910907745, -0.12751388549804688, -0.07402626425027847, 0.03522033616900444, -0.0653945654630661, 0.014974693767726421, -0.10326286405324936, -0.001284534577280283, 0.04071945697069168, 0.0035360418260097504, 0.05226736515760422, -0.05093228816986084, 0.026320725679397583, -0.0037769044283777475, 0.05954936146736145, -0.006755008362233639, 0.039499033242464066, -0.056137945502996445, 0.05243512615561485, -0.08458084613084793, 0.16253761947155, -0.011663365177810192, -0.02211841195821762, -0.14940467476844788, -0.0645001232624054, 
-0.09357775747776031, 0.04216723516583443, 0.09544221311807632, 0.14145106077194214, -0.20702122151851654, -0.04245324805378914, 0.2191227227449417, -0.06522907316684723, -0.05884004011750221, 0.14712660014629364, -0.0345536544919014, 0.01936790905892849, 0.07723880559206009, 0.0884270891547203, 0.038170281797647476, -0.04957937076687813, -0.0583123154938221, 0.005186178255826235, 0.016939695924520493, 0.03155744448304176, 0.10760235041379929, -0.07366780936717987, 0.08002626150846481, -0.006968540605157614, 0.05464022606611252, 0.01461617648601532, -0.04296888783574104, -0.031515948474407196, 0.005283758044242859, -0.031813524663448334, -0.006154065486043692, 0.00979208666831255, 0.01443600095808506, -0.06353038549423218, -0.0877372995018959, 0.029879633337259293, 0.09937199205160141, -0.066568523645401, 0.022874847054481506, 0.02970598079264164, -0.05853964760899544, -0.09728026390075684, 0.016026414930820465, -0.14570161700248718, 0.001617141766473651, 0.025244513526558876, -0.05557004362344742, 0.09683334082365036, 0.03176265209913254, 0.06032166630029678, 0.09720738232135773, -0.05983041599392891, -0.017406383529305458, -0.04444185271859169, -0.02101331576704979, -0.08673408627510071, -0.1057882308959961, -0.014921520836651325, -0.0159168504178524, 0.02093094028532505, -0.155568465590477, 0.013925219886004925, -0.03667223080992699, 0.07911426573991776, 0.005710627883672714, -0.019990665838122368, -0.005137976724654436, 0.05654352530837059, -0.031235072761774063, -0.028488634154200554, 0.0253379475325346, -0.0209786519408226, -0.009306016378104687, 0.09478193521499634, -0.09891851246356964, -0.08326057344675064, 0.10892175883054733, 0.029009709134697914, -0.10535553842782974, -0.002967187901958823, -0.013081622309982777, -0.04870380088686943, -0.038987379521131516, -0.07575111836194992, 0.20310451090335846, 0.054650090634822845, 0.17097559571266174, -0.10460883378982544, -0.059235043823719025, 0.021304989233613014, -0.002672401489689946, -0.02484719455242157, 0.15983879566192627, 0.048587728291749954, -0.1533852517604828, 0.0937914028763771, 0.041410692036151886, -0.011804766021668911, 0.11810868978500366, 0.060354799032211304, -0.10587088763713837, -0.019224774092435837, 0.04462135210633278, 0.0035858212504535913, 0.0453459769487381, -0.08530250191688538, -0.0343521349132061, 0.01804647222161293, 0.07121313363313675, 0.06999310851097107, -0.09647824615240097, 0.07633956521749496, 0.06467130035161972, -0.045869097113609314, 0.06452703475952148, -0.027084022760391235, -0.051480453461408615, 0.11362818628549576, 0.03347399830818176, -0.05228934437036514, -0.048131611198186874, -0.03719708323478699, -0.10193464159965515, 0.1960722655057907, -0.10016442835330963, -0.2321120947599411, -0.12380939722061157, 0.019225722178816795, -0.0702134370803833, 0.020807145163416862, 0.04286060854792595, -0.05153937637805939, -0.0375121533870697, -0.10532466322183609, 0.08359023183584213, -0.1176970973610878, -0.04326115548610687, -0.08951890468597412, 0.04956445470452309, -0.02199186198413372, -0.15969952940940857, 0.013239149935543537, 0.0044041587971150875, -0.04166174679994583, -0.01849246397614479, -0.043909359723329544, 0.11138193309307098, 0.13929563760757446, -0.038951676338911057, -0.030349059030413628, 0.00803452543914318, 0.13832218945026398, -0.0634806901216507, 0.040539294481277466, 0.04936177283525467, 0.08313806354999542, 0.027675990015268326, 0.11790411919355392, 0.045841068029403687, -0.048401087522506714, 0.04293259605765343, 0.07212897390127182, -0.021334484219551086, 
-0.2495356649160385, -0.10933110862970352, -0.060209840536117554, -0.016407055780291557, 0.09639006853103638, 0.05265079438686371, -0.01487649604678154, -0.0060893637128174305, -0.040915388613939285, 0.04199913144111633, 0.01900137960910797, 0.06012643128633499, 0.08308578282594681, -0.017736943438649178, 0.07789430767297745, -0.06469383835792542, -0.0549633763730526, 0.10038159042596817, 0.04373646900057793, 0.17005449533462524, -0.034395016729831696, 0.2242620438337326, 0.05614017695188522, 0.016562698408961296, -0.01039345096796751, 0.07685068249702454, -0.036164846271276474, 0.029746441170573235, -0.03528731316328049, -0.06020112708210945, 0.016019416972994804, 0.060560815036296844, -0.0007679588161408901, 0.0002200617891503498, -0.07378804683685303, -0.06028744578361511, 0.07933492213487625, 0.20598195493221283, 0.07155958563089371, -0.18987281620502472, -0.05532810837030411, -0.01365754846483469, -0.08027340471744537, -0.0837649554014206, 0.023604528978466988, 0.1394841969013214, -0.0918571725487709, -0.012190054170787334, 0.03006267547607422, 0.1301092952489853, -0.1067373976111412, -0.015148426406085491, 0.03116626664996147, 0.03165239095687866, -0.025051705539226532, 0.10370762646198273, -0.24450364708900452, 0.17189455032348633, 0.023385098204016685, 0.04391242563724518, -0.030251311138272285, 0.019499026238918304, -0.034817907959222794, 0.000845859874971211, 0.11981174349784851, 0.02806318923830986, -0.026355907320976257, -0.07557522505521774, -0.088318832218647, -0.02880295179784298, 0.06282099336385727, -0.062275536358356476, 0.08001036942005157, 0.06082240119576454, -0.019726330414414406, -0.013781235553324223, 0.06276225298643112, -0.04392180219292641, -0.17741617560386658, -0.007611708249896765, -0.015818238258361816, -0.03448806703090668, -0.008379711769521236, -0.04347481578588486, -0.04869270697236061, 0.23448391258716583, -0.10206383466720581, -0.06671492755413055, -0.0737445056438446, 0.01132949534803629, 0.12246396392583847, -0.07289515435695648, 0.05638739839196205, 0.006012319587171078, 0.04012683779001236, -0.06153562292456627, -0.03832139074802399, 0.08616439998149872, -0.06940720230340958, -0.0410241074860096, -0.06095591560006142, 0.15347132086753845, 0.053756192326545715, 0.04495852440595627, -0.017105337232351303, 0.04327625408768654, -0.006481432821601629, -0.11052000522613525, -0.020273499190807343, 0.026332108303904533, 0.129191055893898, 0.05033805966377258, -0.06968668848276138, -0.08109869807958603, -0.06684553623199463, -0.06200389936566353, 0.16043555736541748, 0.16882286965847015, -0.061624255031347275, 0.0373983159661293, 0.17048558592796326, -0.10652390867471695, -0.17519094049930573, -0.0392136424779892, 0.08314783126115799, 0.0580858439207077, -0.04853298142552376, -0.1887328177690506, 0.0070846229791641235, 0.11648033559322357, 0.003994165454059839, 0.06428944319486618, -0.3798823654651642, -0.1417718380689621, 0.021181251853704453, 0.022516684606671333, 0.013503198511898518, -0.11413615942001343, -0.03894910588860512, -0.0759553462266922, -0.0978628545999527, 0.11640942096710205, -0.03605019301176071, 0.08291500061750412, -0.004003629088401794, 0.00828417669981718, 0.0328858382999897, -0.03245115280151367, 0.13047023117542267, 0.03179403021931648, 0.040064118802547455, -0.033737894147634506, 0.043622005730867386, 0.015263893641531467, -0.020136239007115364, 0.15707698464393616, -0.038220394402742386, 0.05646771565079689, -0.1544344425201416, -0.0563594326376915, -0.06180078163743019, 0.022700373083353043, -0.041626013815402985, 
-0.06626754254102707, -0.05221404880285263, 0.025999397039413452, 0.050172120332717896, -0.01134098507463932, -0.02378617599606514, -0.039901524782180786, 0.02920740284025669, 0.17793011665344238, 0.053080976009368896, 0.03323699161410332, -0.10685744136571884, 0.03338497877120972, 0.0008037397637963295, 0.05962551757693291, -0.1416967809200287, 0.014643055386841297, 0.1446278840303421, 0.028632663190364838, 0.1254519820213318, -0.010678027756512165, -0.12515577673912048, -0.0067308819852769375, 0.05750466510653496, -0.08773390203714371, -0.12274660170078278, -0.02437584102153778, 0.026975706219673157, -0.04300333932042122, -0.01548908744007349, 0.1105363667011261, -0.08007755130529404, -0.03273982182145119, -0.010494367219507694, 0.023043159395456314, -0.0756928101181984, 0.20560137927532196, 0.02328760176897049, 0.052869245409965515, -0.05361767113208771, 0.09171584248542786, 0.12270250916481018, -0.13913780450820923, 0.031807731837034225, 0.20471136271953583, -0.062065429985523224, -0.05466898903250694, 0.009158275090157986, 0.11933553218841553, -0.06198848411440849, -0.054403264075517654, -0.02348579652607441, -0.02630794234573841, 0.03481888771057129, 0.01773434691131115, 0.036337874829769135, 0.037095312029123306, 0.0002899863466154784, -0.049104392528533936, -0.09023842215538025, 0.09516264498233795, 0.06309188902378082, 0.0012256361078470945, 0.003168988274410367, 0.06489692628383636, 0.0070898509584367275, 0.015089768916368484, -0.011105663143098354, 0.0026013858150690794, -0.05227350816130638, 0.0037212094757705927, -0.08930654078722, -0.0019842160400003195, -0.051029860973358154, -0.007070804946124554, -0.03830394893884659, 0.010517924092710018, -0.009731031954288483, 0.0018180317711085081, -0.042524706572294235, -0.0592149943113327, -0.044530872255563736, 0.038350850343704224, -0.10432405769824982, -0.033640552312135696, 0.00694013899192214, -0.036993421614170074, 0.0631590336561203, 0.030309902504086494, 0.015581419691443443, 0.01747666671872139, -0.008020109497010708, 0.03735790401697159, -0.004501625895500183, 0.05510203540325165, 0.007134644780308008, -0.07788871973752975, 0.030528569594025612, 0.026738859713077545, 0.00024373538326472044, -0.013656901195645332, -0.0034174337051808834, -0.12556146085262299, -0.04882728308439255, -0.06414683163166046, -0.038689080625772476, -0.06704483181238174, 0.09927845001220703, 0.0639362782239914, 0.07721208781003952, 0.0774991437792778, -0.06682144850492477, 0.06264857947826385, -0.1547563225030899, -0.011204873211681843, 0.0076990192756056786, -0.02060219645500183, -0.006419879850000143, -0.009246102534234524, 0.042160142213106155, -0.055530961602926254, 0.15624111890792847, 0.028631143271923065, 0.0733465850353241, 0.02727161906659603, -0.10072451084852219, -0.01795477792620659, 0.024356167763471603, 0.09717674553394318, -0.015199986286461353, -0.02533571422100067, -0.08487984538078308, 0.08783846348524094, 0.009096413850784302, 0.14277556538581848, 0.04188309982419014, 0.1418379247188568, 0.1266849786043167, 0.04019937291741371, -0.0006984025822021067, -0.09413577616214752, -0.057875361293554306, 0.053495582193136215, 0.011700344271957874, 0.053013913333415985, -0.04220331460237503, 0.0924287885427475, 0.14262549579143524, -0.16403846442699432, 0.1294732242822647, 0.015272842720150948, -0.07774124294519424, -0.05714128538966179, -0.07761284708976746, -0.04971346631646156, -0.048732057213783264, -0.04117496311664581, -0.10647398233413696, 0.024066733196377754, 0.02132059633731842, 0.07316917926073074, -0.019518954679369926, 
0.09859941899776459, -0.018183527514338493, -0.10576210170984268, 0.07052414864301682, 0.007214448414742947, 0.0799575075507164, -0.006113559938967228, 0.05006161704659462, 0.050711773335933685, -0.014187416061758995, 0.04417792707681656, 0.06011323258280754, -0.014478913508355618, 0.014633200131356716, 0.006468628998845816, -0.047750554978847504, -0.041147105395793915, 0.02780318260192871, 0.06559670716524124, 0.21325986087322235, 0.06404762715101242, -0.10759677737951279, -0.008735059760510921, 0.16734367609024048, -0.05141507089138031, -0.09362414479255676, -0.10359789431095123, 0.2237701416015625, 0.051964521408081055, 0.044680263847112656, -0.011207114905118942, -0.09198760986328125, -0.015387218445539474, 0.09949778020381927, 0.21453100442886353, -0.025330383330583572, -0.034169647842645645, 0.031125452369451523, -0.008032743819057941, 0.03328084573149681, 0.0032504480332136154, 0.06788463890552521, 0.2904721200466156, -0.08781019598245621, 0.05275103449821472, -0.06305359303951263, 0.02031630277633667, 0.0037097479216754436, 0.15255025029182434, -0.0015718975337222219, 0.006789937615394592, -0.036472443491220474, 0.08684644848108292, 0.014586657285690308, -0.1816428005695343, 0.004566917661577463, -0.10801827907562256, -0.10217602550983429, 0.014559322968125343, -0.06608422845602036, 0.04305344447493553, 0.06887152791023254, 0.01317153126001358, 0.026708580553531647, 0.04348686337471008, 0.011140553280711174, -0.11195589601993561, -0.12319719046354294, 0.007903475314378738, 0.006843624636530876, 0.10204098373651505, 0.00036944812745787203, 0.12269273400306702, 0.10106593370437622, 0.013925587758421898, -0.07807382196187973, 0.08087141811847687, 0.032009415328502655, 0.04086099565029144, 0.06418959051370621, 0.08570332825183868, -0.009983585216104984, 0.031819842755794525, 0.017679311335086823, -0.038535937666893005, 0.041993096470832825, -0.06013557314872742, -0.03596602752804756, -0.13935096561908722, 0.0953284278512001, -0.03198707848787308, 0.13048183917999268, 0.16972453892230988, -0.008947554975748062, 0.01827937178313732, -0.05704856663942337, 0.013098450377583504, -0.004114941693842411, 0.07865095138549805, -0.01607518084347248, -0.18810850381851196, 0.055000580847263336, -0.05359749495983124, 0.03453505039215088, -0.2654983401298523, -0.036381520330905914, 0.031024809926748276, -0.03345109149813652, -0.019499730318784714, 0.07811247557401657, 0.06581061333417892, 0.027813786640763283, -0.041157692670822144, -0.08767955750226974, -0.01801927015185356, 0.10288908332586288, -0.09195639938116074, -0.10785496234893799 ]
null
null
transformers
# legal_t5_small_multitask_de_it model

Model for translating legal text from Deutsch (German) to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs, from JRC-Acquis, Europarl and DCEP, along with an unsupervised masked-language-modelling prediction task.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_de_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Deutsch to Italian.

### How to use

Here is how to use this model to translate legal text from Deutsch to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_multitask_de_it", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "Im vergangenen März hat die Parlamentarische Versammlung der Union für den Mittelmeerraum einstimmig den Bericht „Einwanderung und Integration: Dialog zwischen den neuen Generationen zur Entwicklung einer Kultur des Friedens“ verabschiedet."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_de_it model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_de_it | 41.405|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
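The card reports a BLEU score on a held-out test set. A minimal sketch of how such a score could be computed with the `sacrebleu` library; the hypothesis and reference strings below are hypothetical placeholders, not the actual test data:

```python
import sacrebleu

# Hypothetical model outputs and reference translations; the real test set
# comes from the JRC-Acquis / Europarl / DCEP splits and is not shown here.
hypotheses = [
    "Lo scorso marzo l'Assemblea parlamentare dell'Unione per il Mediterraneo ha adottato la relazione."
]
references = [[
    "Lo scorso marzo l'Assemblea parlamentare dell'Unione per il Mediterraneo ha approvato all'unanimità la relazione."
]]

# corpus_bleu expects a list of hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```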
{"language": "Deustch Italian", "tags": ["translation Deustch Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Im vergangenen M\u00e4rz hat die Parlamentarische Versammlung der Union f\u00fcr den Mittelmeerraum einstimmig den Bericht \u201eEinwanderung und Integration: Dialog zwischen den neuen Generationen zur Entwicklung einer Kultur des Friedens\u201c verabschiedet."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_de_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deutsch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_de\_it model
=========================================

Model for translating legal text from Deutsch (German) to Italian. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs, from JRC-Acquis, Europarl and DCEP, along with an unsupervised masked-language-modelling prediction task.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_de\_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Deutsch to Italian.

### How to use

Here is how to use this model to translate legal text from Deutsch to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_de\_it model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
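The training-procedure paragraph above mentions AdaFactor with an inverse square root learning rate schedule. A minimal sketch of that optimizer setup using the implementation shipped with `transformers`; the checkpoint is loaded only to provide parameters, and this is an illustration, not the authors' original training script:

```python
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_multitask_de_it")

# relative_step=True with warmup_init=True enables Adafactor's built-in
# warm-up followed by inverse-square-root decay; lr must then be None.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the implicit schedule, e.g. for logging
```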
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08916660398244858, 0.1526920199394226, -0.0032360092736780643, 0.09680905938148499, 0.07853599637746811, 0.016668381169438362, 0.029046256095170975, 0.12346845865249634, -0.030318845063447952, 0.08715055137872696, 0.03976190462708473, 0.011265830136835575, 0.060554903000593185, 0.041504137217998505, 0.030623646453022957, -0.20215211808681488, 0.002370979404076934, -0.027189673855900764, -0.04319896548986435, 0.10074914246797562, 0.09374745190143585, -0.05810366943478584, 0.05186297744512558, -0.05029488727450371, -0.0914423018693924, 0.04521586373448372, -0.08488205820322037, -0.03301135078072548, 0.11095535010099411, 0.08246707916259766, 0.07810666412115097, -0.01637888327240944, 0.07872626185417175, -0.17382873594760895, -0.002467086538672447, 0.0788901150226593, -0.013150511309504509, 0.03204396367073059, 0.11235293745994568, 0.0014436276396736503, 0.1895037442445755, -0.047039374709129333, 0.019651534035801888, 0.046405162662267685, -0.10248853266239166, -0.10037896037101746, -0.07036478072404861, 0.020605670288205147, 0.07734479755163193, 0.1445785015821457, -0.031493671238422394, 0.034045472741127014, -0.021183840930461884, 0.07004795968532562, 0.10063309222459793, -0.22254236042499542, -0.031007761135697365, 0.02004050463438034, 0.0433727465569973, 0.09359996020793915, -0.032465726137161255, 0.017737360671162605, 0.054319433867931366, 0.06177988275885582, 0.03913627937436104, -0.028640417382121086, 0.03541591018438339, -0.022656723856925964, -0.12298370897769928, -0.06172231212258339, 0.15981332957744598, 0.02191220596432686, -0.03632408380508423, -0.09595401585102081, -0.05637933313846588, -0.06553152203559875, 0.003390090074390173, -0.046571824699640274, 0.005654475651681423, 0.005605660378932953, 0.06240186095237732, -0.042863134294748306, -0.10730839520692825, -0.07794089615345001, -0.06324350833892822, 0.09433365613222122, 0.04729748144745827, 0.007193637080490589, 0.03644490987062454, 0.08051379024982452, -0.12838859856128693, -0.0731915608048439, 0.012826895341277122, 0.028708383440971375, -0.0745377317070961, 0.01352765318006277, -0.013219126500189304, -0.18461911380290985, 0.006974436342716217, -0.006315143313258886, -0.05674780532717705, 0.008121161721646786, 0.03321534022688866, 0.035842008888721466, 0.045679859817028046, 0.10630139708518982, -0.1210104376077652, -0.1300916224718094, -0.023872394114732742, -0.006511470768600702, -0.00044056292972527444, 0.01697024516761303, -0.07806485146284103, -0.050081826746463776, 0.0017675055423751473, 0.06816370040178299, 0.020413771271705627, -0.0016391945537179708, -0.015455537475645542, -0.04017384350299835, 0.12029202282428741, -0.1094001978635788, 0.018515141680836678, 0.0023315001744776964, -0.10634773969650269, 0.0015784511342644691, 0.03447934240102768, -0.03805916756391525, -0.1123184934258461, 0.0609937384724617, -0.04274832084774971, -0.023270392790436745, -0.11709482222795486, -0.1812741905450821, 0.012899809516966343, -0.017498884350061417, -0.05592849850654602, -0.09943954646587372, -0.13124974071979523, -0.06998609006404877, 0.018024250864982605, -0.07498111575841904, 0.028672607615590096, -0.08954806625843048, 0.01831393502652645, 0.021838093176484108, -0.006393294781446457, 0.051856085658073425, -0.04496662691235542, 0.049149397760629654, 0.006848079152405262, 0.06178366392850876, -0.012372932396829128, 0.041676439344882965, -0.07350751012563705, 0.04086405038833618, -0.06788947433233261, 0.14229615032672882, -0.005259942263364792, -0.00034282563137821853, -0.13697321712970734, -0.049840595573186874, 
-0.09069991856813431, 0.04140795022249222, 0.10349009931087494, 0.1550847440958023, -0.21979467570781708, -0.03134975954890251, 0.20125135779380798, -0.0818067118525505, -0.06440960615873337, 0.1383441835641861, -0.022805236279964447, 0.05417672544717789, 0.08749400824308395, 0.09498435258865356, 0.028824886307120323, -0.05278648063540459, -0.06731870025396347, -0.02395792305469513, 0.006278087384998798, 0.02296014316380024, 0.09566880017518997, -0.057931579649448395, 0.12822005152702332, 0.007665162440389395, 0.03243514522910118, 0.01236149575561285, -0.04768241196870804, -0.036052897572517395, -0.0014817820629104972, -0.05520806461572647, -0.03168226405978203, 0.014647929929196835, 0.020684881135821342, -0.06534307450056076, -0.09209069609642029, 0.003852564375847578, 0.11007223278284073, -0.0783640518784523, 0.029953453689813614, 0.00823129341006279, -0.01739612966775894, -0.09927921742200851, 0.02134307473897934, -0.1436092108488083, 0.0026193768717348576, 0.03995763510465622, -0.038182955235242844, 0.08844126015901566, 0.02050955966114998, 0.059494197368621826, 0.08977426588535309, -0.057972218841314316, -0.017492404207587242, -0.03052888996899128, -0.019138259813189507, -0.09818942844867706, -0.10999619960784912, -0.04076479747891426, -0.008722472935914993, 0.0185787882655859, -0.15369556844234467, 0.02289091981947422, -0.017908236011862755, 0.09873990714550018, -0.003480857005342841, -0.028617070987820625, -0.011648851446807384, 0.04907318577170372, -0.043177541345357895, -0.042176444083452225, 0.023396717384457588, -0.012839864008128643, -0.024489855393767357, 0.07706974446773529, -0.11420876532793045, -0.07519310712814331, 0.10176149755716324, 0.020718306303024292, -0.09372036159038544, -0.010074221529066563, -0.00558539479970932, -0.05839690938591957, -0.04357120767235756, -0.08589491248130798, 0.17651404440402985, 0.0481022447347641, 0.14062468707561493, -0.1057419553399086, -0.033631402999162674, 0.02780298888683319, -0.004483702126890421, -0.03570691868662834, 0.1407150775194168, 0.06081864982843399, -0.1730429083108902, 0.09965130686759949, 0.039946552366018295, -0.026039261370897293, 0.12736976146697998, 0.07225557416677475, -0.12216614186763763, -0.007722115609794855, 0.04922277852892876, 0.009076806716620922, 0.03798737749457359, -0.10252107679843903, -0.017489971593022346, 0.01577930524945259, 0.07236675918102264, 0.08055361360311508, -0.09558939933776855, 0.06174914911389351, 0.06110059842467308, -0.032761845737695694, 0.057076115161180496, -0.04841481149196625, -0.06697197258472443, 0.12087125331163406, 0.03198154270648956, -0.04901919141411781, -0.0381048247218132, -0.03516685962677002, -0.09324623644351959, 0.17938528954982758, -0.08995489776134491, -0.23363621532917023, -0.12901507318019867, 0.01786673255264759, -0.052138060331344604, 0.030083520337939262, 0.02945912815630436, -0.04584785923361778, -0.03740818798542023, -0.08652058988809586, 0.09086492657661438, -0.11284271627664566, -0.04609445109963417, -0.09257391095161438, 0.05996079370379448, -0.03390960395336151, -0.1374380886554718, 0.01877797581255436, 0.0036659869365394115, -0.05239376425743103, -0.016659865155816078, -0.06921328604221344, 0.1339465081691742, 0.15396912395954132, -0.03056287206709385, -0.013334998860955238, 0.010041582398116589, 0.09988859295845032, -0.08500378578901291, 0.05359697341918945, 0.07455378025770187, 0.05577303096652031, 0.015022951178252697, 0.12112651020288467, 0.04020458087325096, -0.06283389031887054, 0.03396862745285034, 0.06276415288448334, -0.021095825359225273, 
-0.25489550828933716, -0.10100256651639938, -0.06501268595457077, -0.014902135357260704, 0.08675733953714371, 0.049553316086530685, -0.029081054031848907, 0.007567422464489937, -0.04593108221888542, 0.006030406802892685, 0.022458316758275032, 0.058009129017591476, 0.05322150141000748, -0.02243145741522312, 0.08171217888593674, -0.058726292103528976, -0.04404960945248604, 0.10636083036661148, 0.07095441967248917, 0.1962517499923706, -0.03369387984275818, 0.24009418487548828, 0.04322654381394386, 0.03153665363788605, -0.004775131121277809, 0.0779976099729538, -0.023790517821907997, 0.016709759831428528, -0.02558726817369461, -0.06400752812623978, 0.006090492941439152, 0.06960446387529373, 0.002286158036440611, -0.010835476219654083, -0.05877130106091499, -0.05279717966914177, 0.07862675189971924, 0.23117667436599731, 0.06579072028398514, -0.2012018859386444, -0.0528714545071125, -0.009643904864788055, -0.06030181422829628, -0.0806945413351059, 0.004340238869190216, 0.14047031104564667, -0.07338087260723114, -0.017260735854506493, 0.029553256928920746, 0.13274720311164856, -0.1364344358444214, -0.02506561391055584, 0.022063693031668663, 0.03435773402452469, -0.028512151911854744, 0.12179645150899887, -0.24506187438964844, 0.1741587519645691, 0.029255347326397896, 0.07865803688764572, -0.0574357695877552, 0.015955086797475815, -0.03957216441631317, 0.008265224285423756, 0.11599226295948029, 0.01864095963537693, -0.02531844936311245, -0.11039461940526962, -0.10936304181814194, -0.029876718297600746, 0.08698859810829163, -0.052102770656347275, 0.0841345489025116, 0.052946895360946655, -0.0025637131184339523, -0.005409163422882557, 0.08390557020902634, -0.01632501184940338, -0.171070396900177, 0.0077406177297234535, 0.0024943489115685225, -0.04628095403313637, -0.00588415889069438, -0.05764033645391464, -0.0262721199542284, 0.23896071314811707, -0.10089552402496338, -0.06433135271072388, -0.08547317981719971, 0.041532572358846664, 0.10987246781587601, -0.07184383273124695, 0.03292461484670639, 0.009778009727597237, 0.04412233456969261, -0.05740612745285034, -0.03115992061793804, 0.09938579052686691, -0.05777331441640854, -0.05662108212709427, -0.0668608620762825, 0.13541242480278015, 0.05831505358219147, 0.04025052860379219, -0.0121964355930686, 0.037940192967653275, 0.008080795407295227, -0.08709713816642761, -0.0052453395910561085, 0.040399305522441864, 0.12779419124126434, 0.06297522038221359, -0.08396195620298386, -0.07198332995176315, -0.07376927882432938, -0.07992464303970337, 0.14549808204174042, 0.16367308795452118, -0.057000935077667236, -0.00036556983832269907, 0.1606208235025406, -0.11674335598945618, -0.17688125371932983, -0.01608111895620823, 0.06925324350595474, 0.0731799528002739, -0.04640737175941467, -0.18549907207489014, 0.001607748563401401, 0.1480531543493271, -0.004738608840852976, 0.08255717158317566, -0.3835116922855377, -0.1292814165353775, 0.010396422818303108, 0.03997863829135895, 0.009578797966241837, -0.11785835772752762, -0.035635095089673996, -0.07074408233165741, -0.09229531139135361, 0.07903917878866196, -0.016686055809259415, 0.09239529818296432, -0.0008436786010861397, 0.00010622633999446407, 0.04268600046634674, -0.04016187787055969, 0.12361817806959152, 0.0241434033960104, 0.03294924274086952, -0.048627469688653946, 0.036784812808036804, 0.014362805522978306, -0.009402699768543243, 0.1536245197057724, -0.040615636855363846, 0.03604510426521301, -0.1339666247367859, -0.07044517248868942, -0.056530069559812546, 0.026218751445412636, -0.03854616358876228, 
-0.07470695674419403, -0.03940188139677048, 0.03520038351416588, 0.035225432366132736, -0.0004497629124671221, -0.007087585516273975, -0.06692256033420563, 0.033087704330682755, 0.17521356046199799, 0.05458363890647888, 0.024648616090416908, -0.08834986388683319, 0.007070193067193031, 0.0014658367726951838, 0.056605178862810135, -0.10949566960334778, 0.010048669762909412, 0.14920714497566223, 0.024828657507896423, 0.11725613474845886, -0.007426739204674959, -0.13516025245189667, 0.004435767885297537, 0.07140887528657913, -0.07838256657123566, -0.12054067850112915, -0.019816530868411064, 0.033518899232149124, -0.05248624458909035, 0.016563167795538902, 0.11453751474618912, -0.08065696805715561, -0.027673233300447464, -0.016979064792394638, 0.03813554719090462, -0.07278049737215042, 0.2311628758907318, 0.029282987117767334, 0.0354473777115345, -0.05324959009885788, 0.12886688113212585, 0.11923302710056305, -0.13256113231182098, 0.032343875616788864, 0.20162218809127808, -0.0628897100687027, -0.054155100136995316, 0.009943293407559395, 0.09888209402561188, -0.06033540889620781, -0.06663518399000168, -0.043709058314561844, -0.0236724391579628, 0.02485218457877636, -0.009362644515931606, 0.039405837655067444, 0.04427758604288101, -0.024837898090481758, -0.041041918098926544, -0.10841496288776398, 0.08690764755010605, 0.07420305907726288, 0.0057390606962144375, -0.018040893599390984, 0.0903148502111435, 0.01721767894923687, -0.02241485007107258, -0.01183454878628254, -0.0059281340800225735, -0.04445100948214531, 0.01983538269996643, -0.07650120556354523, -0.01952846348285675, -0.05130767449736595, -0.01777072064578533, -0.04614555090665817, 0.004985416308045387, -0.02244650200009346, 0.003155564423650503, -0.05175251141190529, -0.05740753188729286, -0.056306496262550354, 0.021112874150276184, -0.09589031338691711, -0.032400213181972504, -0.006303912494331598, -0.034085486084222794, 0.06829637289047241, 0.03526167571544647, 0.009900548495352268, 0.01809772476553917, -0.010255943052470684, 0.0598711222410202, 0.002868850715458393, 0.05148650333285332, 0.00771615095436573, -0.05689223110675812, 0.03278295695781708, 0.04458785429596901, -0.016135383397340775, -0.008388145826756954, 0.002955811796709895, -0.12249508500099182, -0.027708308771252632, -0.053047508001327515, -0.03506748378276825, -0.06029266491532326, 0.1069701761007309, 0.07743138074874878, 0.07346831262111664, 0.059115130454301834, -0.05406231805682182, 0.07329631596803665, -0.15562815964221954, -0.0027016245294362307, 0.017901591956615448, -0.034862201660871506, -0.007009425666183233, 0.0009033017558977008, 0.053736187517642975, -0.05668148398399353, 0.12838831543922424, 0.020538877695798874, 0.057920634746551514, 0.023308472707867622, -0.07419345527887344, 0.0016363570466637611, 0.019520431756973267, 0.12098480761051178, -0.01931563951075077, -0.02099440060555935, -0.07339758425951004, 0.08714772760868073, -0.009866222739219666, 0.14766907691955566, 0.034311845898628235, 0.1544959545135498, 0.12839382886886597, 0.045643940567970276, 0.031954504549503326, -0.09817662090063095, -0.07599780708551407, 0.05079600587487221, -0.01100617740303278, 0.05918705835938454, -0.031701911240816116, 0.09919489175081253, 0.15431039035320282, -0.1649739146232605, 0.10085204243659973, -0.002247745404019952, -0.08501721918582916, -0.04782458022236824, -0.12440124899148941, -0.04650113359093666, -0.04763740301132202, -0.035252783447504044, -0.11711526662111282, 0.022147124633193016, 0.023054955527186394, 0.054662659764289856, -0.04988616704940796, 
0.110627681016922, 0.009684133343398571, -0.10152930021286011, 0.0780230462551117, 0.006417871452867985, 0.086070716381073, -0.00915377028286457, 0.014502076432108879, 0.040595244616270065, -0.01364477351307869, 0.05277025327086449, 0.05051518604159355, -0.008109676651656628, 0.003629242070019245, 0.0048345401883125305, -0.048783883452415466, -0.04295159503817558, 0.01892009750008583, 0.0610848069190979, 0.20780356228351593, 0.04429154098033905, -0.08998400717973709, -0.016676664352416992, 0.1658998727798462, -0.0449802502989769, -0.0717553123831749, -0.09600314497947693, 0.21448704600334167, 0.05032685026526451, 0.04299544170498848, -0.010839318856596947, -0.10274116694927216, -0.009993219748139381, 0.10897574573755264, 0.19650031626224518, -0.04292807728052139, -0.041799042373895645, 0.03118705004453659, -0.00475552910938859, 0.015837490558624268, 0.025057289749383926, 0.05199943110346794, 0.2703351378440857, -0.08182971179485321, 0.0786568820476532, -0.057321205735206604, 0.02619125135242939, -0.00018644420197233558, 0.15181906521320343, -0.0015070086810737848, 0.013160099275410175, -0.05978759750723839, 0.0955594927072525, 0.03265346959233284, -0.16601669788360596, -0.0005405815900303423, -0.12185032665729523, -0.12005461752414703, 0.013435203582048416, -0.05390213429927826, 0.026261067017912865, 0.09386825561523438, 0.0003758415987249464, 0.036178961396217346, 0.03217073902487755, 0.01613878458738327, -0.11289560794830322, -0.15035153925418854, 0.011913234367966652, -0.010268121026456356, 0.1017482653260231, -0.007771224714815617, 0.11475758999586105, 0.10345186293125153, 0.03190949186682701, -0.08692797273397446, 0.09540226310491562, 0.025720305740833282, 0.0262465663254261, 0.06407443434000015, 0.08178956061601639, -0.030358150601387024, 0.04075846076011658, 0.03996540606021881, -0.058714695274829865, 0.045445848256349564, -0.0695716068148613, -0.027696793898940086, -0.13786490261554718, 0.08272813260555267, -0.03260089084506035, 0.14248085021972656, 0.17052395641803741, -0.013834205456078053, -0.003918586298823357, -0.05189332738518715, 0.025877533480525017, 0.013468320481479168, 0.10447261482477188, -0.015222210437059402, -0.2233002632856369, 0.025313418358564377, -0.05263775959610939, 0.03627508133649826, -0.23474448919296265, -0.04030350223183632, 0.02757817506790161, -0.048386894166469574, -0.03509370982646942, 0.07246057689189911, 0.07877442985773087, 0.029924042522907257, -0.037152718752622604, -0.11680801212787628, -0.008466505445539951, 0.10256216675043106, -0.10258820652961731, -0.11054561287164688 ]
null
null
transformers
# legal_t5_small_multitask_de_sv model

Model for translating legal text from Deutsch to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_de_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translating legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_de_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_de_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "SCHRIFTLICHE ANFRAGE P-1584/03"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_de_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_de_sv | 35.945 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
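The BLEU figure reported in the card can be reproduced with a small scoring harness. The sketch below is one possible setup, not the authors' evaluation code: the file names (`hypotheses.sv`, `references.sv`) and the choice of `sacrebleu` are assumptions for illustration.

```python
# Hypothetical BLEU-scoring sketch for translations produced by the pipeline
# above. The file names and the choice of sacrebleu are assumptions; the
# original evaluation tooling is not documented in the card.
import sacrebleu

with open("hypotheses.sv") as hyp_file, open("references.sv") as ref_file:
    hypotheses = [line.strip() for line in hyp_file]  # model outputs, one per line
    references = [line.strip() for line in ref_file]  # gold Swedish translations

# corpus_bleu expects a list of hypothesis strings and a list of
# reference streams (one stream per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.3f}")
```

Corpus-level BLEU, rather than averaged sentence-level BLEU, matches how scores like the one in the table above are conventionally reported.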
{"language": "Deustch Swedish", "tags": ["translation Deustch Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "SCHRIFTLICHE ANFRAGE P-1584/03"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_de_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_de\_sv model
=========================================

Model for translating legal text from Deutsch to Swedish. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_de\_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_de\_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_de\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06525290012359619, 0.10881888121366501, -0.003055718494579196, 0.09990471601486206, 0.06468930840492249, -0.014585972763597965, 0.005592341534793377, 0.09726221859455109, -0.050316132605075836, 0.07226058095693588, 0.0688549280166626, 0.0113736717030406, 0.06344249844551086, 0.02631824091076851, 0.042150646448135376, -0.22821307182312012, 0.004134615883231163, -0.03867044672369957, -0.044660791754722595, 0.09049087762832642, 0.0972106084227562, -0.04305122792720795, 0.0412258617579937, -0.04387393593788147, -0.04208042100071907, 0.03314440697431564, -0.0853833258152008, -0.027595767751336098, 0.1067221611738205, 0.07515126466751099, 0.07170894742012024, -0.007204409688711166, 0.0773710161447525, -0.17489469051361084, -0.007454447913914919, 0.03703350946307182, -0.013243003748357296, 0.025183988735079765, 0.09276721626520157, 0.0292038694024086, 0.20879320800304413, -0.06717446446418762, 0.02753467857837677, 0.037072230130434036, -0.07527846097946167, -0.13411679863929749, -0.06855466216802597, 0.017973873764276505, 0.08332406729459763, 0.1388787478208542, -0.043424341827631, 0.012458443641662598, -0.014751364476978779, 0.08994060754776001, 0.09617708623409271, -0.2359437346458435, -0.026204323396086693, 0.0782107412815094, 0.07101136445999146, 0.09874200820922852, -0.04297993332147598, 0.030794480815529823, 0.06305483728647232, 0.0876184031367302, 0.08552701771259308, -0.04878498241305351, -0.003896601963788271, -0.015554867684841156, -0.12945273518562317, -0.0397292859852314, 0.16493511199951172, 0.0231222752481699, -0.027239587157964706, -0.10793111473321915, -0.04117356240749359, -0.060741525143384933, 0.016239462420344353, -0.05105997994542122, 0.016408804804086685, -0.014367058873176575, 0.04861041530966759, -0.08484409749507904, -0.12001657485961914, -0.05416543781757355, -0.05611605942249298, 0.037670303136110306, 0.01739773526787758, 0.010768578387796879, 0.056216202676296234, 0.05546395108103752, -0.14717459678649902, -0.0816388726234436, -0.007963159121572971, 0.008314473554491997, -0.08517251163721085, 0.013146369718015194, -0.01530366763472557, -0.23780019581317902, 0.008066536858677864, -0.01040609274059534, -0.06825564056634903, 0.012072138488292694, 0.041196420788764954, 0.03269992023706436, 0.05150962248444557, 0.127358078956604, -0.10728055983781815, -0.1408979445695877, -0.028913239017128944, -0.033971115946769714, 0.01448899321258068, 0.0061227018013596535, -0.07380720973014832, -0.04783056676387787, 0.03135138750076294, 0.04480311647057533, -0.00633486732840538, 0.008028600364923477, 0.006887322291731834, -0.01953302137553692, 0.10216863453388214, -0.10120540112257004, -0.003591049462556839, -0.012739338912069798, -0.08679012954235077, -0.028423268347978592, 0.06130111962556839, -0.01925876922905445, -0.10532604157924652, 0.08308465778827667, -0.01795548014342785, -0.0097873630002141, -0.0829494297504425, -0.20010727643966675, 0.019021857529878616, -0.024436907842755318, -0.04352237284183502, -0.09486468881368637, -0.12542927265167236, -0.08616523444652557, 0.031234562397003174, -0.06017836928367615, 0.003962543793022633, -0.10666725784540176, -0.009220736101269722, 0.03759806603193283, -0.030966436490416527, 0.0694751888513565, -0.0495297834277153, 0.026311805471777916, -0.034000296145677567, 0.06935454159975052, -0.010341319255530834, 0.03137501701712608, -0.08225061744451523, 0.03478474169969559, -0.09705781191587448, 0.1530306041240692, -0.05140339210629463, 0.003337533911690116, -0.12679225206375122, -0.057888273149728775, -0.06288524717092514, 
0.05569053068757057, 0.08464547246694565, 0.13102006912231445, -0.22245852649211884, -0.03302287682890892, 0.19130399823188782, -0.08800455182790756, -0.05342885106801987, 0.11742064356803894, -0.01456113625317812, 0.03539551794528961, 0.10043929517269135, 0.12400384247303009, 0.029090426862239838, -0.04787050932645798, -0.06488434225320816, 0.014763424172997475, -0.004768182057887316, 0.013891764916479588, 0.09648897498846054, -0.07443448156118393, 0.11269921064376831, 0.022389782592654228, 0.05270254239439964, 0.0007806930225342512, -0.017809873446822166, -0.030918367207050323, 0.009563528001308441, -0.03655587136745453, -0.04971519112586975, 0.016002440825104713, 0.010871008969843388, -0.08009275794029236, -0.06424400955438614, 0.060794100165367126, 0.07551268488168716, -0.07157555967569351, 0.041829030960798264, 0.05384322628378868, -0.0615287646651268, -0.12148573994636536, 0.01840042509138584, -0.13862565159797668, -0.015406884253025055, 0.0060927970334887505, -0.028690584003925323, 0.09391964972019196, 0.06964173913002014, 0.0690218135714531, 0.08923275768756866, -0.059597425162792206, -0.008705059997737408, -0.01913823001086712, -0.015944240614771843, -0.09994279593229294, -0.10932224988937378, -0.022529922425746918, -0.021889008581638336, -0.001371267018839717, -0.14189483225345612, 0.0018704204121604562, -0.0263604037463665, 0.08142013847827911, 0.008502203039824963, -0.02622990868985653, 0.034664757549762726, 0.05970962718129158, -0.02609346993267536, -0.033397890627384186, 0.029481902718544006, -0.018810417503118515, -0.06816062331199646, 0.118740513920784, -0.08371467888355255, -0.10308782756328583, 0.0965573638677597, 0.033167753368616104, -0.09723540395498276, -0.0019335589604452252, -0.0016371319070458412, -0.055584389716386795, -0.050144921988248825, -0.07781738042831421, 0.1783411204814911, 0.06119554862380028, 0.13879521191120148, -0.10632394999265671, -0.04482502117753029, 0.019169878214597702, -0.03702259436249733, -0.026680704206228256, 0.17382194101810455, 0.02880416251718998, -0.19087301194667816, 0.09580174833536148, -0.005841054487973452, -0.02261093631386757, 0.18455477058887482, 0.06884294003248215, -0.1065615639090538, -0.0019965951796621084, 0.027120595797896385, -0.001693077734671533, 0.061414387077093124, -0.0630490779876709, -0.014858703128993511, 0.02844424918293953, 0.07282298803329468, 0.0693986788392067, -0.07907500118017197, 0.05428669601678848, 0.0602993443608284, -0.050409819930791855, 0.07213447988033295, -0.024432271718978882, -0.06360060721635818, 0.08970858156681061, 0.02041672356426716, -0.03569932281970978, -0.03969009965658188, -0.03714430332183838, -0.10823909193277359, 0.18958832323551178, -0.10141044110059738, -0.24618875980377197, -0.14919474720954895, 0.033091992139816284, -0.07275380194187164, 0.027171779423952103, 0.058419305831193924, -0.058672308921813965, -0.05768432468175888, -0.10883017629384995, 0.12008582800626755, -0.08108238130807877, -0.05584684759378433, -0.10476985573768616, 0.046963535249233246, -0.013703448697924614, -0.14389678835868835, 0.01550524216145277, -0.012226007878780365, -0.019600942730903625, 0.00013479471090249717, -0.03915492817759514, 0.11413887143135071, 0.11638891696929932, -0.014519327320158482, -0.028918206691741943, 0.01211332343518734, 0.12652306258678436, -0.055613499134778976, 0.061182379722595215, 0.039919521659612656, 0.03785177692770958, 0.0213173795491457, 0.14179202914237976, 0.041746560484170914, -0.04648870229721069, 0.03144790977239609, 0.07292269170284271, -0.03969227895140648, 
-0.26215627789497375, -0.09735610336065292, -0.055327147245407104, 0.002365830820053816, 0.08371835201978683, 0.05411045253276825, -0.060832761228084564, 0.01583157666027546, -0.03415419161319733, -0.0016788736684247851, 0.022283507511019707, 0.05470969155430794, 0.04956038296222687, -0.030459953472018242, 0.08723189681768417, -0.05583001300692558, -0.023585211485624313, 0.095051109790802, 0.03884703665971756, 0.1677798181772232, -0.042315904051065445, 0.19356219470500946, 0.04887828230857849, 0.010487031191587448, -0.009298301301896572, 0.08117211610078812, -0.03973006084561348, 0.015079020522534847, -0.012298351153731346, -0.06617037951946259, -0.013814173638820648, 0.07046470046043396, 0.0009377739625051618, 0.00037175911711528897, -0.038557667285203934, -0.03816359117627144, 0.07104253023862839, 0.2160337269306183, 0.07951178401708603, -0.15623490512371063, -0.08690453320741653, 0.008428046479821205, -0.07832615077495575, -0.0736846923828125, 0.0033376961946487427, 0.1502082645893097, -0.09295687079429626, 0.024509593844413757, 0.011456233449280262, 0.13183452188968658, -0.10620452463626862, -0.016958042979240417, 0.010985655710101128, 0.027191990986466408, -0.02441166155040264, 0.11433213204145432, -0.23846665024757385, 0.16840942203998566, 0.023992251604795456, 0.06673439592123032, -0.046881794929504395, 0.01622437685728073, -0.047982506453990936, -0.006220309995114803, 0.1279677003622055, 0.04138724133372307, -0.08087589591741562, -0.09166012704372406, -0.09639108180999756, -0.020201707258820534, 0.048060644418001175, -0.06025535240769386, 0.08865457028150558, 0.06877001374959946, 0.011486025527119637, -0.028963787481188774, 0.05480824038386345, -0.013412468135356903, -0.14963991940021515, -0.011309239082038403, -0.0275835283100605, -0.036624494940042496, -0.004582172259688377, -0.053507618606090546, -0.07797253131866455, 0.23760953545570374, -0.11883091926574707, -0.11777248978614807, -0.08760438859462738, 0.03769050911068916, 0.10583030432462692, -0.07491178810596466, 0.025932179763913155, 0.023757722228765488, 0.040557973086833954, -0.07962080836296082, -0.027828523889183998, 0.07866644859313965, -0.04719012603163719, -0.058531805872917175, -0.02467925287783146, 0.15116135776042938, 0.06350186467170715, 0.04029100015759468, -0.016323311254382133, 0.06205799803137779, 0.007127056829631329, -0.10751551389694214, -0.004154229070991278, 0.05952579528093338, 0.13566166162490845, 0.03987549990415573, -0.049920693039894104, -0.06576032936573029, -0.0559561587870121, -0.08514579385519028, 0.16829954087734222, 0.16869913041591644, -0.054762739688158035, 0.057830482721328735, 0.17503273487091064, -0.10660123080015182, -0.202538400888443, -0.047675687819719315, 0.08751539140939713, 0.08634541928768158, -0.011061604134738445, -0.16383281350135803, 0.02736179530620575, 0.1256720870733261, -0.0006591486162506044, 0.03928445652127266, -0.3320595622062683, -0.15155726671218872, 0.019255781546235085, 0.03224571794271469, 0.009332948364317417, -0.08864665031433105, -0.04998205602169037, -0.07194261997938156, -0.09011760354042053, 0.08716194331645966, -0.03481011092662811, 0.10628323256969452, 0.002653390634804964, 0.039281055331230164, 0.048858873546123505, -0.04385659098625183, 0.13389931619167328, 0.00954850111156702, 0.022194257006049156, -0.07862289994955063, 0.07846173644065857, 0.008993090130388737, -0.02760978415608406, 0.19292522966861725, -0.06957291066646576, 0.03911628574132919, -0.1413118839263916, -0.071004219353199, -0.06576889753341675, 0.04476736858487129, -0.03701658919453621, 
-0.09052743762731552, -0.05717317387461662, 0.03698352351784706, 0.0551312156021595, -0.02678319625556469, 0.04758527874946594, -0.09242613613605499, 0.03287215530872345, 0.1729191690683365, 0.07646027952432632, 0.02226266823709011, -0.08869985491037369, 0.01894913613796234, -0.010150901973247528, 0.06252765655517578, -0.13236652314662933, 0.0039014918729662895, 0.15148773789405823, 0.023949649184942245, 0.11089397221803665, -0.03896768018603325, -0.12936823070049286, 0.008253876119852066, 0.05554741248488426, -0.10041610896587372, -0.11103314906358719, -0.018020281568169594, -0.028427651152014732, -0.04047866538167, -0.007297911681234837, 0.13352365791797638, -0.09916159510612488, -0.014021910727024078, -0.007879791781306267, 0.04181685298681259, -0.07187440246343613, 0.2201765924692154, 0.039194200187921524, 0.05480063334107399, -0.06219824030995369, 0.12663234770298004, 0.09625724703073502, -0.11152446269989014, 0.059929247945547104, 0.19495314359664917, -0.07231441140174866, -0.05252557247877121, 0.017848264425992966, 0.12952621281147003, -0.06627625226974487, -0.056915283203125, -0.014769427478313446, -0.044236548244953156, 0.019427990540862083, -0.0012599675683304667, 0.04579346626996994, 0.03214409202337265, -0.007885094732046127, -0.045819055289030075, -0.07719595730304718, 0.10142244398593903, 0.04922587051987648, -0.0015929782530292869, -0.027170663699507713, 0.08081372827291489, -0.006937466096132994, 0.0034308049362152815, -0.02294916845858097, 0.0242883563041687, -0.04143930599093437, 0.002706804545596242, -0.0928095281124115, -0.0057347724214196205, -0.0627240538597107, -0.0016843710327520967, -0.043810587376356125, 0.006961049046367407, -0.011507204733788967, 0.007504207082092762, -0.041726160794496536, -0.049201034009456635, -0.07335048168897629, 0.014039279893040657, -0.09765676409006119, -0.05603257566690445, -0.018791578710079193, -0.010975086130201817, 0.05567358061671257, 0.023012958467006683, 0.0023489438463002443, 0.05040072277188301, 0.007982093840837479, 0.07281223684549332, 0.027339885011315346, 0.03858673572540283, 0.018295912072062492, -0.03497767075896263, -0.013953504152595997, 0.03264736011624336, -0.011947120539844036, -0.003028675215318799, -0.0006111000548116863, -0.12785491347312927, -0.05772273242473602, -0.04947446659207344, -0.027225282043218613, -0.0635496973991394, 0.10367559641599655, 0.0701318010687828, 0.08177430182695389, 0.11060840636491776, -0.07954403758049011, 0.07804791629314423, -0.15811051428318024, -0.002570637734606862, 0.02205159328877926, -0.040920525789260864, -0.011630360968410969, 0.0006916947895660996, 0.04323045536875725, -0.08519154042005539, 0.14157970249652863, 0.02802831120789051, 0.05651365593075752, 0.027958456426858902, -0.07576904445886612, 0.013759789057075977, 0.020219959318637848, 0.12251108139753342, -0.03101947344839573, -0.018957030028104782, -0.08350767195224762, 0.08626504242420197, -0.004763120785355568, 0.11299730837345123, 0.0574154369533062, 0.11646424233913422, 0.1290634125471115, 0.05494670569896698, 0.024471020326018333, -0.0645773708820343, -0.07935760915279388, 0.05763403698801994, 0.016280261799693108, 0.0657435804605484, -0.006643770728260279, 0.07129081338644028, 0.167073592543602, -0.17502029240131378, 0.1159207820892334, 0.00630235718563199, -0.08434397727251053, -0.06685589253902435, -0.14852763712406158, -0.07352596521377563, -0.02449071779847145, -0.026667842641472816, -0.12855762243270874, 0.006872877012938261, 0.03040470741689205, 0.06898952275514603, -0.02430555410683155, 0.10793529450893402, 
-0.004800016526132822, -0.11016944795846939, 0.06443402171134949, 0.00591285852715373, 0.06467980146408081, 0.006606383714824915, 0.034201283007860184, 0.06371524184942245, 0.01217550877481699, 0.021433159708976746, 0.04571565240621567, -0.0067262495867908, -0.012184769846498966, -0.0010401519248262048, -0.06069803237915039, -0.04015708714723587, 0.04236196354031563, 0.07707274705171585, 0.1718824803829193, 0.0661158487200737, -0.10092970728874207, -0.026938587427139282, 0.17505963146686554, -0.0467919297516346, -0.0997805967926979, -0.10400466620922089, 0.20841078460216522, 0.028192570433020592, 0.0627247542142868, -0.01070481352508068, -0.09922245144844055, 0.0012877103872597218, 0.09414705634117126, 0.21249346435070038, -0.009198546409606934, -0.02839517407119274, -0.0058909934014081955, -0.01328168623149395, 0.008022570051252842, 0.025322098284959793, 0.030184706673026085, 0.2513231039047241, -0.07023999094963074, 0.06828432530164719, -0.06880025565624237, 0.011021963320672512, -0.006966266315430403, 0.14017026126384735, 0.004624274093657732, 0.0070564295165240765, -0.04645368456840515, 0.10608108341693878, -0.007454189006239176, -0.17125461995601654, -0.007616295013576746, -0.08635873347520828, -0.13232818245887756, 0.011841725558042526, -0.0184972882270813, 0.042311277240514755, 0.06329620629549026, 0.019255317747592926, 0.059404607862234116, 0.007486619055271149, 0.017187204211950302, -0.1097223088145256, -0.10816463828086853, 0.0038128551095724106, 0.022386034950613976, 0.06237149238586426, 0.015056288801133633, 0.10041962563991547, 0.09399988502264023, 0.024519287049770355, -0.06998389214277267, 0.10932526737451553, 0.03345934674143791, 0.014803454279899597, 0.07679291814565659, 0.11681574583053589, -0.016820888966321945, 0.04983241856098175, 0.03891998901963234, -0.0808468610048294, 0.014470276422798634, -0.025387227535247803, -0.03334222733974457, -0.10636382550001144, 0.10378946363925934, -0.03873668983578682, 0.13609568774700165, 0.18368026614189148, -0.0022871079854667187, -0.008029897697269917, -0.06743750721216202, 0.030200272798538208, -0.012160008773207664, 0.07175218313932419, -0.005837678909301758, -0.19664263725280762, 0.017204413190484047, -0.039062004536390305, 0.01856137253344059, -0.22396722435951233, -0.038025565445423126, 0.022841883823275566, -0.03859088197350502, -0.015150943771004677, 0.07796594500541687, 0.06114044040441513, 0.018015680834650993, -0.027964888140559196, -0.06197311729192734, 0.014067948795855045, 0.09998102486133575, -0.11506720632314682, -0.11731347441673279 ]
null
null
transformers
# legal_t5_small_multitask_en_cs model

Model for translating legal text from English to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translating legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_cs"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_cs", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "Text proposed by the Commission"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_cs model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_en_cs | 36.226 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
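The training procedure above names AdaFactor with an inverse square root learning rate schedule, but no training code is published with the card. The following is a minimal sketch of how such an optimizer could be configured with the `transformers` Adafactor implementation; the concrete hyperparameter choices are assumptions, not the authors' settings.

```python
# Hypothetical optimizer setup mirroring the card's description (AdaFactor with
# an inverse square root learning rate schedule). The hyperparameters shown are
# assumptions; the actual training code for this model is not public.
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_cs")

# relative_step=True lets Adafactor compute its own inverse-square-root
# learning rate internally; warmup_init smooths the earliest updates.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```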
{"language": "English Cszech", "tags": ["translation English Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Text proposed by the Commission"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_cs model
=========================================

Model for translating legal text from English to Czech. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_cs model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07375800609588623, 0.13510584831237793, -0.003300243988633156, 0.07855281978845596, 0.07002899795770645, 0.011510307900607586, 0.014086236245930195, 0.11097179353237152, -0.03878377750515938, 0.07781186699867249, 0.05558587983250618, 0.012561759911477566, 0.05119285359978676, 0.03540652245283127, 0.04431478679180145, -0.1929013431072235, -0.008482993580400944, -0.023026611655950546, -0.03783358633518219, 0.11094669252634048, 0.08270256221294403, -0.05936555564403534, 0.05114587023854256, -0.034799203276634216, -0.08267423510551453, 0.017679525539278984, -0.08279833197593689, -0.04020709916949272, 0.10029250383377075, 0.07051262259483337, 0.07745764404535294, -0.015155301429331303, 0.06318989396095276, -0.19468232989311218, 0.00008848223660606891, 0.05994589999318123, -0.011990590952336788, 0.05326675996184349, 0.0904470756649971, -0.004201747011393309, 0.19538775086402893, -0.06030135974287987, 0.03014296479523182, 0.05309809744358063, -0.10797566920518875, -0.10059680044651031, -0.07088519632816315, 0.03889859467744827, 0.09602604806423187, 0.13561539351940155, -0.03135965019464493, 0.037626590579748154, -0.011049669235944748, 0.07987335324287415, 0.1158270463347435, -0.2587537169456482, -0.020866189152002335, 0.04602406173944473, 0.05314856022596359, 0.0857265293598175, -0.05103102698922157, 0.009519262239336967, 0.053898897022008896, 0.06317330151796341, 0.07277358323335648, -0.04533211514353752, 0.005709839053452015, -0.007025483530014753, -0.12715016305446625, -0.07153834402561188, 0.1648075431585312, 0.03231972083449364, -0.025836560875177383, -0.1033400148153305, -0.0613810196518898, -0.0818016529083252, 0.002686563413590193, -0.019167132675647736, 0.0054562827572226524, -0.005570273846387863, 0.02860872820019722, -0.06560127437114716, -0.11848719418048859, -0.07009336352348328, -0.06834165006875992, 0.06953662633895874, 0.029115155339241028, 0.020003631711006165, 0.03265972062945366, 0.07854608446359634, -0.10581847280263901, -0.07708312571048737, 0.0067227850668132305, 0.019018108025193214, -0.09231037646532059, 0.017714180052280426, -0.0030679721385240555, -0.17227835953235626, -0.006554367020726204, -0.02421918325126171, -0.06659635156393051, 0.025717858225107193, 0.04622627794742584, 0.03501473739743233, 0.0511711910367012, 0.11951981484889984, -0.09973376989364624, -0.10631459206342697, -0.025855017825961113, -0.005939294584095478, -0.0061112926341593266, 0.009381250478327274, -0.06534359604120255, -0.027039898559451103, 0.014488574117422104, 0.0595412403345108, -0.0009580943151377141, 0.010750584304332733, -0.005352315492928028, -0.028939858078956604, 0.12555493414402008, -0.09431078284978867, 0.008663424290716648, 0.007928110659122467, -0.08770643174648285, -0.026616057381033897, 0.06470740586519241, -0.03399541229009628, -0.09871616214513779, 0.05117278918623924, -0.04738540202379227, -0.00698721781373024, -0.0992889404296875, -0.17215117812156677, 0.004634247627109289, -0.004379754886031151, -0.06048523634672165, -0.09364086389541626, -0.13157983124256134, -0.07188840210437775, 0.015562640503048897, -0.04933115467429161, 0.007386669050902128, -0.08970199525356293, 0.0003500251623336226, 0.03387473523616791, -0.012769109569489956, 0.06780578196048737, -0.04444345086812973, 0.047389473766088486, 0.006109702866524458, 0.05615611746907234, 0.006457047536969185, 0.030878357589244843, -0.07574564963579178, 0.04451684281229973, -0.06870588660240173, 0.14031970500946045, -0.009089620783925056, 0.010500708594918251, -0.13620078563690186, -0.05070970207452774, 
-0.07839174568653107, 0.038790956139564514, 0.07224098592996597, 0.12880975008010864, -0.1982886642217636, -0.03260008245706558, 0.20692653954029083, -0.06320246309041977, -0.0726715549826622, 0.10938616842031479, -0.029256807640194893, 0.037015173584222794, 0.07786019891500473, 0.08487837761640549, 0.02678532712161541, -0.04297343268990517, -0.06349234282970428, -0.011369779706001282, 0.018114373087882996, 0.017304295673966408, 0.09709466993808746, -0.07029461860656738, 0.08979494869709015, 0.008152988739311695, 0.04959557577967644, 0.017586136236786842, -0.04574969410896301, -0.03622239828109741, -0.0029486047569662333, -0.04283609613776207, -0.031665075570344925, 0.006917503662407398, 0.021396417170763016, -0.06278121471405029, -0.07926030457019806, 0.03428899124264717, 0.10429311543703079, -0.06755715608596802, 0.03137911111116409, 0.010408077389001846, -0.0469806045293808, -0.1201067566871643, 0.018350061029195786, -0.16231633722782135, 0.009165787138044834, 0.028997374698519707, -0.022359326481819153, 0.10296943783760071, 0.02296258509159088, 0.05276774615049362, 0.08543982356786728, -0.04999607056379318, -0.01981540583074093, -0.014311340637505054, -0.017768634483218193, -0.1130521148443222, -0.0995764210820198, -0.04087400063872337, -0.011492826044559479, 0.026187993586063385, -0.1571480929851532, 0.01589825004339218, -0.029554540291428566, 0.08579071611166, 0.00852050632238388, -0.03660118579864502, 0.031372010707855225, 0.05483788996934891, -0.030845822766423225, -0.029581688344478607, 0.03181983903050423, -0.00555663276463747, -0.049594566226005554, 0.09889087080955505, -0.13125096261501312, -0.10451795905828476, 0.10501295328140259, 0.027390746399760246, -0.08731978386640549, -0.010952198877930641, -0.008760691620409489, -0.0509796142578125, -0.04844757542014122, -0.0725443884730339, 0.19722814857959747, 0.05158190801739693, 0.1579517275094986, -0.10485003143548965, -0.06018759310245514, 0.015220885165035725, -0.01948016881942749, -0.01048698928207159, 0.155258446931839, 0.0357184074819088, -0.1778826117515564, 0.08755496889352798, 0.03193427622318268, -0.011927611194550991, 0.13018132746219635, 0.056270502507686615, -0.10813883692026138, -0.022313343361020088, 0.031370859593153, -0.0009728423901833594, 0.04800862818956375, -0.08787806332111359, -0.033830542117357254, 0.030700575560331345, 0.0679437667131424, 0.06613365560770035, -0.10051213949918747, 0.0696408748626709, 0.07153517752885818, -0.03566858917474747, 0.06428058445453644, -0.03964261710643768, -0.04499911889433861, 0.10382544249296188, 0.013405079022049904, -0.010939927771687508, -0.04557853564620018, -0.03960014879703522, -0.11270852386951447, 0.18811418116092682, -0.09642783552408218, -0.2251969426870346, -0.14122958481311798, 0.016211524605751038, -0.0379231721162796, 0.014609195291996002, 0.034810181707143784, -0.04503551870584488, -0.040171317756175995, -0.1086430773139, 0.08280407637357712, -0.10795121639966965, -0.04587578400969505, -0.08212150633335114, 0.044872574508190155, -0.01863493025302887, -0.1419420689344406, 0.02129281498491764, 0.002476772991940379, -0.04388730600476265, 0.0028351133223623037, -0.04624730721116066, 0.11188797652721405, 0.15680710971355438, -0.02290666662156582, -0.022110778838396072, 0.000333475909428671, 0.12105423212051392, -0.06160854175686836, 0.04178500175476074, 0.050207383930683136, 0.049511563032865524, 0.01758505403995514, 0.10672751814126968, 0.041853297501802444, -0.048865821212530136, 0.04189293459057808, 0.04971771687269211, -0.0289378073066473, -0.26068419218063354, 
-0.09773707389831543, -0.0638643205165863, -0.016268940642476082, 0.09265880286693573, 0.05231943726539612, -0.03229913488030434, 0.00704729650169611, -0.042837489396333694, 0.022144457325339317, 0.003375212661921978, 0.06433893740177155, 0.04539524391293526, -0.017580552026629448, 0.07912662625312805, -0.0610174834728241, -0.02316190116107464, 0.08727360516786575, 0.03725802153348923, 0.184993714094162, -0.03261598199605942, 0.24894236028194427, 0.053359001874923706, 0.04211295023560524, 0.012546360492706299, 0.07824215292930603, -0.0448690690100193, 0.030044248327612877, -0.023874742910265923, -0.06279534101486206, -0.0101172449067235, 0.06594428420066833, 0.014313056133687496, 0.02367488108575344, -0.05980607867240906, -0.062174972146749496, 0.07260814309120178, 0.22225041687488556, 0.07377311587333679, -0.1803974211215973, -0.0638735294342041, 0.0017643780447542667, -0.08165302127599716, -0.07050719112157822, 0.010038903914391994, 0.14870083332061768, -0.089540995657444, -0.014243257232010365, 0.024908188730478287, 0.13345971703529358, -0.12098729610443115, -0.02775556407868862, 0.004448662046343088, 0.009985841810703278, -0.02714063972234726, 0.11324428766965866, -0.24088457226753235, 0.1798231452703476, 0.029840387403964996, 0.05619721859693527, -0.042544592171907425, 0.007373756263405085, -0.04060050845146179, -0.02386961132287979, 0.10201328992843628, 0.022814687341451645, -0.0400691032409668, -0.10957010090351105, -0.11054626852273941, -0.020913220942020416, 0.06733091920614243, -0.06511993706226349, 0.10438038408756256, 0.06133360415697098, -0.006010269746184349, -0.011498012579977512, 0.06420379132032394, -0.01724809966981411, -0.16453932225704193, -0.0099741630256176, -0.004615140147507191, -0.04142061620950699, -0.013985040597617626, -0.04455186054110527, -0.03388124704360962, 0.21981322765350342, -0.12312039732933044, -0.07758407294750214, -0.08065421879291534, 0.020218227058649063, 0.12318946421146393, -0.07653769850730896, 0.03247847035527229, 0.011134820058941841, 0.04304083436727524, -0.04683825373649597, -0.039189714938402176, 0.08101726323366165, -0.05138339102268219, -0.07896275073289871, -0.055477626621723175, 0.16628345847129822, 0.04681553319096565, 0.05832112208008766, -0.028736602514982224, 0.049945369362831116, -0.0007496806792914867, -0.09786603599786758, -0.009204620495438576, 0.05524800345301628, 0.13573896884918213, 0.061753854155540466, -0.06385919451713562, -0.06550973653793335, -0.07340604811906815, -0.07990588247776031, 0.16306138038635254, 0.1756141185760498, -0.05219777673482895, 0.0323876291513443, 0.1665300875902176, -0.11597523093223572, -0.15662196278572083, -0.051169976592063904, 0.06389626860618591, 0.07585260272026062, -0.03648500517010689, -0.1734265834093094, 0.02003542147576809, 0.11758919060230255, -0.0019094509771093726, 0.055958256125450134, -0.3682776093482971, -0.14410805702209473, 0.030185334384441376, 0.03734526038169861, -0.006985969841480255, -0.1327938437461853, -0.05718763545155525, -0.05844659358263016, -0.10899285227060318, 0.1217922568321228, -0.03837361931800842, 0.10439583659172058, -0.008513008244335651, 0.02839498221874237, 0.039212360978126526, -0.04170988127589226, 0.12056749314069748, 0.01481244619935751, 0.031305231153964996, -0.04864373058080673, 0.026318645104765892, 0.015111129730939865, -0.026861252263188362, 0.15416565537452698, -0.06038738414645195, 0.0584026463329792, -0.1657150238752365, -0.05592026934027672, -0.06741506606340408, 0.010444091632962227, -0.03736351430416107, -0.06691108644008636, 
-0.054914433509111404, 0.02204848825931549, 0.038254428654909134, -0.012020152062177658, 0.028355523943901062, -0.06057577580213547, 0.05359150096774101, 0.17939351499080658, 0.06947648525238037, 0.014310240745544434, -0.09779169410467148, 0.0015919003635644913, -0.007057711016386747, 0.061397191137075424, -0.15227529406547546, -0.0028737979009747505, 0.14225926995277405, 0.04776599630713463, 0.1254199892282486, -0.01210800465196371, -0.13062670826911926, 0.010510917752981186, 0.057749632745981216, -0.09257929027080536, -0.10726085305213928, -0.013410457409918308, 0.04619181156158447, -0.06783800572156906, -0.015916965901851654, 0.11468420922756195, -0.07704880833625793, -0.03158426284790039, -0.005852541420608759, 0.02491830289363861, -0.06147809699177742, 0.21700362861156464, 0.041613928973674774, 0.05724768340587616, -0.05775816738605499, 0.10513503849506378, 0.1285349726676941, -0.1073923408985138, 0.032122962176799774, 0.20134401321411133, -0.07195840775966644, -0.050353582948446274, 0.014703947119414806, 0.10540277510881424, -0.042731836438179016, -0.044362060725688934, -0.012983663007616997, -0.027216294780373573, 0.02443430759012699, 0.03522107005119324, 0.041207101196050644, 0.04756544902920723, -0.013226460665464401, -0.03635416924953461, -0.08807159960269928, 0.08990725874900818, 0.050336457788944244, 0.006282606162130833, -0.02098606713116169, 0.10852404683828354, 0.014447805471718311, 0.010036023333668709, -0.013662764802575111, -0.006862031761556864, -0.05239619314670563, 0.019598688930273056, -0.07122495025396347, 0.0016259512631222606, -0.06731842458248138, -0.025389429181814194, -0.032978273928165436, 0.006038692779839039, -0.01593240723013878, 0.0010773175163194537, -0.043865203857421875, -0.05205642431974411, -0.05665344372391701, 0.03258514776825905, -0.09293057769536972, -0.03980940580368042, 0.0057075852528214455, -0.033440202474594116, 0.062294140458106995, 0.03925558552145958, 0.004124780185520649, 0.036106888204813004, -0.011204873211681843, 0.050872802734375, 0.021218260750174522, 0.04769602045416832, 0.009164254181087017, -0.05829431116580963, 0.01544184423983097, 0.025327183306217194, 0.00026171968784183264, -0.0027324839029461145, 0.00619077030569315, -0.13123616576194763, -0.0596894770860672, -0.05639097839593887, -0.020549319684505463, -0.06731919199228287, 0.08226358145475388, 0.06980635970830917, 0.06718545407056808, 0.08722382038831711, -0.06035629287362099, 0.07254750281572342, -0.17604891955852509, -0.0035491257440298796, 0.007342417724430561, -0.027362577617168427, -0.026298753917217255, -0.0030067532788962126, 0.049455806612968445, -0.06854726374149323, 0.13942652940750122, -0.006729803513735533, 0.04296455532312393, 0.024957936257123947, -0.05984070524573326, 0.005092330742627382, 0.016177693381905556, 0.12774623930454254, -0.006731432396918535, -0.03165875002741814, -0.05074091628193855, 0.08349364250898361, -0.006169677712023258, 0.11416999995708466, 0.05322422459721565, 0.12137039005756378, 0.13584443926811218, 0.046063460409641266, 0.012813166715204716, -0.0849791094660759, -0.04792541265487671, 0.04352430999279022, -0.011409604921936989, 0.059796303510665894, -0.026232946664094925, 0.11384172737598419, 0.16208013892173767, -0.1654679924249649, 0.11269800364971161, 0.004291371442377567, -0.07836440205574036, -0.05079060420393944, -0.10869645327329636, -0.05548029765486717, -0.03683936968445778, -0.034831203520298004, -0.1157681941986084, 0.005388619843870401, 0.049307990819215775, 0.05323702096939087, -0.02308398298919201, 0.09876983612775803, 
0.021211206912994385, -0.09996642172336578, 0.0645524263381958, 0.004396761767566204, 0.07183048129081726, -0.01850777305662632, 0.04027967527508736, 0.06117628142237663, -0.017436521127820015, 0.05475416034460068, 0.05614682286977768, -0.016696516424417496, 0.015637069940567017, 0.0011600133730098605, -0.06962895393371582, -0.0400477759540081, 0.02634015679359436, 0.07511836290359497, 0.18989627063274384, 0.06066523864865303, -0.08465410768985748, -0.022383756935596466, 0.18315625190734863, -0.05296287685632706, -0.08440788090229034, -0.09508829563856125, 0.20843005180358887, 0.03786664083600044, 0.02087528631091118, 0.001014393288642168, -0.09736861288547516, -0.006872813683003187, 0.12060295045375824, 0.19307611882686615, -0.03489851951599121, -0.027905425056815147, 0.022604962810873985, -0.009064446203410625, 0.008702435530722141, 0.02228645421564579, 0.055056288838386536, 0.2773289978504181, -0.07751715183258057, 0.05566134676337242, -0.060256242752075195, 0.0027962778694927692, 0.0024816100485622883, 0.13509902358055115, -0.008898680098354816, 0.006300341337919235, -0.040457647293806076, 0.07872256636619568, -0.0076379054225981236, -0.16304780542850494, -0.0123670669272542, -0.10985089838504791, -0.10800517350435257, 0.014818457886576653, -0.03793027251958847, 0.041248153895139694, 0.06676410883665085, 0.007844864390790462, 0.03712816908955574, 0.027092715725302696, 0.013921340927481651, -0.12058976292610168, -0.11959715932607651, 0.024686256423592567, -0.02714313380420208, 0.08809324353933334, 0.0007452979334630072, 0.11939670890569687, 0.09606923162937164, 0.02156728133559227, -0.08277089893817902, 0.1011907160282135, 0.02888290025293827, 0.01979837939143181, 0.07205629348754883, 0.11161836981773376, -0.003025763900950551, 0.047371771186590195, 0.04172001779079437, -0.051058582961559296, 0.03800850734114647, -0.05582737922668457, -0.014504010789096355, -0.13781806826591492, 0.06941016763448715, -0.023864464834332466, 0.1438172459602356, 0.1647345870733261, -0.023045428097248077, -0.005875709000974894, -0.044509612023830414, 0.008187860250473022, -0.00675637973472476, 0.08587254583835602, -0.018946487456560135, -0.20245906710624695, 0.030312955379486084, -0.05657665431499481, 0.030690113082528114, -0.263937771320343, -0.04241427034139633, 0.027925362810492516, -0.05368364229798317, -0.030158907175064087, 0.08622310310602188, 0.07755628973245621, 0.035942573100328445, -0.03661178797483444, -0.07765744626522064, -0.003016089089214802, 0.10832858830690384, -0.12661728262901306, -0.11440831422805786 ]
null
null
transformers
# legal_t5_small_multitask_en_de model

Model for translating legal text from English to Deutsch. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_de model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from English to Deutsch.

### How to use

Here is how to use this model to translate legal text from English to Deutsch in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_de"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_de", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "Reiterates its call on the Commission to submit a proposal to the Parliament and Council as soon as possible in order to ensure that bunker oil for engine fuel in new ships is stored in safer, double-hull tanks since freight or container ships often contain heavy fuel as engine fuel in their bunkers the quantity of which may considerably exceed the cargoes of smaller oil tankers; considers that, before submitting such a proposal, the Commission should ascertain whether or not the existing IMO rules laid down in Resolution MEPC.141(54) are sufficient to guarantee the safe transport of bunker oil used as fuel;"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_de model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:----------:|
| legal_t5_small_multitask_en_de | 41.337 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
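The card above names AdaFactor with an inverse square root learning-rate schedule but does not show the configuration. Below is a minimal sketch of that optimizer setup using the `Adafactor` implementation in `transformers`; it is an illustration under stated assumptions, not the authors' actual training code, and the `t5-small` checkpoint loaded here is a stand-in.

```python
# Hedged sketch: Adafactor with an inverse square root LR schedule, as the
# card describes. Not the authors' code; "t5-small" is a stand-in checkpoint.
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor

model = T5ForConditionalGeneration.from_pretrained("t5-small")

# With relative_step=True and lr=None, Adafactor computes the step size
# internally as min(1e-2, 1/sqrt(t)) -- i.e. an inverse square root decay;
# warmup_init adds a short linear warmup before that decay takes over.
optimizer = Adafactor(
    model.parameters(),
    lr=None,               # schedule is computed internally
    relative_step=True,
    scale_parameter=True,
    warmup_init=True,
)
```

In a training loop this optimizer is stepped like any other PyTorch optimizer (`loss.backward(); optimizer.step(); optimizer.zero_grad()`).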
{"language": "English Deustch", "tags": ["translation English Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Reiterates its call on the Commission to submit a proposal to the Parliament and Council as soon as possible in order to ensure that bunker oil for engine fuel in new ships is stored in safer, double-hull tanks since freight or container ships often contain heavy fuel as engine fuel in their bunkers the quantity of which may considerably exceed the cargoes of smaller oil tankers; considers that, before submitting such a proposal, the Commission should ascertain whether or not the existing IMO rules laid down in Resolution MEPC.141(54) are sufficient to guarantee the safe transport of bunker oil used as fuel;"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_de model
=========================================

Model for translating legal text from English to Deutsch. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_de model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from English to Deutsch.

### How to use

Here is how to use this model to translate legal text from English to Deutsch in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_de model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07162775099277496, 0.13254991173744202, -0.003648351412266493, 0.08667133003473282, 0.07228175550699234, 0.012594237923622131, 0.015489190816879272, 0.10787667334079742, -0.053959060460329056, 0.07101786881685257, 0.05782322585582733, 0.018740834668278694, 0.05257728695869446, 0.033733680844306946, 0.045556627213954926, -0.20136074721813202, -0.007675280328840017, -0.02074209600687027, -0.04559636488556862, 0.09499158710241318, 0.0911005511879921, -0.05439341068267822, 0.051399506628513336, -0.03993060439825058, -0.06908982992172241, 0.035501256585121155, -0.08911121636629105, -0.03933534398674965, 0.10720542818307877, 0.07466170191764832, 0.07727362960577011, -0.015941835939884186, 0.06560280174016953, -0.19197000563144684, -0.0015281245578080416, 0.06837304681539536, -0.010333998128771782, 0.04137489199638367, 0.09046530723571777, -0.0003722001565620303, 0.1877826303243637, -0.06441549956798553, 0.03832795098423958, 0.04983803257346153, -0.11334389448165894, -0.1242097020149231, -0.07018838077783585, 0.02441738359630108, 0.09395727515220642, 0.13972947001457214, -0.03475528210401535, 0.020911455154418945, -0.0028456682339310646, 0.07469647377729416, 0.09603186696767807, -0.2376781404018402, -0.017439140006899834, 0.04144861176609993, 0.057953301817178726, 0.08161333203315735, -0.041872382164001465, 0.0027875537052750587, 0.061135996133089066, 0.0758354589343071, 0.06783875823020935, -0.0432533398270607, 0.025867654010653496, -0.009963858872652054, -0.12194662541151047, -0.0617600716650486, 0.15434342622756958, 0.028704499825835228, -0.03156806528568268, -0.10165507346391678, -0.05987048149108887, -0.07804039120674133, 0.008235309273004532, -0.043749503791332245, 0.012753593735396862, -0.00023233762476593256, 0.03944262117147446, -0.06452567875385284, -0.10922618955373764, -0.06936390697956085, -0.07907691597938538, 0.06003955006599426, 0.03697695583105087, 0.012183420360088348, 0.018653785809874535, 0.07976672798395157, -0.1142808124423027, -0.08200845122337341, 0.003296569688245654, 0.015511447563767433, -0.07742927223443985, 0.018701493740081787, -0.009343425743281841, -0.1781880408525467, -0.00026391370920464396, -0.004699754994362593, -0.08019344508647919, 0.02357448823750019, 0.03594972565770149, 0.024607477709650993, 0.05843622609972954, 0.1173570454120636, -0.10369427502155304, -0.12173832207918167, -0.018181268125772476, -0.004174747038632631, 0.002131242537871003, 0.017741356045007706, -0.06443525850772858, -0.029186338186264038, 0.019073376432061195, 0.059826258569955826, 0.003379472065716982, 0.0037434548139572144, -0.008935409598052502, -0.02970033884048462, 0.13609737157821655, -0.09585169702768326, 0.005484877619892359, 0.0015745284035801888, -0.09133052080869675, -0.02163369581103325, 0.059940405189991, -0.030824415385723114, -0.09603532403707504, 0.05077953264117241, -0.04015861079096794, -0.022361597046256065, -0.101081483066082, -0.18289095163345337, 0.002194059081375599, -0.003738704603165388, -0.04570247605443001, -0.1028103455901146, -0.15040208399295807, -0.07488688081502914, 0.013547934591770172, -0.056537508964538574, 0.008370282128453255, -0.09301294386386871, 0.005237802863121033, 0.031226489692926407, -0.01335504837334156, 0.058291252702474594, -0.04397479072213173, 0.04518178477883339, 0.0197430532425642, 0.06111506372690201, -0.0008919597021304071, 0.04295778647065163, -0.07175671309232712, 0.04385320097208023, -0.07064557820558548, 0.15405595302581787, -0.0034760229755192995, 0.012839684262871742, -0.1386418640613556, -0.05049676448106766, 
-0.08405515551567078, 0.04746212065219879, 0.08461522310972214, 0.12759694457054138, -0.21343441307544708, -0.03136231750249863, 0.19846436381340027, -0.06571658700704575, -0.06483159959316254, 0.11128885298967361, -0.03494403883814812, 0.040225472301244736, 0.08968246728181839, 0.08720263838768005, 0.029401415959000587, -0.03915882483124733, -0.05831923335790634, -0.018175214529037476, 0.0107602309435606, 0.035846877843141556, 0.0903555303812027, -0.07146363705396652, 0.09522373974323273, 0.006412679795175791, 0.05816981568932533, 0.01969211734831333, -0.03728562965989113, -0.03004339151084423, 0.006900288164615631, -0.04357277229428291, -0.03423939645290375, 0.006811094004660845, 0.01649749092757702, -0.06558176130056381, -0.08112334460020065, 0.05607074126601219, 0.10263462364673615, -0.06569312512874603, 0.025491466745734215, 0.01712031103670597, -0.0621783472597599, -0.11603670567274094, 0.022496987134218216, -0.15431413054466248, 0.006354526150971651, 0.02380097284913063, -0.04523769021034241, 0.1013232097029686, 0.03118295967578888, 0.052857693284749985, 0.08423062413930893, -0.05320650711655617, -0.02064543217420578, -0.02799462340772152, -0.012159192003309727, -0.10289067775011063, -0.09843309968709946, -0.02599152736365795, -0.018371395766735077, 0.0042373305186629295, -0.14878854155540466, 0.016562387347221375, -0.039569541811943054, 0.0857030525803566, 0.010375826619565487, -0.029421348124742508, 0.025052735581994057, 0.058595914393663406, -0.0352214053273201, -0.037233252078294754, 0.022440897300839424, -0.01232913602143526, -0.04215744510293007, 0.09253700077533722, -0.12895412743091583, -0.09920675307512283, 0.10931108891963959, 0.02956845797598362, -0.09538622945547104, 0.010962138883769512, -0.003975389990955591, -0.0571458637714386, -0.045721232891082764, -0.08680825680494308, 0.2090805470943451, 0.04676297679543495, 0.15534697473049164, -0.09990686178207397, -0.05818590894341469, 0.010398549027740955, -0.006452734116464853, -0.019716326147317886, 0.15071837604045868, 0.03407156839966774, -0.17671719193458557, 0.09190783649682999, 0.03640725463628769, -0.009922793135046959, 0.1199503093957901, 0.0607234351336956, -0.1084645539522171, -0.018676679581403732, 0.024934906512498856, -0.002753038890659809, 0.0425674244761467, -0.08899858593940735, -0.02884206920862198, 0.021898500621318817, 0.0772651806473732, 0.07132016122341156, -0.09163423627614975, 0.06944935768842697, 0.07600120455026627, -0.0365525484085083, 0.05751071497797966, -0.036935318261384964, -0.047363441437482834, 0.10135357826948166, 0.022566718980669975, -0.015705808997154236, -0.038554079830646515, -0.03420597314834595, -0.10393466800451279, 0.18696263432502747, -0.10354941338300705, -0.2352720946073532, -0.13190512359142303, 0.01769985258579254, -0.04340941831469536, 0.020286547020077705, 0.04600550979375839, -0.05239475890994072, -0.04543953761458397, -0.09962787479162216, 0.1094616949558258, -0.1173052117228508, -0.04784183204174042, -0.08113732933998108, 0.045256659388542175, -0.012701673433184624, -0.1530078500509262, 0.02028302475810051, 0.007324314210563898, -0.030823515728116035, -0.000401576777221635, -0.039834532886743546, 0.10730843991041183, 0.14229129254817963, -0.01933354325592518, -0.030671922490000725, 0.0010173652553930879, 0.13108910620212555, -0.06460801512002945, 0.044899772852659225, 0.05844578519463539, 0.05343141034245491, 0.01732865907251835, 0.11953578144311905, 0.03963311016559601, -0.06332536041736603, 0.04882942885160446, 0.06123068928718567, -0.023832937702536583, 
-0.2667442560195923, -0.08631817251443863, -0.05837549269199371, -0.018758287653326988, 0.09059445559978485, 0.05376674607396126, -0.024813450872898102, -0.002274540951475501, -0.03922755643725395, 0.043393511325120926, 0.009051605127751827, 0.062005460262298584, 0.07593142986297607, -0.022509772330522537, 0.08099183440208435, -0.060356419533491135, -0.03768094629049301, 0.09218806028366089, 0.04696478694677353, 0.1789952665567398, -0.028012774884700775, 0.22483736276626587, 0.0514395609498024, 0.01214388944208622, 0.0006692596361972392, 0.07472379505634308, -0.0427420511841774, 0.03216826915740967, -0.03229960426688194, -0.057874009013175964, 0.0019324864260852337, 0.071077860891819, 0.006368372589349747, 0.015531573444604874, -0.053652141243219376, -0.05774940922856331, 0.06796712428331375, 0.20585988461971283, 0.08129392564296722, -0.19163815677165985, -0.06260430812835693, 0.0053453147411346436, -0.08114917576313019, -0.07916656881570816, 0.015928952023386955, 0.12471124529838562, -0.07498999685049057, -0.0017913910560309887, 0.03174548223614693, 0.1319446712732315, -0.12777209281921387, -0.020630965009331703, 0.010840676724910736, 0.023502739146351814, -0.02437681518495083, 0.10488825291395187, -0.2616555392742157, 0.16066725552082062, 0.027157297357916832, 0.06210421398282051, -0.032634053379297256, 0.015033241361379623, -0.03939230740070343, -0.025214295834302902, 0.09721909463405609, 0.018787162378430367, -0.03391048684716225, -0.09981852769851685, -0.10089056193828583, -0.02055976912379265, 0.05923730880022049, -0.05831773206591606, 0.0945558249950409, 0.06582298874855042, 0.00147329515311867, -0.006735165603458881, 0.06369909644126892, -0.03608042001724243, -0.16337580978870392, -0.01130153052508831, -0.016964811831712723, -0.02527589537203312, -0.01116913091391325, -0.04320817440748215, -0.039571043103933334, 0.21328604221343994, -0.11911537498235703, -0.08897104114294052, -0.07747561484575272, 0.03204377740621567, 0.11979985237121582, -0.07352767884731293, 0.033867813646793365, 0.01644856110215187, 0.0315583273768425, -0.047563016414642334, -0.04611051455140114, 0.08569049835205078, -0.04807858169078827, -0.06792604178190231, -0.06144978851079941, 0.14948670566082, 0.05436113476753235, 0.0486564040184021, -0.027532730251550674, 0.045894213020801544, -0.0006095306598581374, -0.09336663037538528, -0.003561324207112193, 0.06030043959617615, 0.13115638494491577, 0.04969910904765129, -0.05707715079188347, -0.06513729691505432, -0.07652479410171509, -0.08258074522018433, 0.1489686667919159, 0.17091064155101776, -0.0476626381278038, 0.024982672184705734, 0.16690051555633545, -0.11479911208152771, -0.15978389978408813, -0.04676617309451103, 0.06439752131700516, 0.07743457704782486, -0.033026233315467834, -0.18575897812843323, 0.015878327190876007, 0.12551161646842957, 0.002742029493674636, 0.07090576738119125, -0.3662947118282318, -0.14697317779064178, 0.04053974524140358, 0.0321122407913208, -0.003271790686994791, -0.13336938619613647, -0.05275901034474373, -0.0710020586848259, -0.10271947830915451, 0.1057477742433548, -0.03388355299830437, 0.09446702152490616, -0.009395970031619072, 0.03501003980636597, 0.03684840723872185, -0.03407137840986252, 0.1267756223678589, 0.01609559915959835, 0.03952719643712044, -0.05211504548788071, 0.03674374520778656, 0.009341691620647907, -0.026312202215194702, 0.15963678061962128, -0.050428394228219986, 0.05893775820732117, -0.15357813239097595, -0.06572970747947693, -0.05760187283158302, 0.012483542785048485, -0.03600386902689934, 
-0.0779605507850647, -0.05314983054995537, 0.021230528131127357, 0.032414212822914124, -0.01920330710709095, 0.015705741941928864, -0.05423547327518463, 0.03716472536325455, 0.16089093685150146, 0.0634770616889, 0.02862117812037468, -0.09457597881555557, 0.009797143749892712, -0.009238382801413536, 0.06606283783912659, -0.15426911413669586, 0.0022088137920945883, 0.14533022046089172, 0.03478487581014633, 0.1294921338558197, -0.011644010432064533, -0.13525095582008362, 0.016430141404271126, 0.05689472332596779, -0.07967334985733032, -0.12029720097780228, -0.013046289794147015, 0.02425512671470642, -0.0563603937625885, -0.006572699174284935, 0.12016288191080093, -0.0788227915763855, -0.0322253443300724, -0.005383966024965048, 0.030471596866846085, -0.06580402702093124, 0.2197248786687851, 0.03629794344305992, 0.04742727056145668, -0.05408913642168045, 0.11341948062181473, 0.12927691638469696, -0.1225425973534584, 0.04351026192307472, 0.1959472894668579, -0.0652582123875618, -0.052838925272226334, -0.0029996067751199007, 0.11502549797296524, -0.04625160992145538, -0.04930202290415764, -0.015119710937142372, -0.032510701566934586, 0.031020361930131912, 0.013239418156445026, 0.0341125950217247, 0.05168480798602104, -0.01492181234061718, -0.029902642592787743, -0.08816026896238327, 0.0895405262708664, 0.052469223737716675, 0.012139237485826015, -0.02739449217915535, 0.0872410237789154, 0.014065553434193134, 0.004439713433384895, -0.012592717073857784, 0.00012930427328683436, -0.051669977605342865, 0.014577951282262802, -0.10203520208597183, -0.0032667857594788074, -0.06348079442977905, -0.01725451834499836, -0.031291600316762924, 0.007637897506356239, -0.012745184823870659, 0.004360940307378769, -0.034572064876556396, -0.06171714887022972, -0.051797643303871155, 0.03134885057806969, -0.09586042910814285, -0.04851807653903961, -0.003156546037644148, -0.0327046662569046, 0.06538538634777069, 0.03336489200592041, 0.009308256208896637, 0.025687258690595627, 0.010766205377876759, 0.05735455080866814, 0.02589639462530613, 0.04907426983118057, 0.009551464579999447, -0.0700070708990097, 0.011984086595475674, 0.02515195496380329, -0.010417623445391655, -0.010101208463311195, 0.0037964056245982647, -0.13462071120738983, -0.053105488419532776, -0.057525940239429474, -0.02127053216099739, -0.06436526030302048, 0.0746380016207695, 0.07238307595252991, 0.07207079231739044, 0.08871840685606003, -0.06546632945537567, 0.072309710085392, -0.17005924880504608, -0.0055173467844724655, 0.006698684301227331, -0.021826503798365593, -0.01287275180220604, -0.001182902604341507, 0.051667600870132446, -0.0690312534570694, 0.14836335182189941, -0.003263358725234866, 0.04750997945666313, 0.026966923847794533, -0.07922463864088058, -0.0028767939656972885, 0.01917858235538006, 0.1266619712114334, -0.012511814944446087, -0.03117917664349079, -0.07908130437135696, 0.08547665923833847, 0.003451212076470256, 0.12530650198459625, 0.04303351417183876, 0.1395643800497055, 0.13101230561733246, 0.0416412279009819, 0.015100046060979366, -0.08892399072647095, -0.06479375809431076, 0.04188515990972519, 0.004738798830658197, 0.05305252969264984, -0.016469037160277367, 0.09536448866128922, 0.16117629408836365, -0.15818576514720917, 0.11155381798744202, -0.0005271629197522998, -0.07500527054071426, -0.05043911188840866, -0.09413548558950424, -0.05399726331233978, -0.03962373733520508, -0.03744179382920265, -0.11493425816297531, 0.001902733463793993, 0.029873032122850418, 0.055497556924819946, -0.03118138201534748, 0.10220152884721756, 
0.025316089391708374, -0.10547102987766266, 0.0590297095477581, 0.007357921451330185, 0.0673600286245346, -0.0005168287898413837, 0.04291067644953728, 0.05706702917814255, -0.005625523626804352, 0.046208400279283524, 0.05254145339131355, -0.02535843290388584, 0.015146440826356411, -0.0001402035413775593, -0.05818432942032814, -0.04197661206126213, 0.03610134497284889, 0.0558878518640995, 0.20284439623355865, 0.06605887413024902, -0.08955761790275574, -0.014184735715389252, 0.1649252325296402, -0.04872887209057808, -0.0924445316195488, -0.09369561076164246, 0.2227284163236618, 0.05267791822552681, 0.03567613288760185, -0.0013793741818517447, -0.10712924599647522, -0.012580006383359432, 0.104297935962677, 0.2057473212480545, -0.031557347625494, -0.03399238735437393, 0.024117622524499893, -0.008042768575251102, 0.007519380655139685, 0.01763181947171688, 0.06707535684108734, 0.27677497267723083, -0.0710420310497284, 0.055168334394693375, -0.060353998094797134, 0.012177580036222935, 0.0042570773512125015, 0.13762959837913513, -0.0030080617871135473, 0.008728914894163609, -0.036114905029535294, 0.08484338968992233, -0.0007716164691373706, -0.17568109929561615, -0.010109025985002518, -0.10628766566514969, -0.10750390589237213, 0.01432875357568264, -0.047606926411390305, 0.03525395318865776, 0.06468439102172852, 0.012814832851290703, 0.041258007287979126, 0.04081025347113609, 0.009495623409748077, -0.12011285871267319, -0.10831102728843689, 0.019202638417482376, -0.000036705914681078866, 0.0736260712146759, -0.0008394495816901326, 0.11535857617855072, 0.10002248734235764, 0.030888274312019348, -0.08611207455396652, 0.10891824960708618, 0.025174332782626152, 0.03865315020084381, 0.07134340703487396, 0.0960979089140892, -0.004191123880445957, 0.03384082019329071, 0.04430961236357689, -0.049091409891843796, 0.030061833560466766, -0.04580102860927582, -0.0280043575912714, -0.14216896891593933, 0.0805085077881813, -0.026157274842262268, 0.1378834843635559, 0.1616596132516861, -0.01811641827225685, -0.0003581218479666859, -0.044365640729665756, 0.011115611530840397, 0.0008447435684502125, 0.0914226695895195, -0.019674772396683693, -0.19410042464733124, 0.03252813220024109, -0.0497065894305706, 0.032337307929992676, -0.26741763949394226, -0.04306180775165558, 0.03254900127649307, -0.050234485417604446, -0.026420896872878075, 0.0838475301861763, 0.07004635035991669, 0.03656556457281113, -0.042048197239637375, -0.07760999351739883, -0.014406343922019005, 0.1096348911523819, -0.12212149053812027, -0.11693082004785538 ]
null
null
transformers
# legal_t5_small_multitask_en_es model

Model for translating legal text from English to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_es model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from English to Spanish.

### How to use

Here is how to use this model to translate legal text from English to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_es"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_es", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "Amendment 14 Article 5, paragraph 1, point (a)"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_es model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:----------:|
| legal_t5_small_multitask_en_es | 37.404 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
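The Preprocessing paragraph in the card above describes a unigram vocabulary model trained on 88M lines of the parallel corpus. A minimal sketch of that step with the SentencePiece library follows; the input path, vocabulary size, and coverage setting are illustrative assumptions, since the card does not state them.

```python
# Hedged sketch of the vocabulary step: a SentencePiece unigram model over the
# pooled parallel corpus. File names and vocab_size are assumptions, not
# values reported by the authors.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",   # hypothetical file: one sentence per line
    model_prefix="legal_t5_vocab",
    model_type="unigram",          # the card names a unigram model
    vocab_size=32000,              # assumed; not stated in the card
    character_coverage=1.0,
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Amendment 14 Article 5, paragraph 1, point (a)", out_type=str))
```

Note that the card mentions both a unigram model and byte-pair encoding; this sketch shows only the unigram variant, which is SentencePiece's default.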
{"language": "English Spanish", "tags": ["translation English Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Amendment 14 Article 5, paragraph 1, point (a)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_es model
=========================================

Model for translating legal text from English to Spanish. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_es model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from English to Spanish.

### How to use

Here is how to use this model to translate legal text from English to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_es model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07476645708084106, 0.14074495434761047, -0.004485000390559435, 0.07859420031309128, 0.06250517070293427, 0.011744278483092785, 0.01827145181596279, 0.10071345418691635, -0.08230013400316238, 0.06677152216434479, 0.04097960889339447, 0.02709638699889183, 0.06587201356887817, 0.016638997942209244, 0.03158838301897049, -0.18351717293262482, -0.006254893261939287, -0.02317894622683525, -0.029493585228919983, 0.10161181539297104, 0.08267807215452194, -0.06172569468617439, 0.03742377087473869, -0.02625795640051365, -0.0632755234837532, 0.033206757158041, -0.08862261474132538, -0.05780689790844917, 0.10512129217386246, 0.07392176240682602, 0.10023661702871323, 0.006590406410396099, 0.06865165382623672, -0.18193145096302032, -0.011200749315321445, 0.0687338337302208, -0.005228162277489901, 0.05028494819998741, 0.11470554769039154, 0.0045344592072069645, 0.16743524372577667, -0.05038885772228241, 0.03511448949575424, 0.047349102795124054, -0.1265592873096466, -0.14582370221614838, -0.053904369473457336, -0.010938852094113827, 0.10337549448013306, 0.1285330355167389, -0.03817693144083023, 0.048673149198293686, -0.002280652755871415, 0.0684773325920105, 0.06684709340333939, -0.23108501732349396, -0.031093142926692963, 0.005847342777997255, 0.05481172725558281, 0.1008809506893158, -0.03125699236989021, -0.008746661245822906, 0.07688247412443161, 0.056486036628484726, 0.055385321378707886, -0.05357220768928528, -0.03923938795924187, -0.018252437934279442, -0.1309250146150589, -0.06558675318956375, 0.16188034415245056, 0.012729804962873459, -0.032070014625787735, -0.09617126733064651, -0.07035796344280243, -0.055554378777742386, 0.014830022118985653, -0.03689168393611908, 0.012135159224271774, -0.0007762605091556907, 0.04801976680755615, -0.0454140231013298, -0.11976628005504608, -0.06120270490646362, -0.06382431089878082, 0.07652054727077484, 0.054175663739442825, 0.012256339192390442, 0.026127448305487633, 0.08891618996858597, -0.08701132982969284, -0.086431123316288, 0.0287774708122015, 0.011093736626207829, -0.08520203083753586, 0.01583808846771717, 0.0026917741633951664, -0.21035915613174438, -0.011567318812012672, -0.053468428552150726, -0.09931406378746033, 0.03361184895038605, 0.05209911987185478, 0.04004206880927086, 0.06316854059696198, 0.1162131056189537, -0.07837732136249542, -0.0903448686003685, -0.043249331414699554, -0.007841182872653008, -0.020389391109347343, 0.02084941230714321, -0.06770122796297073, -0.028078746050596237, -0.01618976891040802, 0.04936320707201958, 0.013341921381652355, -0.0031550482381135225, -0.033945899456739426, -0.02331412397325039, 0.09872471541166306, -0.08629168570041656, 0.02105162851512432, 0.01313552726060152, -0.09532152116298676, -0.03639378398656845, 0.05269133299589157, -0.022847553715109825, -0.1108570247888565, 0.03942212089896202, -0.028616204857826233, -0.02339029498398304, -0.1180773675441742, -0.1833374798297882, 0.0044948202557861805, -0.011581518687307835, -0.05484571307897568, -0.09303369373083115, -0.1053178608417511, -0.08985381573438644, 0.026364240795373917, -0.064163938164711, 0.009185886941850185, -0.10724752396345139, 0.019029276445508003, 0.02565896324813366, -0.018838487565517426, 0.07858786731958389, -0.04406419023871422, 0.05839098244905472, 0.03583045303821564, 0.06959850341081619, 0.019048435613512993, 0.032573167234659195, -0.09832955151796341, 0.03680320829153061, -0.07220839709043503, 0.15088239312171936, -0.004255438223481178, 0.0013288729824125767, -0.14508377015590668, -0.05585777387022972, -0.08377282321453094, 
0.04335480555891991, 0.08064654469490051, 0.13460920751094818, -0.2074233740568161, -0.024828696623444557, 0.21844740211963654, -0.055832136422395706, -0.07087355852127075, 0.11899467557668686, -0.03130275011062622, 0.06434261053800583, 0.07073052227497101, 0.07145742326974869, 0.03752492740750313, -0.04110311344265938, -0.045835841447114944, -0.005509581882506609, 0.030223805457353592, 0.03919346630573273, 0.09432793408632278, -0.07714662700891495, 0.07629168778657913, -0.0008178736316040158, 0.030997730791568756, 0.02582021802663803, -0.04087846726179123, -0.03765757381916046, 0.011817514896392822, -0.03467010706663132, -0.008949379436671734, 0.025692738592624664, 0.019217772409319878, -0.0674838051199913, -0.08345698565244675, -0.008982349187135696, 0.08164417743682861, -0.060737740248441696, 0.0274370014667511, 0.010433463379740715, -0.05808262526988983, -0.12249062210321426, 0.0084597272798419, -0.1567765474319458, 0.007269063498824835, 0.03553742542862892, -0.01981736533343792, 0.09190493822097778, 0.03988887369632721, 0.051119767129421234, 0.09448070824146271, -0.04060450941324234, -0.03921660780906677, -0.009511328302323818, -0.021713949739933014, -0.09759466350078583, -0.12183694541454315, -0.020139988511800766, -0.015511034987866879, 0.005248095840215683, -0.14757250249385834, 0.014985955320298672, -0.05049590393900871, 0.08391789346933365, 0.004493670538067818, -0.023935863748192787, 0.018205512315034866, 0.08824465423822403, -0.04406141862273216, -0.034927502274513245, 0.03915929049253464, -0.010455643758177757, -0.048962388187646866, 0.10416918992996216, -0.10778721421957016, -0.12947148084640503, 0.0882222056388855, -0.007651349529623985, -0.09274618327617645, 0.0008632500539533794, -0.008187255822122097, -0.058686740696430206, -0.06198342889547348, -0.06628362834453583, 0.2612600028514862, 0.04203212261199951, 0.1648140698671341, -0.12540045380592346, -0.04995741322636604, 0.016687968745827675, -0.03510359674692154, -0.021089620888233185, 0.1672174036502838, 0.05955739691853523, -0.16633932292461395, 0.09231207519769669, 0.03936508670449257, -0.008680311031639576, 0.1098225936293602, 0.054044585675001144, -0.10930006951093674, 0.0013374787522479892, 0.061808887869119644, 0.0016788498032838106, 0.03728488087654114, -0.0982736349105835, -0.01835181750357151, 0.024770978838205338, 0.06535878032445908, 0.07465947419404984, -0.11962384730577469, 0.0726868137717247, 0.07848600298166275, -0.032287515699863434, 0.033682841807603836, -0.051147859543561935, -0.03547098860144615, 0.09730340540409088, 0.006573560647666454, -0.03615104407072067, -0.03797072917222977, -0.034945450723171234, -0.11109758913516998, 0.1804690659046173, -0.09609347581863403, -0.23548521101474762, -0.14180442690849304, 0.005402591545134783, -0.01851365715265274, 0.02784738317131996, 0.04700428619980812, -0.059182122349739075, -0.041265469044446945, -0.07360441982746124, 0.08915147930383682, -0.1095215380191803, -0.06900496035814285, -0.09029241651296616, 0.0597076416015625, -0.009873686358332634, -0.14602136611938477, 0.0364927239716053, 0.02099066972732544, -0.02918737567961216, -0.00010314284736523405, -0.04264984652400017, 0.12393154203891754, 0.13988111913204193, -0.026768159121274948, -0.040383052080869675, -0.004239503759890795, 0.1361548900604248, -0.07184088975191116, 0.02884237840771675, 0.05802043899893761, 0.03833646699786186, 0.03590908274054527, 0.12492751330137253, 0.037329960614442825, -0.05349693074822426, 0.029311224818229675, 0.0402960330247879, -0.010928284376859665, -0.274740606546402, 
-0.09155652672052383, -0.06014950945973396, -0.03709055483341217, 0.09383658319711685, 0.0404275506734848, -0.05828375741839409, 0.01877446658909321, -0.04295412823557854, 0.03781117498874664, -0.0019997332710772753, 0.05835968255996704, 0.04063897207379341, -0.021024128422141075, 0.06444601714611053, -0.05829167738556862, -0.06545384973287582, 0.0911542996764183, 0.04235837608575821, 0.19825294613838196, -0.04460899904370308, 0.2316126972436905, 0.05855035036802292, 0.060514237731695175, -0.004808302503079176, 0.06949469447135925, -0.04771143198013306, 0.040077466517686844, -0.035562410950660706, -0.057361479848623276, -0.007892436347901821, 0.05937229469418526, 0.00935147050768137, 0.02088795229792595, -0.06428986042737961, -0.05497171729803085, 0.06423169374465942, 0.19026601314544678, 0.07889033854007721, -0.20266222953796387, -0.04023274406790733, 0.006989195942878723, -0.06991955637931824, -0.08765365183353424, 0.010972783900797367, 0.16218125820159912, -0.07614224404096603, -0.0060640969313681126, 0.020184257999062538, 0.13532447814941406, -0.12826432287693024, -0.021856388077139854, 0.0067965686321258545, 0.022380633279681206, -0.012794671580195427, 0.1305176168680191, -0.24155011773109436, 0.1865529865026474, 0.020327426493167877, 0.0648028701543808, -0.03810511529445648, 0.029912807047367096, -0.06956992298364639, -0.014773663133382797, 0.10248410701751709, 0.016849329695105553, -0.030491063371300697, -0.10844738036394119, -0.09962178766727448, -0.017117692157626152, 0.06112214922904968, -0.04413177818059921, 0.09699524194002151, 0.07242821902036667, 0.012013974599540234, -0.008662296459078789, 0.029680965468287468, -0.035199567675590515, -0.17221911251544952, -0.0022779209539294243, -0.008394137024879456, -0.024366680532693863, -0.011859816499054432, -0.03670928254723549, -0.06796016544103622, 0.2021254599094391, -0.13469034433364868, -0.08159290254116058, -0.06777507811784744, 0.008903504349291325, 0.1352994590997696, -0.06554471701383591, 0.013381361030042171, 0.0237644724547863, 0.017505161464214325, -0.03339909762144089, -0.016459310427308083, 0.08867654204368591, -0.06294836848974228, -0.06751739978790283, -0.08430425822734833, 0.12347420305013657, 0.0527968667447567, 0.045455463230609894, -0.019119424745440483, 0.02813580073416233, -0.008029527962207794, -0.08716099709272385, -0.0071641127578914165, 0.028902240097522736, 0.1593971848487854, 0.03741440176963806, -0.06791143864393234, -0.07736286520957947, -0.08354104310274124, -0.08260395377874374, 0.1391441524028778, 0.16862529516220093, -0.0439123772084713, 0.03672337904572487, 0.1879858821630478, -0.13050971925258636, -0.13617151975631714, -0.05730801448225975, 0.09257946908473969, 0.08743345737457275, -0.027107497677206993, -0.18858210742473602, -0.00755158020183444, 0.11156260967254639, 0.006158511620014906, 0.03340421989560127, -0.42302483320236206, -0.1349707394838333, 0.017588960006833076, 0.043033916503190994, -0.00014678185107186437, -0.1164553090929985, -0.06168932095170021, -0.0507945641875267, -0.10395298898220062, 0.08088254183530807, -0.013565241359174252, 0.09214643388986588, 0.00720694987103343, 0.021951762959361076, 0.04942179471254349, -0.03261399269104004, 0.13811244070529938, -0.011778631247580051, 0.025943133980035782, -0.04757203534245491, 0.04697251319885254, 0.01644243113696575, -0.010560551658272743, 0.13453905284404755, -0.040168724954128265, 0.054456405341625214, -0.1783202439546585, -0.04895560070872307, -0.049521688371896744, 0.0030387742444872856, -0.03985634073615074, -0.062448784708976746, 
-0.036464378237724304, 0.013290047645568848, 0.05439332500100136, -0.010902524925768375, 0.01650228723883629, -0.0414198599755764, 0.040806353092193604, 0.16872452199459076, 0.09692399203777313, 0.03373238071799278, -0.09900622814893723, 0.004651722032576799, 0.010656556114554405, 0.06614773720502853, -0.133418008685112, 0.0037511189002543688, 0.14925020933151245, 0.022095585241913795, 0.11267246305942535, -0.011575299315154552, -0.13549067080020905, 0.027753690257668495, 0.07609393447637558, -0.0796273872256279, -0.13723841309547424, -0.019162682816386223, 0.0368514284491539, -0.05161701515316963, -0.014627840369939804, 0.09489163756370544, -0.07044335454702377, -0.03958185762166977, -0.013488270342350006, 0.022965112701058388, -0.052693843841552734, 0.22310157120227814, 0.01117787603288889, 0.05090121552348137, -0.052739668637514114, 0.10619588941335678, 0.15949325263500214, -0.14489038288593292, 0.027560172602534294, 0.18269123136997223, -0.05933056026697159, -0.0406625010073185, 0.04601715877652168, 0.12298709899187088, -0.02414676547050476, -0.06542555242776871, -0.026923730969429016, -0.02721245586872101, 0.011427253484725952, 0.005094744730740786, 0.022445552051067352, 0.040149006992578506, -0.007137593347579241, -0.03559555858373642, -0.09325661510229111, 0.08135835081338882, 0.07278277724981308, 0.015482909977436066, -0.03277521952986717, 0.11995293945074081, 0.014625289477407932, -0.015223600901663303, -0.010037887841463089, 0.012002939358353615, -0.057040952146053314, 0.02924066223204136, -0.07266807556152344, 0.00794149748980999, -0.04976148530840874, -0.02089644968509674, -0.03039153292775154, 0.001675755949690938, -0.006508580408990383, 0.0012535728747025132, -0.041235506534576416, -0.036846891045570374, -0.03893345966935158, 0.04364778846502304, -0.08162214607000351, -0.03817127272486687, 0.005163096822798252, -0.03250650316476822, 0.05368312820792198, 0.009197782725095749, -0.009834581054747105, 0.01688198558986187, -0.022047797217965126, 0.06567508727312088, 0.030249716714024544, 0.05222960561513901, 0.01765250600874424, -0.07767286151647568, 0.027289777994155884, 0.03319614753127098, -0.01683575101196766, -0.014634732156991959, 0.01786927320063114, -0.1347998082637787, -0.039738234132528305, -0.030227815732359886, -0.05058978125452995, -0.06201652064919472, 0.09784460812807083, 0.06998240947723389, 0.06002168729901314, 0.10791664570569992, -0.06746949255466461, 0.07063653320074081, -0.15386751294136047, -0.007325160317122936, 0.025119641795754433, -0.02202487923204899, -0.011581527069211006, -0.010930482298135757, 0.049395594745874405, -0.07213465124368668, 0.12894389033317566, 0.012356585822999477, 0.0785268172621727, 0.005025098100304604, -0.0840582326054573, -0.014076835475862026, 0.015471489168703556, 0.11082647740840912, -0.019170960411429405, -0.02023857645690441, -0.08574644476175308, 0.09362547099590302, 0.0028456018771976233, 0.11285031586885452, 0.02036253735423088, 0.12589038908481598, 0.1516200751066208, 0.04841850325465202, 0.007958579808473587, -0.09237606078386307, -0.060016829520463943, 0.07832133024930954, 0.004296437371522188, 0.05358870327472687, -0.02497013472020626, 0.11131731420755386, 0.13944920897483826, -0.1434956043958664, 0.11030282080173492, 0.006998338736593723, -0.08067379891872406, -0.05807812511920929, -0.09559255838394165, -0.04434540122747421, -0.046889934688806534, -0.035753000527620316, -0.1143210381269455, 0.013320828787982464, 0.0016397001454606652, 0.042200084775686264, -0.03728379309177399, 0.09600755572319031, 0.03436270356178284, 
-0.12057323753833771, 0.06262373924255371, 0.00919361598789692, 0.12016720324754715, -0.02370382845401764, 0.036387961357831955, 0.05792238563299179, 0.003824451006948948, 0.04992642253637314, 0.06023203954100609, -0.01693795435130596, 0.014202534221112728, 0.004366341046988964, -0.0553245022892952, -0.04302945360541344, 0.030379457399249077, 0.07394430041313171, 0.19428326189517975, 0.06175961717963219, -0.06027306988835335, -0.028319086879491806, 0.1855672001838684, -0.055599335581064224, -0.07509491592645645, -0.10130270570516586, 0.20540229976177216, 0.036650486290454865, 0.032266344875097275, 0.023719415068626404, -0.110277459025383, -0.010950528085231781, 0.13224905729293823, 0.1766054779291153, -0.024586455896496773, -0.042302489280700684, 0.026215938851237297, -0.006655361037701368, 0.013531441800296307, 0.03765285387635231, 0.05373704433441162, 0.2802688777446747, -0.08406659215688705, 0.048032764345407486, -0.06130781024694443, 0.0454762764275074, -0.012155752629041672, 0.14986924827098846, -0.012126225046813488, 0.0014499484095722437, -0.04023575037717819, 0.0960024893283844, 0.009465777315199375, -0.1585734486579895, -0.0045467279851436615, -0.09947281330823898, -0.11152784526348114, 0.015358555130660534, 0.020453443750739098, 0.033129800111055374, 0.07118165493011475, 0.011849035508930683, 0.026193892583251, 0.055720310658216476, 0.013713160529732704, -0.09653396159410477, -0.10183729231357574, 0.00562694389373064, -0.07202021777629852, 0.10116633772850037, 0.007029818370938301, 0.14102016389369965, 0.09855331480503082, 0.020974917337298393, -0.08933307975530624, 0.09999847412109375, 0.02281849831342697, 0.03780770301818848, 0.09311167150735855, 0.07871485501527786, 0.00003102096525253728, 0.06313292682170868, 0.04031871631741524, -0.05002559348940849, 0.05438044294714928, -0.04717215895652771, -0.004798531997948885, -0.12491481751203537, 0.08688303828239441, -0.02827695570886135, 0.12561024725437164, 0.1736968755722046, -0.0158790685236454, 0.0021171218249946833, -0.04493197426199913, 0.01567292958498001, -0.010306118987500668, 0.11166387796401978, -0.020464951172471046, -0.2075752317905426, 0.020198224112391472, -0.028077371418476105, 0.04332184046506882, -0.24339352548122406, -0.01875765435397625, 0.035801321268081665, -0.05889538675546646, -0.03778266906738281, 0.07208820432424545, 0.0755348950624466, 0.036713797599077225, -0.029960710555315018, -0.09948330372571945, -0.011674617417156696, 0.10791991651058197, -0.0987510159611702, -0.09891849011182785 ]
null
null
transformers
# legal_t5_small_multitask_en_fr model

Model for translating legal text from English to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_fr model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from English to French.

### How to use

Here is how to use this model to translate legal text from English to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_fr"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "Article 2(b), sub-heading"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_fr model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:----------:|
| legal_t5_small_multitask_en_fr | 38.063 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
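The BLEU figure in the table above can in principle be checked with a standard scorer. Below is a minimal sacreBLEU sketch; the hypothesis and reference strings are placeholders, and the exact test split and tokenization the authors scored with are not stated in the card.

```python
# Hedged sketch: corpus-level BLEU with sacreBLEU. The strings below are
# placeholders, not the authors' test data.
import sacrebleu

hypotheses = ["Article 2, point b), intertitre"]    # model outputs (placeholder)
references = [["Article 2, point b), intertitre"]]  # one reference stream (placeholder)

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```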
{"language": "English French", "tags": ["translation English French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Article 2(b), sub-heading"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_fr model
=========================================

Model for translating legal text from English to French. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs from JRC-Acquis, Europarl and DCEP, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from English to French.

### How to use

Here is how to use this model to translate legal text from English to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_fr model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results : legal\_t5\_small\_multitask\_en\_fr achieves a BLEU score of 38.063.

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
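The "How to use" code block is stripped from this plain-text rendering of the card. As a hedged sketch only, the same usage written against the non-deprecated transformers seq2seq API (AutoModelForSeq2SeqLM rather than AutoModelWithLMHead), reusing the model id and widget sentence from the card above:

```python
# Sketch: translate one legal sentence from English to French with the
# SEBIS/legal_t5_small_multitask_en_fr checkpoint. Assumes transformers
# and torch are installed; max_length=512 mirrors the card's pipeline call.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/legal_t5_small_multitask_en_fr"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

en_text = "Article 2(b), sub-heading"
inputs = tokenizer(en_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```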
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06535271555185318, 0.1443941742181778, -0.004285176284611225, 0.08133278787136078, 0.04747684299945831, 0.000775421445723623, 0.031579602509737015, 0.08996272087097168, -0.07974744588136673, 0.058880262076854706, 0.0541764535009861, -0.003476071869954467, 0.05791107565164566, 0.03174237534403801, 0.050871241837739944, -0.1739530861377716, -0.004012159537523985, -0.02344868890941143, -0.037045467644929886, 0.10673237591981888, 0.09387749433517456, -0.05504504218697548, 0.04675627872347832, -0.007823306135833263, -0.06064390018582344, 0.03988513723015785, -0.09436949342489243, -0.041110482066869736, 0.10864074528217316, 0.07629119604825974, 0.09234514832496643, 0.003937332425266504, 0.07318152487277985, -0.1956373155117035, -0.010247191414237022, 0.0729534924030304, -0.012415813282132149, 0.046312395483255386, 0.11226162314414978, 0.0025806576013565063, 0.1612197756767273, -0.0497346930205822, 0.03367801010608673, 0.04782108590006828, -0.10253768414258957, -0.13949841260910034, -0.05481872335076332, 0.0010862292256206274, 0.08628138899803162, 0.13680399954319, -0.03578567132353783, 0.04531669244170189, -0.017252324149012566, 0.07565048336982727, 0.07207430899143219, -0.23537327349185944, -0.02916582114994526, 0.000143055513035506, 0.05744045600295067, 0.08867351710796356, -0.049042411148548126, -0.007907232269644737, 0.06422490626573563, 0.056931544095277786, 0.06563910096883774, -0.05215094983577728, -0.02402959018945694, -0.02082948572933674, -0.13287818431854248, -0.05791217461228371, 0.17078176140785217, 0.022107116878032684, -0.04292907193303108, -0.09730396419763565, -0.06371283531188965, -0.06255976110696793, -0.005025097168982029, -0.045687802135944366, 0.0227211844176054, 0.001705824863165617, 0.045311324298381805, -0.05063395947217941, -0.11322809010744095, -0.06416001170873642, -0.06260528415441513, 0.06536848843097687, 0.05429133027791977, 0.016590937972068787, 0.02918819524347782, 0.08654750883579254, -0.12246834486722946, -0.06796841323375702, 0.021761136129498482, 0.002520971931517124, -0.09052161127328873, 0.012862962670624256, 0.0076646883971989155, -0.18504425883293152, -0.009827716276049614, -0.021453477442264557, -0.10097426921129227, 0.036841362714767456, 0.04495367407798767, 0.04086683318018913, 0.05592038482427597, 0.12028097361326218, -0.08586391806602478, -0.10578891634941101, -0.044500112533569336, 0.0036328756250441074, -0.036932338029146194, 0.03429457172751427, -0.051552511751651764, -0.030152715742588043, -0.01752159744501114, 0.03402997925877571, 0.003759278915822506, -0.010345850139856339, -0.024471867829561234, -0.011710436083376408, 0.11160861700773239, -0.08885429799556732, 0.02417861670255661, 0.010929260402917862, -0.10087201744318008, -0.03229865804314613, 0.05969816818833351, -0.01391821913421154, -0.11609333008527756, 0.044727154076099396, -0.029297474771738052, -0.01681528054177761, -0.11292898654937744, -0.1826968491077423, -0.00977744534611702, 0.014320011250674725, -0.06762846559286118, -0.08406703919172287, -0.10126996040344238, -0.07866137474775314, 0.04055372625589371, -0.05618758499622345, 0.008073396980762482, -0.108254075050354, 0.003030664985999465, 0.033212512731552124, -0.0010235084919258952, 0.07263839244842529, -0.04841548576951027, 0.03713216632604599, 0.009563797153532505, 0.06415443867444992, 0.01378120668232441, 0.03336464986205101, -0.08925367891788483, 0.03794645145535469, -0.0720333531498909, 0.14845456182956696, -0.010518086142838001, -0.023775849491357803, -0.14743201434612274, -0.0692794993519783, -0.09049811959266663, 
0.03189173340797424, 0.08897427469491959, 0.1361256241798401, -0.20913326740264893, -0.028762448579072952, 0.22559186816215515, -0.06419968605041504, -0.07128101587295532, 0.15249699354171753, -0.03638770058751106, 0.0386212095618248, 0.06101091206073761, 0.06510976701974869, 0.05014282092452049, -0.046954963356256485, -0.05557527020573616, 0.013959158211946487, 0.02907526306807995, 0.049271684139966965, 0.1021067202091217, -0.06955146044492722, 0.06450340896844864, -0.012447657063603401, 0.04277946427464485, 0.024380730465054512, -0.04853231459856033, -0.039850540459156036, 0.0037100687623023987, -0.03579616919159889, 0.012350520119071007, 0.026685675606131554, 0.013332320377230644, -0.07141721993684769, -0.09472654014825821, -0.028922809287905693, 0.09129544347524643, -0.06705506891012192, 0.027234366163611412, 0.020285099744796753, -0.06588257104158401, -0.09420347213745117, 0.0058798352256417274, -0.14865228533744812, -0.0011489145690575242, 0.03893902897834778, -0.042210228741168976, 0.09224116057157516, 0.04370268061757088, 0.05924920737743378, 0.10806328058242798, -0.04742206633090973, -0.0386430062353611, -0.021681305021047592, -0.02492116391658783, -0.0920940712094307, -0.12603680789470673, -0.011966841295361519, -0.016016582027077675, 0.021890882402658463, -0.15108224749565125, 0.015938851982355118, -0.050419602543115616, 0.09151314198970795, 0.007324692327529192, -0.023058531805872917, 0.0009792596101760864, 0.07521504908800125, -0.04274515435099602, -0.022945081815123558, 0.031020497903227806, -0.021609202027320862, -0.03299631550908089, 0.10892412811517715, -0.08148561418056488, -0.11913459002971649, 0.09280627965927124, -0.005143432877957821, -0.10012321174144745, 0.004722596146166325, -0.020524023100733757, -0.05908207595348358, -0.0530928410589695, -0.061735156923532486, 0.25439366698265076, 0.04862314090132713, 0.17034654319286346, -0.11408563703298569, -0.058686263859272, 0.02020893059670925, -0.026020891964435577, -0.019231095910072327, 0.1673135906457901, 0.06367503851652145, -0.15145368874073029, 0.08835924416780472, 0.05584525689482689, -0.0120999151840806, 0.10261322557926178, 0.0505061075091362, -0.10659117996692657, 0.00011075899965362623, 0.06634267419576645, -0.006797398906201124, 0.03859370946884155, -0.09545720368623734, -0.023917479440569878, 0.0227863360196352, 0.06022300943732262, 0.06553362309932709, -0.11907251924276352, 0.08215823024511337, 0.07073083519935608, -0.04056532680988312, 0.040887150913476944, -0.04606408253312111, -0.03850153833627701, 0.10430750995874405, 0.006614345125854015, -0.05924278497695923, -0.04126441851258278, -0.03917132318019867, -0.10777487605810165, 0.1887657791376114, -0.09722017496824265, -0.22345070540905, -0.12506861984729767, 0.007395309396088123, -0.04443938657641411, 0.013785156421363354, 0.042002927511930466, -0.05470578745007515, -0.04765034839510918, -0.09620119631290436, 0.06169871613383293, -0.12133410573005676, -0.05676368251442909, -0.09382101148366928, 0.05439751222729683, -0.008378188125789165, -0.1549670249223709, 0.030993474647402763, 0.015020296908915043, -0.03097122721374035, -0.008931879885494709, -0.032881952822208405, 0.11310004442930222, 0.12918025255203247, -0.04732749983668327, -0.038604069501161575, 0.0021633347496390343, 0.15198488533496857, -0.06771492958068848, 0.03302980586886406, 0.040704935789108276, 0.05080242082476616, 0.049469996243715286, 0.1274939775466919, 0.040912024676799774, -0.04073528200387955, 0.03825340047478676, 0.05334175005555153, -0.004350251518189907, -0.25869059562683105, 
-0.10106706619262695, -0.0610322579741478, -0.02612992189824581, 0.09004505723714828, 0.044523242861032486, -0.052123792469501495, 0.001709776814095676, -0.04657445475459099, 0.03885664790868759, 0.012741188518702984, 0.055822212249040604, 0.042747680097818375, -0.016657250002026558, 0.07232118397951126, -0.060626477003097534, -0.0702267661690712, 0.09249144792556763, 0.02618509531021118, 0.19362962245941162, -0.049634888768196106, 0.22196875512599945, 0.059073854237794876, 0.0602337084710598, -0.005278685130178928, 0.07216515392065048, -0.0451827235519886, 0.03866315260529518, -0.03143668174743652, -0.056586816906929016, 0.010484817437827587, 0.06438744068145752, 0.011240643449127674, 0.011931009590625763, -0.06361355632543564, -0.0439668707549572, 0.07374297827482224, 0.19495317339897156, 0.08288285881280899, -0.19821123778820038, -0.03742850571870804, -0.006636816542595625, -0.08005468547344208, -0.08862902969121933, 0.0239153653383255, 0.16543856263160706, -0.08231055736541748, -0.015152718871831894, 0.02520613931119442, 0.13007532060146332, -0.11148668080568314, -0.020544864237308502, 0.028122754767537117, 0.02919543907046318, -0.0185390692204237, 0.12166742235422134, -0.23708684742450714, 0.1815347671508789, 0.01469203270971775, 0.05364326015114784, -0.0338737852871418, 0.03378454968333244, -0.05334942042827606, 0.009793885052204132, 0.11733824759721756, 0.024820785969495773, -0.029446423053741455, -0.09064561873674393, -0.0941491574048996, -0.02114223502576351, 0.0744268149137497, -0.04133915528655052, 0.08418001979589462, 0.06602055579423904, -0.0028995752800256014, -0.008697700686752796, 0.03229238837957382, -0.05688003450632095, -0.1721886694431305, 0.004086217377334833, -0.0006646140827797353, -0.03424263745546341, -0.011381740681827068, -0.03971268981695175, -0.07479416579008102, 0.21878840029239655, -0.12487542629241943, -0.06816555559635162, -0.06259528547525406, -0.006220412906259298, 0.1355457603931427, -0.06677164137363434, 0.030081506818532944, 0.01486352551728487, 0.029443221166729927, -0.04361163452267647, -0.016656987369060516, 0.07623131573200226, -0.07163956761360168, -0.053415603935718536, -0.08020662516355515, 0.1274755448102951, 0.056417543441057205, 0.04460049420595169, -0.019195446744561195, 0.03025282919406891, -0.02019532211124897, -0.10032104700803757, -0.009955589659512043, 0.011090087704360485, 0.15363876521587372, 0.04217500239610672, -0.06723514944314957, -0.08329889923334122, -0.08110929280519485, -0.06425052136182785, 0.14918316900730133, 0.17654407024383545, -0.054141346365213394, 0.04163731262087822, 0.18944591283798218, -0.1172729954123497, -0.15689969062805176, -0.051888179033994675, 0.09479864686727524, 0.07343880832195282, -0.03976977616548538, -0.18400804698467255, 0.007447344716638327, 0.09446924179792404, 0.005490847863256931, 0.03655536100268364, -0.4065951108932495, -0.1412544995546341, 0.018879372626543045, 0.03448005020618439, 0.005067543592303991, -0.10406160354614258, -0.03914187476038933, -0.04839467629790306, -0.10748928785324097, 0.08550792932510376, -0.01858467049896717, 0.08340451121330261, 0.007016688585281372, 0.0071667772717773914, 0.039607103914022446, -0.03448566049337387, 0.12919007241725922, 0.011756600812077522, 0.03373563662171364, -0.037679824978113174, 0.05545762553811073, 0.019536759704351425, -0.00903230905532837, 0.1440090537071228, -0.04194790869951248, 0.06053099408745766, -0.1745397001504898, -0.039642706513404846, -0.05113163962960243, 0.01078569795936346, -0.04100114107131958, -0.06186910346150398, 
-0.0456448532640934, 0.024384543299674988, 0.06305212527513504, -0.004675926640629768, -0.014634205028414726, -0.027248987928032875, 0.026588166132569313, 0.17662613093852997, 0.08771728724241257, 0.035731393843889236, -0.10443555563688278, 0.02712547965347767, 0.009615935385227203, 0.060902975499629974, -0.1362476497888565, 0.016088709235191345, 0.14871424436569214, 0.02119189128279686, 0.12287230789661407, -0.00826484989374876, -0.12739865481853485, 0.005474940408021212, 0.06394067406654358, -0.09947796165943146, -0.14513854682445526, -0.025089537724852562, 0.02941908687353134, -0.05090629681944847, -0.01732620783150196, 0.08758145570755005, -0.08437062799930573, -0.03027944080531597, -0.01580190099775791, 0.025139588862657547, -0.06470195204019547, 0.2074356973171234, 0.0026043381076306105, 0.055674128234386444, -0.05185548588633537, 0.09510225057601929, 0.15107609331607819, -0.1466720849275589, 0.022864319384098053, 0.18850135803222656, -0.06133345142006874, -0.04516121372580528, 0.04340642690658569, 0.12368177622556686, -0.004718757700175047, -0.06060865521430969, -0.019208023324608803, -0.02503446862101555, 0.017964841797947884, 0.0038319812156260014, 0.026380630210042, 0.03530660644173622, 0.00039859063690528274, -0.048013098537921906, -0.10545124113559723, 0.10131001472473145, 0.08113839477300644, 0.00987135898321867, -0.00941673293709755, 0.1009674072265625, 0.008890397846698761, 0.000910583243239671, -0.009327087551355362, 0.011880454607307911, -0.04499954357743263, 0.015944713726639748, -0.0683036744594574, -0.0004461730713956058, -0.041328247636556625, -0.010102299973368645, -0.030943889170885086, -0.0031078211031854153, -0.0070225573144853115, 0.006257108878344297, -0.04630882665514946, -0.04292437061667442, -0.02738899365067482, 0.04526239633560181, -0.09638067334890366, -0.03682010993361473, 0.020368481054902077, -0.03910341113805771, 0.06255938857793808, 0.018466001376509666, -0.004161531571298838, 0.006801747251302004, -0.014382612891495228, 0.04447939991950989, 0.0014980942942202091, 0.05572797730565071, 0.007608208805322647, -0.08758586645126343, 0.03072650171816349, 0.02692527323961258, -0.012026847340166569, -0.018282057717442513, 0.01395316980779171, -0.12446121126413345, -0.0424555279314518, -0.04436607286334038, -0.0515243262052536, -0.06862077116966248, 0.10033523291349411, 0.05986868217587471, 0.07548180967569351, 0.10593046247959137, -0.06203712895512581, 0.06542670726776123, -0.14548680186271667, -0.0069768293760716915, 0.02937968261539936, -0.028525274246931076, -0.007569748442620039, -0.021779853850603104, 0.040502917021512985, -0.07449236512184143, 0.1411280781030655, 0.018330028280615807, 0.07364969700574875, 0.013227651827037334, -0.1043906882405281, -0.031077252700924873, 0.02298426441848278, 0.08721212297677994, -0.026274072006344795, -0.02591102011501789, -0.08615436404943466, 0.07933789491653442, 0.008393453434109688, 0.12392249703407288, 0.0310222040861845, 0.12587887048721313, 0.14165759086608887, 0.045970458537340164, -0.009967023506760597, -0.09743823111057281, -0.061496734619140625, 0.0822765976190567, 0.003028381383046508, 0.05807345733046532, -0.04490511864423752, 0.1269317865371704, 0.12994834780693054, -0.14602558314800262, 0.116519495844841, 0.013702772557735443, -0.08908788114786148, -0.06350068747997284, -0.07398127019405365, -0.0471489354968071, -0.045844513922929764, -0.03381733596324921, -0.11158449202775955, 0.02596127614378929, 0.0018766365246847272, 0.062451060861349106, -0.024003341794013977, 0.09659725427627563, 
0.0018073124811053276, -0.1064351424574852, 0.07196713984012604, 0.001089464407414198, 0.10664437711238861, -0.021053871139883995, 0.033883217722177505, 0.05595313757658005, -0.017951276153326035, 0.04835624620318413, 0.057643741369247437, -0.01706782728433609, 0.017855530604720116, 0.006839660927653313, -0.05354723706841469, -0.04424697905778885, 0.02391473948955536, 0.07264403998851776, 0.19198913872241974, 0.06635626405477524, -0.07226831465959549, -0.0240960530936718, 0.17001032829284668, -0.05801219865679741, -0.08184552937746048, -0.1100669577717781, 0.2037428319454193, 0.040349166840314865, 0.03136732429265976, 0.020805684849619865, -0.09663763642311096, -0.012066063471138477, 0.12466195970773697, 0.18852978944778442, -0.023648707196116447, -0.03814742714166641, 0.034740693867206573, -0.009869979694485664, 0.023059070110321045, 0.018995292484760284, 0.06576080620288849, 0.2670730650424957, -0.09315329790115356, 0.05105673894286156, -0.06522562354803085, 0.037822749465703964, -0.0056165000423789024, 0.15154097974300385, -0.0033172150142490864, 0.001871879561804235, -0.04746396094560623, 0.08495732396841049, 0.018261071294546127, -0.15461254119873047, -0.002128372900187969, -0.09684120863676071, -0.10175962001085281, 0.02415415830910206, -0.005308809224516153, 0.036032840609550476, 0.06561799347400665, 0.00800030492246151, 0.02000017650425434, 0.06560873240232468, 0.00850116740912199, -0.09192567318677902, -0.1019047349691391, -0.0004225382290314883, -0.06909313797950745, 0.11571670323610306, 0.011951874010264874, 0.1528959572315216, 0.0977238193154335, 0.011346173472702503, -0.0821724683046341, 0.08089586347341537, 0.029312167316675186, 0.04137209430336952, 0.06822516024112701, 0.09636818617582321, -0.005387347657233477, 0.06634839624166489, 0.023835526779294014, -0.032973527908325195, 0.05872030183672905, -0.07046779990196228, -0.023122373968362808, -0.12245955318212509, 0.09456750005483627, -0.0294016245752573, 0.1299942582845688, 0.1786395162343979, -0.008666831068694592, 0.010598293505609035, -0.052244968712329865, 0.00461815157905221, -0.017677467316389084, 0.08896026015281677, -0.01742168329656124, -0.19265903532505035, 0.043835509568452835, -0.0322110652923584, 0.04913952574133873, -0.24270714819431305, -0.024718480184674263, 0.031372908502817154, -0.0429188497364521, -0.030387219041585922, 0.07727042585611343, 0.08295265585184097, 0.024985214695334435, -0.034480463713407516, -0.10386105626821518, -0.01314019039273262, 0.10260052978992462, -0.08355525881052017, -0.09848268330097198 ]
null
null
transformers
# legal_t5_small_multitask_en_it model

Model for translating legal text from English to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs from JRC-Acquis, Europarl and DCEP, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from English to Italian.

### How to use

Here is how to use this model to translate legal text from English to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_it", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "WRITTEN QUESTION E-1184/07"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_it model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_en_it | 47.070|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
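The Preprocessing section of the card describes training a unigram vocabulary model on 88M lines of the parallel corpus; the released tokenizer is the result of that step. A hedged sketch of the step itself follows; the input file name, vocabulary size and character coverage are assumptions, since the card does not publish these settings:

```python
# Sketch: train a unigram SentencePiece model like the one the card
# describes. "parallel_corpus.txt" and the hyperparameters are placeholders;
# the card only states that 88M lines of parallel text were used.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",    # one sentence per line, all language pairs
    model_prefix="legal_t5_vocab",  # writes legal_t5_vocab.model / .vocab
    model_type="unigram",           # unigram LM segmentation, as in the card
    vocab_size=32000,               # assumption; T5-style models often use 32k
    character_coverage=1.0,         # full coverage for multilingual legal text
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("WRITTEN QUESTION E-1184/07", out_type=str))
```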
{"language": "English Italian", "tags": ["translation English Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "WRITTEN QUESTION E-1184/07"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_it model
=========================================

Model for translating legal text from English to Italian. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs from JRC-Acquis, Europarl and DCEP, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from English to Italian.

### How to use

Here is how to use this model to translate legal text from English to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_it model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results : legal\_t5\_small\_multitask\_en\_it achieves a BLEU score of 47.070.

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
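The cards report corpus-level BLEU (47.070 for this English-Italian pair) without showing the scoring code. Below is a minimal sketch of how such a score is typically computed with the sacrebleu package; the actual evaluation script and its settings are not published here, so the default tokenization is an assumption:

```python
# Sketch: corpus-level BLEU for model translations against references.
# The two lists are placeholders; in practice they would hold the model's
# Italian outputs and the gold Italian sentences for the full test set.
import sacrebleu

hypotheses = ["INTERROGAZIONE SCRITTA E-1184/07"]    # model outputs, one per test sentence
references = [["INTERROGAZIONE SCRITTA E-1184/07"]]  # one reference corpus aligned with hypotheses

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```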
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07779370248317719, 0.15045514702796936, -0.0038613721262663603, 0.08245369046926498, 0.0628398209810257, 0.018337151035666466, 0.03909267485141754, 0.11284587532281876, -0.06514771282672882, 0.07439126819372177, 0.03620678931474686, 0.007392261642962694, 0.07506883144378662, 0.03607156500220299, 0.02926480583846569, -0.1853831708431244, 0.0005488301976583898, -0.023749666288495064, -0.025490902364253998, 0.11075273901224136, 0.09300349652767181, -0.06963572651147842, 0.041081953793764114, -0.03022458776831627, -0.08089300990104675, 0.03551514819264412, -0.09440364688634872, -0.03660048916935921, 0.10521606355905533, 0.07786256074905396, 0.09624233841896057, 0.0006830826168879867, 0.08122354000806808, -0.18283742666244507, -0.009392562322318554, 0.08142858743667603, -0.003397668246179819, 0.03883404657244682, 0.11857200413942337, 0.012213757261633873, 0.16645343601703644, -0.0317695178091526, 0.02296830713748932, 0.043003685772418976, -0.10336071997880936, -0.12581554055213928, -0.04930565878748894, 0.003510595066472888, 0.09190395474433899, 0.13803556561470032, -0.04085509106516838, 0.06328690052032471, -0.01615852117538452, 0.07700782269239426, 0.06894590705633163, -0.22206543385982513, -0.0364232063293457, -0.006213353015482426, 0.04387868568301201, 0.10485067963600159, -0.03569024056196213, -0.006318936124444008, 0.06547780334949493, 0.05378003790974617, 0.03884018957614899, -0.03430884703993797, -0.01578485034406185, -0.027470208704471588, -0.1350354701280594, -0.055531904101371765, 0.17786037921905518, 0.014060757122933865, -0.040630217641592026, -0.09422798454761505, -0.06255271285772324, -0.05374529957771301, 0.009254534728825092, -0.04213797673583031, 0.0023574684746563435, 0.00928037241101265, 0.05857846140861511, -0.03552313894033432, -0.12184708565473557, -0.06878085434436798, -0.05074946954846382, 0.10362964868545532, 0.066803477704525, 0.013903158716857433, 0.027857281267642975, 0.08650130778551102, -0.10713449120521545, -0.07619220018386841, 0.025983132421970367, 0.013121028430759907, -0.08346371352672577, 0.007337811402976513, 0.00010455753363203257, -0.20877705514431, -0.010963382199406624, -0.03476880490779877, -0.06279120594263077, 0.02656167559325695, 0.0494617260992527, 0.04798859357833862, 0.05550100654363632, 0.11096522212028503, -0.09726053476333618, -0.10276000201702118, -0.040847525000572205, 0.00013387574290391058, -0.021174700930714607, 0.02076829969882965, -0.06724031269550323, -0.034591902047395706, -0.02174045518040657, 0.04475453123450279, 0.015607518143951893, -0.00007392041879938915, -0.03515653312206268, -0.02413204126060009, 0.08978310972452164, -0.09985097497701645, 0.024446655064821243, 0.0138283371925354, -0.10676750540733337, -0.021487925201654434, 0.0433817021548748, -0.022966230288147926, -0.1216571033000946, 0.056764937937259674, -0.031646259129047394, -0.018718881532549858, -0.12197374552488327, -0.181754007935524, 0.0035681321751326323, -0.014571897685527802, -0.061224933713674545, -0.08899230509996414, -0.10344935208559036, -0.07493315637111664, 0.02750975638628006, -0.06333469599485397, 0.018288318067789078, -0.09728448837995529, 0.018552018329501152, 0.018548978492617607, -0.009362542070448399, 0.07048094272613525, -0.04382680356502533, 0.054939012974500656, 0.01820758357644081, 0.06550956517457962, 0.008597343228757381, 0.0344325490295887, -0.10274314135313034, 0.029210815206170082, -0.06005332991480827, 0.13232876360416412, -0.00514906644821167, -0.0075049917213618755, -0.13809797167778015, -0.05800844728946686, -0.08760414272546768, 
0.03125210851430893, 0.09504438936710358, 0.14579539000988007, -0.2190985083580017, -0.02004900388419628, 0.21204239130020142, -0.07732409238815308, -0.07562477141618729, 0.14664830267429352, -0.027534371241927147, 0.06435553729534149, 0.06871926784515381, 0.07100128382444382, 0.042281851172447205, -0.0489983893930912, -0.06323865801095963, -0.008344576694071293, 0.021194472908973694, 0.04197315126657486, 0.09260384738445282, -0.05686602368950844, 0.10244014859199524, -0.002126513049006462, 0.02490697242319584, 0.021701278164982796, -0.05165204033255577, -0.04338157922029495, -0.0015917974524199963, -0.05391606688499451, -0.007309658918529749, 0.030389757826924324, 0.017820170149207115, -0.07233668863773346, -0.0985446497797966, -0.04897492378950119, 0.09997240453958511, -0.07637058198451996, 0.03257288411259651, 0.0037796904798597097, -0.034092389047145844, -0.09574954211711884, 0.009729582816362381, -0.14607487618923187, 0.000047189754695864394, 0.050792545080184937, -0.028397247195243835, 0.08669143170118332, 0.03463481739163399, 0.05852258950471878, 0.10207431763410568, -0.046695735305547714, -0.0387284979224205, -0.010909304954111576, -0.023728135973215103, -0.09996424615383148, -0.13015852868556976, -0.031869083642959595, -0.009843655861914158, 0.02003316581249237, -0.14890147745609283, 0.02281283028423786, -0.03609035909175873, 0.10604739189147949, -0.0003695550258271396, -0.02915887162089348, -0.00510603841394186, 0.07014261931180954, -0.05183413624763489, -0.03286802023649216, 0.03000735677778721, -0.015165128745138645, -0.044854894280433655, 0.09542717784643173, -0.09430915862321854, -0.11258619278669357, 0.0871959775686264, -0.012598969042301178, -0.09154649078845978, -0.001171063631772995, -0.013922104611992836, -0.06658727675676346, -0.05649717524647713, -0.07003722339868546, 0.2333378940820694, 0.043207455426454544, 0.14652934670448303, -0.11493845283985138, -0.03768744692206383, 0.02504822239279747, -0.027606071904301643, -0.027358008548617363, 0.15285418927669525, 0.07317524403333664, -0.1670241355895996, 0.09344179183244705, 0.053714949637651443, -0.02398100309073925, 0.10988827049732208, 0.059667836874723434, -0.11933643370866776, 0.00797224696725607, 0.07101857662200928, -0.0020480758976191282, 0.03226295858621597, -0.1096034049987793, -0.010870352387428284, 0.020981786772608757, 0.061571136116981506, 0.07351647317409515, -0.11770141124725342, 0.07039210200309753, 0.06806477159261703, -0.03012627176940441, 0.035392750054597855, -0.06223148852586746, -0.05077489838004112, 0.10928317159414291, 0.005580108147114515, -0.05622990056872368, -0.0334232859313488, -0.03708561882376671, -0.10104190558195114, 0.17580412328243256, -0.08892674744129181, -0.22419162094593048, -0.1296241581439972, 0.006573952734470367, -0.030724437907338142, 0.02150994911789894, 0.032222095876932144, -0.05003047734498978, -0.046832382678985596, -0.08147095888853073, 0.06799513101577759, -0.11747158318758011, -0.05882144346833229, -0.09762647747993469, 0.06251835823059082, -0.01736718975007534, -0.13725195825099945, 0.03443168103694916, 0.01464924681931734, -0.03931507468223572, -0.0066838678903877735, -0.0528162457048893, 0.13075079023838043, 0.14076247811317444, -0.04076491296291351, -0.025628535076975822, 0.003525692969560623, 0.12185036391019821, -0.08366362005472183, 0.04254400357604027, 0.059961799532175064, 0.029174691066145897, 0.0393441878259182, 0.12936659157276154, 0.036965206265449524, -0.05114717781543732, 0.03132385015487671, 0.04601651802659035, -0.004708400461822748, -0.2627043128013611, 
-0.09491627663373947, -0.06434765458106995, -0.02354264259338379, 0.08207397907972336, 0.04173729941248894, -0.06307345628738403, 0.012691925279796124, -0.051399242132902145, 0.011776844970881939, 0.014429793693125248, 0.054198455065488815, 0.019318146631121635, -0.02015175297856331, 0.07516557723283768, -0.05585232749581337, -0.062476228922605515, 0.09676925837993622, 0.04738032817840576, 0.2131553441286087, -0.04945455864071846, 0.23465275764465332, 0.04915737360715866, 0.0727187991142273, -0.0012884698808193207, 0.07173530757427216, -0.03555430844426155, 0.029244596138596535, -0.02403704635798931, -0.059682365506887436, 0.0029182059224694967, 0.07141116261482239, 0.01344339083880186, 0.003784395521506667, -0.052383292466402054, -0.038158271461725235, 0.07356236129999161, 0.21326959133148193, 0.07806126773357391, -0.20724529027938843, -0.03620314225554466, -0.0035631663631647825, -0.0634377971291542, -0.08572880923748016, 0.00875830091536045, 0.16667605936527252, -0.06723221391439438, -0.019380934536457062, 0.024639485403895378, 0.13154426217079163, -0.13529382646083832, -0.02781938761472702, 0.02046331949532032, 0.03046404756605625, -0.021566295996308327, 0.13583116233348846, -0.23709842562675476, 0.18314474821090698, 0.019280415028333664, 0.08051708340644836, -0.05463973432779312, 0.03016115538775921, -0.058002907782793045, 0.015347868204116821, 0.11446543037891388, 0.016819847747683525, -0.027898553758859634, -0.11824743449687958, -0.11043985933065414, -0.0226049292832613, 0.0937407985329628, -0.033526573330163956, 0.08782301843166351, 0.06042216718196869, 0.010702642612159252, -0.002497436013072729, 0.04929042607545853, -0.03473279997706413, -0.16651968657970428, 0.016350040212273598, 0.015029470436275005, -0.04368764907121658, -0.008825820870697498, -0.05065294727683067, -0.05654895678162575, 0.22273841500282288, -0.12473169714212418, -0.06592103093862534, -0.07056376338005066, 0.01740010268986225, 0.12529942393302917, -0.06671479344367981, 0.011541763320565224, 0.018119052052497864, 0.03173999860882759, -0.04027865454554558, -0.010451040230691433, 0.08721472322940826, -0.06268846243619919, -0.06494524329900742, -0.08470921218395233, 0.11375302076339722, 0.06116727367043495, 0.0408974252641201, -0.015561400912702084, 0.025725331157445908, -0.008620268665254116, -0.08247340470552444, 0.001283843070268631, 0.021074723452329636, 0.1524057537317276, 0.05186361074447632, -0.07787446677684784, -0.07639654725790024, -0.08562503755092621, -0.07801604270935059, 0.13739538192749023, 0.1716407984495163, -0.05013628676533699, 0.011504990980029106, 0.1823001503944397, -0.12520059943199158, -0.15849749743938446, -0.03339020907878876, 0.08417823165655136, 0.08473894745111465, -0.038232576102018356, -0.181549072265625, 0.0034480560570955276, 0.11939049512147903, -0.001016982365399599, 0.04938328266143799, -0.4101276695728302, -0.1308242380619049, 0.010283874347805977, 0.04698042571544647, 0.0031407324131578207, -0.10728801786899567, -0.03685005381703377, -0.04420376569032669, -0.10215916484594345, 0.05691000446677208, -0.004119694232940674, 0.09105172008275986, 0.009597575291991234, 0.0007064488017931581, 0.0471215657889843, -0.04020588472485542, 0.12354760617017746, 0.006094690877944231, 0.02753843367099762, -0.04968709871172905, 0.04863055795431137, 0.018659187480807304, -0.0011158650740981102, 0.1414000391960144, -0.044957127422094345, 0.0445246659219265, -0.15848539769649506, -0.050332918763160706, -0.0478493757545948, 0.013973530381917953, -0.038436971604824066, -0.06899313628673553, 
-0.03558993712067604, 0.03121684119105339, 0.05052119866013527, 0.0034633688628673553, -0.0013265719171613455, -0.04811321571469307, 0.029263358563184738, 0.17364604771137238, 0.08864585310220718, 0.029073653742671013, -0.08993110805749893, 0.007004925981163979, 0.009732457809150219, 0.05788300558924675, -0.11042796075344086, 0.012641859240829945, 0.15262436866760254, 0.018300117924809456, 0.11719603091478348, -0.005979988258332014, -0.13581305742263794, 0.014042004942893982, 0.07492814213037491, -0.09209390729665756, -0.14201000332832336, -0.02234741486608982, 0.034224312752485275, -0.05789327621459961, 0.008225266821682453, 0.090860515832901, -0.08558376878499985, -0.026712657883763313, -0.020285680890083313, 0.03668216988444328, -0.06192988529801369, 0.22688442468643188, 0.007470395416021347, 0.04188988357782364, -0.0512150302529335, 0.12310801446437836, 0.14924120903015137, -0.14259617030620575, 0.02346222661435604, 0.18605761229991913, -0.06173910200595856, -0.04461110010743141, 0.04420876502990723, 0.10735461860895157, -0.00447873305529356, -0.07070387154817581, -0.035402800887823105, -0.02227793261408806, 0.0103917196393013, -0.017463210970163345, 0.029709897935390472, 0.04059235379099846, -0.018548378720879555, -0.041474759578704834, -0.1195102334022522, 0.09484902769327164, 0.09001661837100983, 0.01403307355940342, -0.025577357038855553, 0.12011561542749405, 0.017012907192111015, -0.02871205285191536, -0.009241605177521706, 0.0052961125038564205, -0.03965367376804352, 0.028327947482466698, -0.05802585557103157, -0.01316893845796585, -0.041505057364702225, -0.01898464560508728, -0.03683294728398323, -0.006978750694543123, -0.017266608774662018, 0.007455897051841021, -0.05282376706600189, -0.04109526798129082, -0.036852918565273285, 0.032035090029239655, -0.0903129130601883, -0.035554856061935425, 0.01031904574483633, -0.036298565566539764, 0.06673730909824371, 0.022394848987460136, -0.008860871195793152, 0.006804334931075573, -0.016380414366722107, 0.06345521658658981, 0.007007990498095751, 0.05351031571626663, 0.007957309484481812, -0.07112430781126022, 0.032430969178676605, 0.040953874588012695, -0.02408580295741558, -0.014753124676644802, 0.018663113936781883, -0.1214834675192833, -0.02485371194779873, -0.03528223931789398, -0.04976896941661835, -0.06342504173517227, 0.10712955892086029, 0.06905487179756165, 0.07219655811786652, 0.0918227806687355, -0.05238441377878189, 0.07413968443870544, -0.14517457783222198, 0.000015045084182929713, 0.03707120195031166, -0.03952348232269287, -0.008449801243841648, -0.012917500920593739, 0.04930302873253822, -0.07543343305587769, 0.11964268982410431, 0.011982293799519539, 0.06302124261856079, 0.009885141626000404, -0.08390704542398453, -0.015599867329001427, 0.019061552360653877, 0.10539469867944717, -0.029663825407624245, -0.023075897246599197, -0.07816892117261887, 0.07957465201616287, -0.006220914423465729, 0.12848693132400513, 0.025201445445418358, 0.1353514939546585, 0.14317664504051208, 0.04986229166388512, 0.014842299744486809, -0.09988441318273544, -0.07653766870498657, 0.07995055615901947, -0.013527694158256054, 0.06178070604801178, -0.0366898737847805, 0.1331758201122284, 0.13900603353977203, -0.1462608426809311, 0.09414751082658768, 0.00005239498568698764, -0.0949845016002655, -0.05618614703416824, -0.11002039909362793, -0.04462634399533272, -0.04482881724834442, -0.029300034046173096, -0.119365394115448, 0.02451077476143837, 0.0028738181572407484, 0.04817057028412819, -0.048350315541028976, 0.10692284256219864, 
0.024024734273552895, -0.10351605713367462, 0.07658103108406067, 0.0006993127171881497, 0.11152137070894241, -0.022374525666236877, 0.00657736137509346, 0.048392705619335175, -0.017432384192943573, 0.055361196398735046, 0.049756016582250595, -0.011843345127999783, 0.008559545502066612, 0.005591842345893383, -0.054332297295331955, -0.04565160721540451, 0.017234649509191513, 0.06983006000518799, 0.18787342309951782, 0.05068780481815338, -0.05829484015703201, -0.029694844037294388, 0.16878174245357513, -0.05249737948179245, -0.0651007890701294, -0.10332296788692474, 0.19607707858085632, 0.039321087300777435, 0.030061054974794388, 0.020632250234484673, -0.10511435568332672, -0.00791249517351389, 0.1310214251279831, 0.17530471086502075, -0.03647295758128166, -0.04332370683550835, 0.03527838736772537, -0.007244212087243795, 0.009921284392476082, 0.03529176488518715, 0.05284564197063446, 0.2524525821208954, -0.0881461650133133, 0.07091495394706726, -0.060872167348861694, 0.04289370775222778, -0.008419974707067013, 0.15138280391693115, -0.002824970753863454, 0.007638146169483662, -0.06479689478874207, 0.09222140163183212, 0.03173961490392685, -0.14217883348464966, -0.0065460773184895515, -0.10786653310060501, -0.11513756215572357, 0.0238809771835804, 0.004767481237649918, 0.023813165724277496, 0.0853874459862709, -0.001497057732194662, 0.027423974126577377, 0.05610425025224686, 0.012625554576516151, -0.09265454858541489, -0.12263964116573334, 0.0022686950396746397, -0.08121471852064133, 0.11505728960037231, 0.0052316151559352875, 0.14735150337219238, 0.10006548464298248, 0.024979369714856148, -0.0880545824766159, 0.09257306903600693, 0.024458257481455803, 0.03023401089012623, 0.0676887258887291, 0.09232412278652191, -0.021404121071100235, 0.0746675431728363, 0.040643442422151566, -0.0484134666621685, 0.061355553567409515, -0.07759609818458557, -0.016804276034235954, -0.12223038822412491, 0.08423468470573425, -0.03022124618291855, 0.13908587396144867, 0.1789531111717224, -0.012688599526882172, -0.006830413360148668, -0.04867968335747719, 0.014381268061697483, -0.003517633303999901, 0.10941333323717117, -0.016513047739863396, -0.22035525739192963, 0.021840890869498253, -0.030905617401003838, 0.051733579486608505, -0.2190888375043869, -0.027387186884880066, 0.029099157080054283, -0.05380237475037575, -0.04243256151676178, 0.07222795486450195, 0.09141866862773895, 0.026800427585840225, -0.0321359746158123, -0.1254757046699524, -0.006264433730393648, 0.10263355076313019, -0.09177008271217346, -0.10134704411029816 ]
null
null
transformers
# legal_t5_small_multitask_en_sv model

Model for translating legal text from English to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs from JRC-Acquis, Europarl and DCEP, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_en_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from English to Swedish.

### How to use

Here is how to use this model to translate legal text from English to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_en_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_en_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "whereas enlargement to Bulgaria and Romania should be effective in 2007,"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_en_sv model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_en_sv | 47.968|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
{"language": "English Swedish", "tags": ["translation English Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "whereas enlargement to Bulgaria and Romania should be effective in 2007,"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_en_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_en\_sv model
=========================================

Model for translating legal text from English to Swedish. It was first released in this repository. The model is trained in parallel on three parallel corpora with 42 language pairs from JRC-Acquis, Europarl and DCEP, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_en\_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from English to Swedish.

### How to use

Here is how to use this model to translate legal text from English to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_en\_sv model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results : legal\_t5\_small\_multitask\_en\_sv achieves a BLEU score of 47.968.

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
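All three cards describe the same optimization recipe: AdaFactor with an inverse square root learning rate schedule, run on a TPU Pod V3-8. As an illustration only (the original runs used TPU tooling, not this snippet), the matching optimizer setup in PyTorch transformers, where relative_step=True gives the built-in inverse-square-root step size, might look like this:

```python
# Sketch of the optimizer the cards describe: Adafactor with the
# inverse-square-root relative-step schedule and warmup. "t5-small" is a
# stand-in checkpoint; the actual multitask training is not reproduced here.
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,  # scale updates by the parameter RMS
    relative_step=True,    # inverse-sqrt step size derived from the step count
    warmup_init=True,      # start from a small lr and warm up
    lr=None,               # lr comes from the relative step, not a fixed value
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the effective lr for logging
```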
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_en\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.05807528272271156, 0.11504722386598587, -0.0036448354367166758, 0.08497651666402817, 0.05155620351433754, -0.007193764206022024, 0.019638819620013237, 0.09228430688381195, -0.08007926493883133, 0.06360030174255371, 0.05921311676502228, 0.007162783294916153, 0.07851900160312653, 0.02451946586370468, 0.03865664452314377, -0.20840634405612946, 0.0015378185780718923, -0.03421764820814133, -0.025349324569106102, 0.10273561626672745, 0.0959436371922493, -0.056818317621946335, 0.032034341245889664, -0.025917746126651764, -0.039971355348825455, 0.02498101070523262, -0.09492255002260208, -0.03143860027194023, 0.10159915685653687, 0.07291615009307861, 0.09034587442874908, 0.007745024282485247, 0.08088730275630951, -0.1833617240190506, -0.01325936522334814, 0.047265779227018356, -0.0036206371150910854, 0.03266414254903793, 0.10292311757802963, 0.03601229935884476, 0.18351247906684875, -0.04690521955490112, 0.02919234335422516, 0.03466600552201271, -0.07909958064556122, -0.1524406224489212, -0.048204727470874786, -0.0009107689838856459, 0.0959232822060585, 0.1348884254693985, -0.05142320320010185, 0.04609711095690727, -0.011258037760853767, 0.09380393475294113, 0.0651368573307991, -0.2319321632385254, -0.032681889832019806, 0.04087432473897934, 0.06683294475078583, 0.11100491881370544, -0.044249165803194046, 0.00553672481328249, 0.07237515598535538, 0.0752774327993393, 0.07688713818788528, -0.04952921345829964, -0.04868621006608009, -0.022029271349310875, -0.1411508470773697, -0.03599270433187485, 0.18240244686603546, 0.014230498112738132, -0.03355967625975609, -0.10473202913999557, -0.04940687492489815, -0.04937511682510376, 0.018772389739751816, -0.0448276661336422, 0.011088690720498562, -0.007249530870467424, 0.04743003845214844, -0.0710771232843399, -0.1325269341468811, -0.048385269939899445, -0.04441574960947037, 0.057392802089452744, 0.041760582476854324, 0.01761028729379177, 0.04521999508142471, 0.06492333859205246, -0.12453106045722961, -0.08239492774009705, 0.008448746986687183, -0.002969711786136031, -0.09181679785251617, 0.006344786379486322, -0.0032348271925002337, -0.25367259979248047, -0.009578830562531948, -0.03834584355354309, -0.07089073210954666, 0.028668498620390892, 0.05528958514332771, 0.04561812803149223, 0.06059322878718376, 0.12848234176635742, -0.08754955232143402, -0.1120762899518013, -0.04435642808675766, -0.02139570564031601, -0.007180902175605297, 0.011745112016797066, -0.06367526948451996, -0.033141620457172394, 0.0033267135731875896, 0.025720058009028435, -0.005628752056509256, 0.007728576194494963, -0.017248759046196938, -0.007298214826732874, 0.07405070215463638, -0.09526307135820389, 0.006613649893552065, 0.0014426682610064745, -0.09060781449079514, -0.046578239649534225, 0.06617271900177002, -0.007827888242900372, -0.11684219539165497, 0.07481788843870163, -0.010907300747931004, -0.006993713788688183, -0.09319199621677399, -0.1981113702058792, 0.00988717656582594, -0.02263503149151802, -0.05141306295990944, -0.08488987386226654, -0.09815802425146103, -0.08900309354066849, 0.03785153478384018, -0.0523594506084919, -0.0023770921397954226, -0.11229738593101501, -0.003994825296103954, 0.03136488422751427, -0.030691059306263924, 0.08725614100694656, -0.048233792185783386, 0.03637102618813515, -0.0176519937813282, 0.07206270098686218, 0.00991799309849739, 0.02634226530790329, -0.11072710901498795, 0.022863073274493217, -0.08407822996377945, 0.14190678298473358, -0.04475836455821991, -0.004001718480139971, -0.12926748394966125, -0.0647236704826355, -0.0657699927687645, 
0.04438884183764458, 0.08035967499017715, 0.12595510482788086, -0.22156491875648499, -0.02108324132859707, 0.20287936925888062, -0.08342975378036499, -0.06659695506095886, 0.13020622730255127, -0.019400151446461678, 0.051684122532606125, 0.08006808161735535, 0.09553857892751694, 0.04163167625665665, -0.0448426827788353, -0.0616111196577549, 0.024533377960324287, 0.011036855168640614, 0.03414887562394142, 0.09367702156305313, -0.07215815037488937, 0.09188590198755264, 0.010734695009887218, 0.04258178547024727, 0.011161894537508488, -0.02745630592107773, -0.03868137672543526, 0.006125550251454115, -0.03965190798044205, -0.024140572175383568, 0.03230215236544609, 0.008527166210114956, -0.08452209085226059, -0.07691527903079987, -0.002908454043790698, 0.07028456777334213, -0.07109368592500687, 0.043820951133966446, 0.0412072129547596, -0.06845402717590332, -0.11350236088037491, 0.0075229317881166935, -0.14244264364242554, -0.015817420557141304, 0.022906310856342316, -0.01889023371040821, 0.09082316607236862, 0.07419157773256302, 0.0668325126171112, 0.1003391295671463, -0.04740452021360397, -0.030207572504878044, -0.00040057642036117613, -0.022564629092812538, -0.10286221653223038, -0.1294034868478775, -0.017150016501545906, -0.021208912134170532, 0.004950250033289194, -0.13977348804473877, 0.004917947109788656, -0.041490618139505386, 0.09204824268817902, 0.008849255740642548, -0.027646498754620552, 0.03427983075380325, 0.07897534966468811, -0.038086481392383575, -0.027551669627428055, 0.03518372401595116, -0.021234823390841484, -0.08275175839662552, 0.1295776516199112, -0.06881467252969742, -0.13671068847179413, 0.0824144035577774, -0.002318325452506542, -0.09339594841003418, 0.005716238636523485, -0.01069736760109663, -0.06448986381292343, -0.06280472129583359, -0.0633220225572586, 0.23304574191570282, 0.05536918714642525, 0.14406797289848328, -0.11570454388856888, -0.04631850868463516, 0.018655462190508842, -0.054383113980293274, -0.020843759179115295, 0.18073631823062897, 0.047842174768447876, -0.18252621591091156, 0.08982609957456589, 0.014662526547908783, -0.02186434529721737, 0.15810847282409668, 0.057939305901527405, -0.10752367973327637, 0.01503702998161316, 0.05283890292048454, -0.011398551054298878, 0.05082285404205322, -0.07774008065462112, -0.008486263453960419, 0.03184738755226135, 0.06143275275826454, 0.06502287089824677, -0.10454598814249039, 0.06413082778453827, 0.06612639129161835, -0.0457376092672348, 0.04883968457579613, -0.04238113388419151, -0.049402039498090744, 0.08372962474822998, -0.004011174663901329, -0.046721599996089935, -0.035616908222436905, -0.039413873106241226, -0.11412619054317474, 0.1848071962594986, -0.09832625091075897, -0.23462656140327454, -0.14664901793003082, 0.02003558911383152, -0.0467962771654129, 0.01872134581208229, 0.05532781034708023, -0.05993010476231575, -0.06378260999917984, -0.10185322165489197, 0.09055238217115402, -0.09058615565299988, -0.06703384220600128, -0.10833089053630829, 0.05142393335700035, -0.0013484775554388762, -0.14248137176036835, 0.033194079995155334, 0.0007678347756154835, -0.013704254291951656, 0.00687060970813036, -0.029728878289461136, 0.1144818365573883, 0.11011433601379395, -0.02789164148271084, -0.03805561363697052, 0.005133695900440216, 0.14403603971004486, -0.0603853277862072, 0.049760617315769196, 0.031190764158964157, 0.013643253594636917, 0.04395762085914612, 0.14734767377376556, 0.037408072501420975, -0.03818709775805473, 0.02881699800491333, 0.05520062893629074, -0.02023904211819172, -0.2687068283557892, 
-0.09176947921514511, -0.0557367317378521, -0.008781839162111282, 0.0795048326253891, 0.0462045893073082, -0.09130653738975525, 0.019664257764816284, -0.041689395904541016, 0.0020884214900434017, 0.01616879552602768, 0.05131916329264641, 0.015193726867437363, -0.026769213378429413, 0.07996916025876999, -0.05253474786877632, -0.04379473626613617, 0.08797169476747513, 0.02072351798415184, 0.18968994915485382, -0.0572819784283638, 0.197871595621109, 0.05230093002319336, 0.055901218205690384, -0.005060938186943531, 0.07452347129583359, -0.047971975058317184, 0.02688949927687645, -0.01208164356648922, -0.06183181330561638, -0.013982134871184826, 0.07135004550218582, 0.012599058449268341, 0.011594261042773724, -0.034978386014699936, -0.02536957897245884, 0.0667259618639946, 0.20173299312591553, 0.08950203657150269, -0.16971909999847412, -0.06440228223800659, 0.011115503497421741, -0.07822700589895248, -0.0803619846701622, 0.0071469200775027275, 0.1750374287366867, -0.08387116342782974, 0.015773722901940346, 0.009850055910646915, 0.13150840997695923, -0.11009674519300461, -0.021753257140517235, 0.011229382827877998, 0.024947846308350563, -0.018592674285173416, 0.12960880994796753, -0.22949500381946564, 0.17820119857788086, 0.015692729502916336, 0.0717952623963356, -0.04708641767501831, 0.030289283022284508, -0.06496354192495346, 0.006395593751221895, 0.12525364756584167, 0.0359979085624218, -0.07424283772706985, -0.10195612907409668, -0.10066688060760498, -0.01434088870882988, 0.062501460313797, -0.039565637707710266, 0.0910513624548912, 0.07291899621486664, 0.02193961665034294, -0.02272559329867363, 0.025398828089237213, -0.03151479363441467, -0.1494751125574112, 0.00040046341018751264, -0.009727608412504196, -0.03567158803343773, -0.007608701009303331, -0.04785504192113876, -0.09976421296596527, 0.22359134256839752, -0.13925783336162567, -0.11074554920196533, -0.0738014355301857, 0.016633527353405952, 0.12236916273832321, -0.06790396571159363, 0.005198555067181587, 0.02927383966743946, 0.029268378391861916, -0.05907469987869263, -0.0075417496263980865, 0.06983055919408798, -0.05387226492166519, -0.0667823925614357, -0.04912852868437767, 0.1268271803855896, 0.06560410559177399, 0.040063414722681046, -0.018379196524620056, 0.04605300351977348, -0.009012307971715927, -0.09772080928087234, 0.002746881451457739, 0.04021212458610535, 0.1586010754108429, 0.03327500820159912, -0.05124513432383537, -0.07141083478927612, -0.07132413238286972, -0.0831502377986908, 0.15710747241973877, 0.17417161166667938, -0.04881075769662857, 0.0597170889377594, 0.1932874470949173, -0.11766300350427628, -0.18064998090267181, -0.05975835770368576, 0.1004968136548996, 0.09656618535518646, -0.008746043778955936, -0.16254206001758575, 0.023971879854798317, 0.10105670988559723, 0.0022814783733338118, 0.014878020621836185, -0.3673146665096283, -0.14943049848079681, 0.015446528792381287, 0.040855444967746735, 0.0004387586668599397, -0.08239570260047913, -0.047858674079179764, -0.04483264312148094, -0.09841475635766983, 0.061623714864254, -0.016709720715880394, 0.10267810523509979, 0.013711521402001381, 0.03326413780450821, 0.05352545529603958, -0.04338885843753815, 0.13080672919750214, -0.007986251264810562, 0.01696951873600483, -0.0746912881731987, 0.08490099757909775, 0.01282370276749134, -0.015089500695466995, 0.17486311495304108, -0.06976828724145889, 0.04539329931139946, -0.16273564100265503, -0.05126330628991127, -0.055079493671655655, 0.030917789787054062, -0.03685479238629341, -0.08100368082523346, -0.05025973170995712, 
0.03232520818710327, 0.06749385595321655, -0.01741670072078705, 0.04441530629992485, -0.07226261496543884, 0.030445465818047523, 0.17367464303970337, 0.10842105746269226, 0.026797616854310036, -0.08907965570688248, 0.016632800921797752, 0.000016648999007884413, 0.06134822964668274, -0.12821894884109497, 0.006970701273530722, 0.15470315515995026, 0.01679278165102005, 0.1105254739522934, -0.03308722749352455, -0.13154961168766022, 0.01725119724869728, 0.061818890273571014, -0.10887793451547623, -0.1357736587524414, -0.020649902522563934, -0.016729822382330894, -0.04887377843260765, -0.011404013261198997, 0.10789671540260315, -0.10154329240322113, -0.015469226986169815, -0.014251026324927807, 0.042008936405181885, -0.060499053448438644, 0.21960961818695068, 0.015519894659519196, 0.05797135457396507, -0.058371324092149734, 0.12274041771888733, 0.12873291969299316, -0.1255567967891693, 0.04781937599182129, 0.18160130083560944, -0.07008762657642365, -0.04391986131668091, 0.051599446684122086, 0.13194862008094788, -0.009141885675489902, -0.06296243518590927, -0.011618177406489849, -0.03889911621809006, 0.004992922302335501, -0.009904203936457634, 0.03464951738715172, 0.029976459220051765, -0.004937610123306513, -0.046342555433511734, -0.09389527887105942, 0.10765179246664047, 0.06870349496603012, 0.007162129506468773, -0.03265045955777168, 0.11279161274433136, -0.002676109317690134, -0.009082050062716007, -0.018702100962400436, 0.03114776499569416, -0.0351000614464283, 0.013797222636640072, -0.07029203325510025, -0.0023995195515453815, -0.04970436170697212, -0.005560374818742275, -0.035108111798763275, -0.00611907010897994, -0.008593047969043255, 0.009903223253786564, -0.045125775039196014, -0.03463650122284889, -0.05159079283475876, 0.02392548695206642, -0.0913667231798172, -0.054371029138565063, -0.0009972876869142056, -0.016015931963920593, 0.05574150010943413, 0.011625487357378006, -0.014998367056250572, 0.03396964445710182, -0.0022553533781319857, 0.07528260350227356, 0.026740454137325287, 0.04214293509721756, 0.01705167070031166, -0.05239254608750343, -0.0073743294924497604, 0.03142791613936424, -0.020305676385760307, -0.008840899914503098, 0.015185429714620113, -0.125679612159729, -0.04980200529098511, -0.03182988613843918, -0.041275493800640106, -0.06616368144750595, 0.10620701313018799, 0.06313330680131912, 0.07952965050935745, 0.134050190448761, -0.07407230138778687, 0.0783284530043602, -0.14769500494003296, 0.00016464031068608165, 0.040497589856386185, -0.04448318108916283, -0.01266945619136095, -0.012607947923243046, 0.04129524156451225, -0.09976242482662201, 0.1287933886051178, 0.019465545192360878, 0.06182722747325897, 0.01409341860562563, -0.0848393663764, -0.004884534515440464, 0.01915774866938591, 0.10728222876787186, -0.03920013830065727, -0.019887171685695648, -0.0847649797797203, 0.07799286395311356, -0.0023927686270326376, 0.10022640228271484, 0.04351920261979103, 0.10389640927314758, 0.14301422238349915, 0.05906657502055168, 0.009802279062569141, -0.07272694259881973, -0.07782492786645889, 0.0874326154589653, 0.008376194164156914, 0.06806973367929459, -0.016586121171712875, 0.11166352778673172, 0.14998990297317505, -0.15461906790733337, 0.10598009824752808, 0.0068596126511693, -0.09462875872850418, -0.0726061761379242, -0.13291168212890625, -0.06761673837900162, -0.02545735612511635, -0.022036908194422722, -0.1290971040725708, 0.012441592290997505, 0.008015448227524757, 0.06004825606942177, -0.0268924068659544, 0.10265106707811356, 0.012318854220211506, -0.11120087653398514, 
0.06754714250564575, -0.00004058930790051818, 0.09397634118795395, -0.009473342448472977, 0.02156493440270424, 0.06779392808675766, 0.0032845158129930496, 0.029290098696947098, 0.045797012746334076, -0.00982525385916233, -0.0038337118458002806, 0.0003242066304665059, -0.06479979306459427, -0.0432681143283844, 0.03654658421874046, 0.0830005630850792, 0.15670786798000336, 0.0690542533993721, -0.0678112655878067, -0.038966890424489975, 0.17693467438220978, -0.05423905700445175, -0.08651505410671234, -0.11041590571403503, 0.19236159324645996, 0.019075315445661545, 0.04707075282931328, 0.019919317215681076, -0.1020103171467781, 0.002475966466590762, 0.12003462761640549, 0.18756020069122314, -0.008970343507826328, -0.03296182304620743, 0.0037615541368722916, -0.01475689746439457, 0.0036245512310415506, 0.035359159111976624, 0.03445519506931305, 0.2349780797958374, -0.07898728549480438, 0.06367732584476471, -0.0696897879242897, 0.030983660370111465, -0.015421650372445583, 0.14113932847976685, 0.0025082689244300127, 0.0036667909007519484, -0.054180342704057693, 0.10176724195480347, -0.0003954724525101483, -0.14757052063941956, -0.012467958964407444, -0.0790715217590332, -0.12593579292297363, 0.021665344014763832, 0.03496963530778885, 0.03636176511645317, 0.061319321393966675, 0.013636541552841663, 0.04671323671936989, 0.03470224887132645, 0.0135297616943717, -0.09063296020030975, -0.08885254710912704, -0.005802529864013195, -0.05584845319390297, 0.0824383869767189, 0.02433825470507145, 0.13490112125873566, 0.09226018190383911, 0.019405759871006012, -0.07439462840557098, 0.1037508100271225, 0.03094950132071972, 0.020321497693657875, 0.07909824699163437, 0.12275850772857666, -0.010923599824309349, 0.08234154433012009, 0.04045390710234642, -0.06671593338251114, 0.0352306067943573, -0.04270795360207558, -0.022099895402789116, -0.09447810053825378, 0.10073062777519226, -0.035471975803375244, 0.13510744273662567, 0.19013778865337372, -0.00232575461268425, -0.010977291502058506, -0.06281072646379471, 0.018309788778424263, -0.026124896481633186, 0.08247069269418716, -0.008320113644003868, -0.20030875504016876, 0.013517572544515133, -0.019944244995713234, 0.035501956939697266, -0.2096298784017563, -0.024330465123057365, 0.024630073457956314, -0.04573521763086319, -0.02612154185771942, 0.07733645290136337, 0.07782802730798721, 0.015490110963582993, -0.02311297506093979, -0.08112137019634247, 0.015209854580461979, 0.10085901618003845, -0.10221539437770844, -0.10653074830770493 ]
null
null
transformers
# legal_t5_small_multitask_es_cs model

Model for translating legal text from Spanish to Czech. It was first released in
[this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs
from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performed masked-language-model prediction.


## Model description

No pretraining is involved for the legal_t5_small_multitask_es_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Spanish to Czech.

### How to use

Here is how to use this model to translate legal text from Spanish to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_cs"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_cs", do_lower_case=False, skip_special_tokens=True),
    device=0  # first GPU; use device=-1 for CPU
)

es_text = "La política pesquera supone que se tenga en cuenta un gran número de dimensiones – social, medioambiental, económica – lo que exige un enfoque integrado y equilibrado, incompatible con una visión que los sobrestima, en particular, mediante una definición a priori de cualquier jerarquía de prioridades."

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_cs model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model (a SentencePiece sketch follows this card).

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_cs | 47.673 |


### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
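The Preprocessing section above describes a unigram vocabulary model trained on 88M lines of the parallel corpus. A minimal sketch of that step with the `sentencepiece` library is shown below; the corpus path and the vocabulary size are assumptions, since the card does not state them.

```python
import sentencepiece as spm

# Train a unigram subword model, roughly as described in the card's
# Preprocessing section. The input path and vocab_size are
# illustrative assumptions, not taken from the original card.
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # hypothetical 88M-line corpus file
    model_prefix="legal_t5_spm",
    model_type="unigram",
    vocab_size=32000,
)

# Load the trained model and tokenize a sample sentence into subwords.
sp = spm.SentencePieceProcessor(model_file="legal_t5_spm.model")
print(sp.encode("La política pesquera supone que se tenga en cuenta ...", out_type=str))
```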
{"language": "Spanish Cszech", "tags": ["translation Spanish Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "La pol\u00edtica pesquera supone que se tenga en cuenta un gran n\u00famero de dimensiones \u2013 social, medioambiental, econ\u00f3mica \u2013 lo que exige un enfoque integrado y equilibrado, incompatible con una visi\u00f3n que los sobrestima, en particular, mediante una definici\u00f3n a priori de cualquier jerarqu\u00eda de prioridades."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_cs model
=========================================


Model for translating legal text from Spanish to Czech. It was first released in
this repository. The model is trained in parallel on three parallel corpora with 42 language pairs
from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performed masked-language-model prediction.


Model description
-----------------


No pretraining is involved for the legal\_t5\_small\_multitask\_es\_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.


Intended uses & limitations
---------------------------


The model can be used for translation of legal texts from Spanish to Czech.


### How to use


Here is how to use this model to translate legal text from Spanish to Czech in PyTorch:


Training data
-------------


The legal\_t5\_small\_multitask\_es\_cs model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets consisting of 5 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule (an illustrative optimizer sketch follows this card).


### Preprocessing


A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.


### Pretraining


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results:


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
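The training-procedure paragraph above names AdaFactor with an inverse square root learning rate schedule. The original training ran on a TPU Pod, not through this API; as an illustration only, here is how a comparable optimizer and schedule could be instantiated with the `transformers` PyTorch implementations.

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_cs")

# relative_step=True with warmup_init=True gives Adafactor its
# built-in inverse square root learning-rate behaviour.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the relative-step LR for logging
```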
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08505918085575104, 0.1377343237400055, -0.0035988905001431704, 0.08435806632041931, 0.07427828013896942, 0.006612357217818499, 0.0032133583445101976, 0.11015480756759644, -0.040390290319919586, 0.08526141941547394, 0.0477447547018528, 0.027660774067044258, 0.04836215823888779, 0.0221689622849226, 0.03398532792925835, -0.1915179044008255, -0.005416451022028923, -0.027477923780679703, -0.04356454685330391, 0.10492310672998428, 0.07675138860940933, -0.05374016612768173, 0.04783979803323746, -0.0408979207277298, -0.08118286728858948, 0.0276890080422163, -0.07470858842134476, -0.05786695331335068, 0.10353077948093414, 0.07387956976890564, 0.08310773968696594, -0.009732871316373348, 0.05993317812681198, -0.17774511873722076, -0.0031284319702535868, 0.055224619805812836, -0.01765153557062149, 0.05440925806760788, 0.10299139469861984, -0.012349450960755348, 0.19692587852478027, -0.0670248419046402, 0.029186034575104713, 0.05296098813414574, -0.1246323361992836, -0.10522088408470154, -0.07514144480228424, 0.01874336414039135, 0.09356740862131119, 0.1300586313009262, -0.02708534710109234, 0.031653404235839844, -0.013416352681815624, 0.0659978985786438, 0.11510179936885834, -0.25134968757629395, -0.02610463835299015, 0.04050302505493164, 0.05518455430865288, 0.09160855412483215, -0.03698340430855751, 0.018543699756264687, 0.06159337982535362, 0.055448271334171295, 0.06445590406656265, -0.05343865603208542, -0.008116107434034348, -0.00848405621945858, -0.1217450499534607, -0.08077090978622437, 0.1532120257616043, 0.02447698451578617, -0.021152494475245476, -0.10020933300256729, -0.06621312350034714, -0.07227430492639542, 0.005731573794037104, -0.02015015482902527, 0.010246513411402702, -0.011820538900792599, 0.037289928644895554, -0.05557155981659889, -0.11372695863246918, -0.06874048709869385, -0.07000984251499176, 0.0715196505188942, 0.024660177528858185, 0.011666645295917988, 0.04341107979416847, 0.08181118220090866, -0.09540192782878876, -0.08265910297632217, 0.01528736762702465, 0.0282664243131876, -0.08866364508867264, 0.022895077243447304, -0.004915853962302208, -0.1778147667646408, 0.00019838158914353698, -0.039064448326826096, -0.08765655010938644, 0.020836301147937775, 0.0455479770898819, 0.03395315632224083, 0.05002455413341522, 0.11638528853654861, -0.09474288672208786, -0.1032668948173523, -0.032311249524354935, -0.017128530889749527, -0.004611244425177574, 0.008231820538640022, -0.0789075419306755, -0.03745124489068985, 0.006918028462678194, 0.07164660841226578, 0.010210966691374779, 0.0018017170950770378, -0.010403997264802456, -0.036596670746803284, 0.12401069700717926, -0.09148206561803818, 0.015732821077108383, 0.00722417701035738, -0.08880802243947983, -0.022111035883426666, 0.05109161138534546, -0.04038646072149277, -0.0992051213979721, 0.04145229980349541, -0.04613703116774559, -0.015639344230294228, -0.10746754705905914, -0.17326684296131134, 0.01357365120202303, -0.012738768942654133, -0.05866776779294014, -0.09627925604581833, -0.11866477876901627, -0.08450686186552048, 0.015735505148768425, -0.06713605672121048, 0.01618129387497902, -0.09841901808977127, 0.010627089068293571, 0.034573398530483246, -0.01628836616873741, 0.06757684051990509, -0.04483906179666519, 0.0532296858727932, 0.016026312485337257, 0.06052817031741142, 0.004571042023599148, 0.029480695724487305, -0.07269364595413208, 0.051338471472263336, -0.08281721919775009, 0.1530606746673584, -0.009010424837470055, 0.008096817880868912, -0.14244511723518372, -0.04805612936615944, -0.07964973151683807, 
0.04687206447124481, 0.07382460683584213, 0.1385214775800705, -0.19499504566192627, -0.03807813301682472, 0.21534492075443268, -0.055623993277549744, -0.06560491025447845, 0.10148440301418304, -0.025428319349884987, 0.0465983971953392, 0.07940256595611572, 0.09313056617975235, 0.020226391032338142, -0.0455150343477726, -0.052293285727500916, -0.014136320911347866, 0.02527698688209057, 0.006864265073090792, 0.10334325581789017, -0.08090964704751968, 0.09264365583658218, 0.010050402022898197, 0.03275199234485626, 0.014183671213686466, -0.04078761860728264, -0.034210313111543655, 0.007102902512997389, -0.03045324981212616, -0.03155415132641792, 0.008173344656825066, 0.025341980159282684, -0.058165132999420166, -0.07279876619577408, 0.03806928172707558, 0.09091238677501678, -0.06323647499084473, 0.027891580015420914, 0.01127408817410469, -0.03487013652920723, -0.1333036720752716, 0.01633823849260807, -0.16341838240623474, 0.011566241271793842, 0.025957757607102394, -0.012040871195495129, 0.09801126271486282, 0.020119432359933853, 0.04980836808681488, 0.08258554339408875, -0.04768378660082817, -0.017094504088163376, -0.01895800046622753, -0.020162802189588547, -0.10336384177207947, -0.10326992720365524, -0.03832795470952988, -0.010956116952002048, 0.02200767584145069, -0.1571546047925949, 0.012208621948957443, -0.025159774348139763, 0.07454860210418701, 0.001996889477595687, -0.029435258358716965, 0.022774379700422287, 0.06719772517681122, -0.03020070306956768, -0.03620775043964386, 0.03959718719124794, -0.0015471759252250195, -0.03937724605202675, 0.09548730403184891, -0.1355057954788208, -0.10323312133550644, 0.10129735618829727, 0.02521589770913124, -0.0896773561835289, -0.024166997522115707, -0.0029208557680249214, -0.043899573385715485, -0.050952550023794174, -0.07060149312019348, 0.20177777111530304, 0.04934369772672653, 0.16432249546051025, -0.12153024226427078, -0.054552678018808365, 0.02004707045853138, -0.023385392501950264, -0.018470510840415955, 0.16201667487621307, 0.04159700497984886, -0.17545147240161896, 0.09480490535497665, 0.02114630676805973, -0.006650608964264393, 0.13532905280590057, 0.0602666437625885, -0.10955555737018585, -0.021333912387490273, 0.041528232395648956, 0.012389587238430977, 0.04985521361231804, -0.08806406706571579, -0.0318630114197731, 0.027600526809692383, 0.06879913061857224, 0.07512006163597107, -0.10310355573892593, 0.0662476047873497, 0.07092602550983429, -0.0352131649851799, 0.06132935732603073, -0.03723444044589996, -0.04642695188522339, 0.10631190985441208, 0.02499636448919773, -0.017973074689507484, -0.048947833478450775, -0.037410102784633636, -0.11253315955400467, 0.1892644613981247, -0.09567445516586304, -0.23697210848331451, -0.1501062959432602, 0.01820874586701393, -0.03487177938222885, 0.031650424003601074, 0.03865288943052292, -0.0499144084751606, -0.02910298854112625, -0.0880218967795372, 0.09306278079748154, -0.09716111421585083, -0.05603031441569328, -0.08471174538135529, 0.052965011447668076, -0.028445085510611534, -0.1394861787557602, 0.021227143704891205, 0.006169963628053665, -0.04766905680298805, -0.004962991923093796, -0.059576984494924545, 0.1262471079826355, 0.16441626846790314, -0.016619965434074402, -0.025313083082437515, -0.00023788237012922764, 0.11211308091878891, -0.06816353648900986, 0.03377511352300644, 0.060415118932724, 0.06026977300643921, 0.011492629535496235, 0.10701759904623032, 0.0427483394742012, -0.053056590259075165, 0.028202112764120102, 0.045702118426561356, -0.03413403034210205, -0.26381826400756836, 
-0.10605127364397049, -0.06486313790082932, -0.02929111011326313, 0.09960141032934189, 0.04791227728128433, -0.02694771997630596, 0.02055068127810955, -0.039927851408720016, 0.02121426723897457, -0.0017074401257559657, 0.06538578122854233, 0.0545550100505352, -0.018735378980636597, 0.06851606070995331, -0.06151111423969269, -0.034798119217157364, 0.09379750490188599, 0.052708905190229416, 0.1809183955192566, -0.03186454623937607, 0.2560066282749176, 0.0554705373942852, 0.04024544730782509, 0.0018826269078999758, 0.076353058218956, -0.04039626941084862, 0.02952292561531067, -0.032180581241846085, -0.06563214957714081, -0.016537100076675415, 0.05249706655740738, 0.005766344256699085, 0.01958801969885826, -0.07665414363145828, -0.07619456201791763, 0.07224155217409134, 0.21362408995628357, 0.0626830980181694, -0.1870557814836502, -0.06039769947528839, -0.000128845582366921, -0.06906929612159729, -0.07488756626844406, 0.002529787365347147, 0.15408742427825928, -0.09517790377140045, -0.010769151151180267, 0.02042323723435402, 0.13637854158878326, -0.12151999026536942, -0.02497897855937481, 0.0005059587419964373, 0.011436909437179565, -0.02056245133280754, 0.11917470395565033, -0.23268242180347443, 0.19154496490955353, 0.03195365145802498, 0.054283734411001205, -0.043375372886657715, 0.009172421880066395, -0.05398647487163544, -0.02734697237610817, 0.10392705351114273, 0.022068722173571587, -0.035388052463531494, -0.10685941576957703, -0.10470061749219894, -0.023311873897910118, 0.057720497250556946, -0.07179120182991028, 0.1046808585524559, 0.06401809304952621, -0.007472224999219179, -0.016213903203606606, 0.058227282017469406, -0.0018046043114736676, -0.17657969892024994, -0.012710982002317905, -0.015535213053226471, -0.037687066942453384, -0.011941996403038502, -0.04202309995889664, -0.034455716609954834, 0.21701064705848694, -0.1202596127986908, -0.0759228989481926, -0.08385469764471054, 0.01969919353723526, 0.1254408061504364, -0.07396812736988068, 0.03501666709780693, 0.011805425398051739, 0.03753672167658806, -0.04639594256877899, -0.033997491002082825, 0.09598003327846527, -0.05937119573354721, -0.07160042226314545, -0.0603882372379303, 0.16360650956630707, 0.043431784957647324, 0.053975921124219894, -0.0184783972799778, 0.04648653417825699, 0.007625163067132235, -0.09775087982416153, -0.020774660632014275, 0.04670196771621704, 0.13896693289279938, 0.057780034840106964, -0.07290423661470413, -0.07247700542211533, -0.06839565932750702, -0.08145099878311157, 0.1584252119064331, 0.16562491655349731, -0.053172916173934937, 0.037115950137376785, 0.16917401552200317, -0.12155470252037048, -0.1493244171142578, -0.04861893132328987, 0.07692860811948776, 0.07520658522844315, -0.03750576451420784, -0.1814238578081131, -0.0028504247311502695, 0.1294284611940384, 0.00013624367420561612, 0.04923936724662781, -0.3972865343093872, -0.1350783109664917, 0.014289997518062592, 0.03725270554423332, 0.0006004888564348221, -0.12927059829235077, -0.06960981339216232, -0.06732115149497986, -0.09952004998922348, 0.12332295626401901, -0.03678809851408005, 0.10172298550605774, -0.003861682955175638, 0.021896541118621826, 0.04566951468586922, -0.03680793195962906, 0.1361590027809143, 0.004977934528142214, 0.025181099772453308, -0.044397514313459396, 0.0253622867166996, 0.019160008057951927, -0.023602299392223358, 0.1397700011730194, -0.045109059661626816, 0.0500868521630764, -0.16833147406578064, -0.05929787456989288, -0.06680236756801605, 0.011227469891309738, -0.041292455047369, -0.05880245566368103, 
-0.04527464881539345, 0.013975396752357483, 0.043615929782390594, -0.012570034712553024, 0.026002218946814537, -0.06304755061864853, 0.05971129238605499, 0.1839151531457901, 0.07010902464389801, 0.02002689614892006, -0.10098858922719955, 0.0004950921866111457, 0.0026985672302544117, 0.0637572854757309, -0.1378912776708603, -0.00496553722769022, 0.14272862672805786, 0.04252958297729492, 0.1112135723233223, -0.013523242436349392, -0.13086196780204773, 0.01541502121835947, 0.07159776985645294, -0.07649824023246765, -0.10210415720939636, -0.015614842064678669, 0.04986559972167015, -0.05607431009411812, -0.0195004902780056, 0.11558031290769577, -0.06256142258644104, -0.04130132496356964, -0.007891515269875526, 0.016206586733460426, -0.0584774874150753, 0.22347582876682281, 0.038543589413166046, 0.054755549877882004, -0.059517767280340195, 0.10093779116868973, 0.13063429296016693, -0.12009934335947037, 0.030001847073435783, 0.2006007730960846, -0.06759083271026611, -0.04701264202594757, 0.025863248854875565, 0.1115061491727829, -0.07924366742372513, -0.053401242941617966, -0.02834239788353443, -0.02724476531147957, 0.02159442938864231, 0.04158163443207741, 0.038364849984645844, 0.04275016859173775, -0.008729292079806328, -0.03719206526875496, -0.07698954641819, 0.07427504658699036, 0.050331585109233856, 0.004808567930012941, -0.021230921149253845, 0.10718785971403122, 0.01463143527507782, 0.00008923678251449019, -0.014484621584415436, -0.004195413552224636, -0.06639350205659866, 0.023459909483790398, -0.0693129375576973, 0.011627724394202232, -0.06482294946908951, -0.026618758216500282, -0.037811484187841415, 0.013034436851739883, -0.012676499783992767, -0.005693589802831411, -0.04434404522180557, -0.04525851830840111, -0.06179622933268547, 0.03683311119675636, -0.0854136273264885, -0.030712971463799477, -0.002940307604148984, -0.03171580284833908, 0.0525718554854393, 0.02809831127524376, 0.005566440988332033, 0.03946732357144356, -0.03209882602095604, 0.05785352364182472, 0.026936687529087067, 0.04954373463988304, 0.018301930278539658, -0.0547448955476284, 0.026384957134723663, 0.03407081961631775, 0.0005262478371150792, -0.0020986914169043303, 0.00531817926093936, -0.13507157564163208, -0.05279570072889328, -0.049412570893764496, -0.03480659797787666, -0.061889249831438065, 0.10035308450460434, 0.07526601105928421, 0.05662868544459343, 0.07835116982460022, -0.06860727816820145, 0.06924548000097275, -0.16980138421058655, -0.009428922086954117, 0.004257929511368275, -0.018865086138248444, -0.024702366441488266, 0.0013894884614273906, 0.051415544003248215, -0.055020615458488464, 0.13558562099933624, 0.012700784020125866, 0.06990621984004974, 0.017343804240226746, -0.05794643610715866, 0.0109470309689641, 0.014468089677393436, 0.12913048267364502, -0.0009336316725239158, -0.02090550772845745, -0.0595846064388752, 0.1005464643239975, -0.005496141500771046, 0.11756850779056549, 0.03991298750042915, 0.12648625671863556, 0.14166365563869476, 0.04791907221078873, 0.020575346425175667, -0.08570617437362671, -0.04227760061621666, 0.04781186580657959, -0.0030853620264679193, 0.05443430319428444, -0.024364162236452103, 0.09051936119794846, 0.15674394369125366, -0.16822262108325958, 0.11977103352546692, 0.009882901795208454, -0.0716324895620346, -0.0509522520005703, -0.11402017623186111, -0.04905931279063225, -0.0458955243229866, -0.04058774933218956, -0.11302481591701508, 0.010133212432265282, 0.03925633430480957, 0.046473536640405655, -0.02824820578098297, 0.097477488219738, 0.017833959311246872, 
-0.11554102599620819, 0.06380413472652435, 0.014354631304740906, 0.09751687198877335, -0.023911818861961365, 0.048129793256521225, 0.056456733494997025, 0.0009507458307780325, 0.053024351596832275, 0.0646236315369606, -0.007319893222302198, 0.010380219668149948, 0.0034207378048449755, -0.06057419627904892, -0.037655919790267944, 0.027949554845690727, 0.0805121436715126, 0.20238125324249268, 0.054080184549093246, -0.08829981088638306, -0.021508052945137024, 0.19775059819221497, -0.05190303176641464, -0.07879052311182022, -0.094504214823246, 0.21126623451709747, 0.03701790049672127, 0.032752443104982376, -0.004177661146968603, -0.10121766477823257, -0.009293124079704285, 0.12057454884052277, 0.18875961005687714, -0.029781773686408997, -0.032898906618356705, 0.019994569942355156, -0.004747520666569471, 0.020122714340686798, 0.029568733647465706, 0.04373042285442352, 0.30228009819984436, -0.08007106930017471, 0.049975283443927765, -0.05845043808221817, 0.017917007207870483, -0.004710952751338482, 0.1448701173067093, -0.01700873114168644, 0.0016487782122567296, -0.03363588824868202, 0.09228495508432388, -0.003456590697169304, -0.17560039460659027, 0.0012639426859095693, -0.11494133621454239, -0.1146678477525711, 0.004835302475839853, -0.029449867084622383, 0.04356000944972038, 0.07578251510858536, 0.012895429506897926, 0.03024446964263916, 0.021029004827141762, 0.021578557789325714, -0.11879214644432068, -0.13062630593776703, 0.022151166573166847, -0.023113667964935303, 0.09522590786218643, -0.004654735792428255, 0.11301800608634949, 0.09901601821184158, 0.020412394776940346, -0.08346989750862122, 0.09808140993118286, 0.029267242178320885, 0.02000996842980385, 0.09239368885755539, 0.08278274536132812, -0.003257162868976593, 0.03920619562268257, 0.0382421538233757, -0.05988047271966934, 0.04336223751306534, -0.04231798276305199, -0.0036035783123224974, -0.13875402510166168, 0.07544289529323578, -0.028798067942261696, 0.1337565779685974, 0.16539724171161652, -0.021912172436714172, 0.0005080030532553792, -0.04657771810889244, 0.021192997694015503, -0.0012199641205370426, 0.10121168196201324, -0.01887029968202114, -0.2128065824508667, 0.024136755615472794, -0.05500277131795883, 0.026557471603155136, -0.26180675625801086, -0.03132909908890724, 0.031693991273641586, -0.055805306881666183, -0.03096581995487213, 0.0769539624452591, 0.06487800925970078, 0.0414421446621418, -0.03192963823676109, -0.08191083371639252, -0.0070985122583806515, 0.10802460461854935, -0.11567390710115433, -0.1071731299161911 ]
null
null
transformers
# legal_t5_small_multitask_es_de model

Model for translating legal text from Spanish to German. It was first released in
[this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora with 42 language pairs
from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performed masked-language-model prediction.


## Model description

No pretraining is involved for the legal_t5_small_multitask_es_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Spanish to German.

### How to use

Here is how to use this model to translate legal text from Spanish to German in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_de"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_de", do_lower_case=False, skip_special_tokens=True),
    device=0  # first GPU; use device=-1 for CPU
)

es_text = "Estudios y publicaciones realizados por el Parlamento Europeo"

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_de model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results (a BLEU-scoring sketch follows this card):

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_de | 41.196 |


### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
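The BLEU score above was measured on the held-out translation test set. For readers who want to reproduce that kind of measurement, a minimal sketch using `sacrebleu` follows; the hypothesis and reference strings are placeholders, not the actual test data.

```python
import sacrebleu

# Placeholder system outputs and references; the real test set comes
# from the JRC-Acquis/Europarl/DCEP splits described in the card.
hypotheses = ["Studien und Veröffentlichungen des Europäischen Parlaments"]
references = [["Studien und Veröffentlichungen des Europäischen Parlaments"]]

# corpus_bleu takes one hypothesis per segment and a list of
# reference streams, each aligned with the hypotheses.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```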
{"language": "Spanish Deustch", "tags": ["translation Spanish Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Estudios y publicaciones realizados por el Parlamento Europeo"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_de model
=========================================


Model for translating legal text from Spanish to German. It was first released in
this repository. The model is trained in parallel on three parallel corpora with 42 language pairs
from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performed masked-language-model prediction.


Model description
-----------------


No pretraining is involved for the legal\_t5\_small\_multitask\_es\_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.


Intended uses & limitations
---------------------------


The model can be used for translation of legal texts from Spanish to German.


### How to use


Here is how to use this model to translate legal text from Spanish to German in PyTorch (an equivalent generate()-based sketch follows this card):


Training data
-------------


The legal\_t5\_small\_multitask\_es\_de model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets consisting of 8 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.


### Preprocessing


A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.


### Pretraining


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results:


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
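The pipeline shown in the full card for this model can also be driven at a lower level through the tokenizer and `generate()`. A minimal equivalent sketch, assuming default greedy decoding:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained(
    "SEBIS/legal_t5_small_multitask_es_de", do_lower_case=False
)
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_de")

es_text = "Estudios y publicaciones realizados por el Parlamento Europeo"
inputs = tokenizer(es_text, return_tensors="pt")

# Default greedy decoding, capped at the card's sequence length.
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```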
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08337075263261795, 0.13564959168434143, -0.003960360772907734, 0.09292241185903549, 0.07732895761728287, 0.007596160285174847, 0.004457961302250624, 0.10684283822774887, -0.05416664108633995, 0.07878408581018448, 0.04887107387185097, 0.033050742000341415, 0.049092989414930344, 0.018329467624425888, 0.034428924322128296, -0.19970138370990753, -0.004894689191132784, -0.0253472700715065, -0.05088258162140846, 0.08883640170097351, 0.08440376073122025, -0.048306018114089966, 0.048145174980163574, -0.04592600092291832, -0.06823653727769852, 0.04526643455028534, -0.08008049428462982, -0.05772779881954193, 0.11077991127967834, 0.07842662185430527, 0.08326312899589539, -0.010170036926865578, 0.06190578266978264, -0.17455309629440308, -0.004661235958337784, 0.0632743239402771, -0.016466397792100906, 0.04341718181967735, 0.10390207171440125, -0.008210820145905018, 0.19097940623760223, -0.0709306001663208, 0.03625139966607094, 0.05012527480721474, -0.13043783605098724, -0.12726934254169464, -0.07441262155771255, 0.0026203920133411884, 0.09129709750413895, 0.13359034061431885, -0.029996296390891075, 0.015443515963852406, -0.005031746346503496, 0.06065099313855171, 0.09662853181362152, -0.23164241015911102, -0.02275635115802288, 0.0354805514216423, 0.05932300165295601, 0.08843918144702911, -0.027902184054255486, 0.012538525275886059, 0.06936563551425934, 0.06761086732149124, 0.05904199555516243, -0.05172831937670708, 0.01007656566798687, -0.011301154270768166, -0.11633593589067459, -0.0721917524933815, 0.14149445295333862, 0.021014833822846413, -0.02632429450750351, -0.09822314977645874, -0.0652829259634018, -0.06897524744272232, 0.011486413888633251, -0.0434953011572361, 0.01725468970835209, -0.0070563070476055145, 0.04850286990404129, -0.05398045480251312, -0.10475536435842514, -0.06784428656101227, -0.08054480701684952, 0.060943879187107086, 0.0319441482424736, 0.0043552168644964695, 0.0306217260658741, 0.08326981216669083, -0.10429006069898605, -0.08756281435489655, 0.012381360866129398, 0.02586996555328369, -0.07373376190662384, 0.024331381544470787, -0.010727789252996445, -0.1840674877166748, 0.007076620124280453, -0.022466132417321205, -0.10309329628944397, 0.018094096332788467, 0.035816069692373276, 0.024196479469537735, 0.056770987808704376, 0.11387862265110016, -0.09840160608291626, -0.11783961206674576, -0.025460142642259598, -0.016521135345101357, 0.0027873688377439976, 0.01626450940966606, -0.07883316278457642, -0.04053996130824089, 0.011197567917406559, 0.07196292281150818, 0.01457471027970314, -0.00582944555208087, -0.013936663046479225, -0.03780485317111015, 0.1344604790210724, -0.0923171192407608, 0.012694159522652626, 0.0008620804874226451, -0.09232133626937866, -0.016566982492804527, 0.046128418296575546, -0.0372486412525177, -0.09694334864616394, 0.04010491445660591, -0.03892960399389267, -0.030423220247030258, -0.10991662740707397, -0.18391165137290955, 0.011838412843644619, -0.012605602853000164, -0.04453657940030098, -0.10540806502103806, -0.13565821945667267, -0.08784320205450058, 0.013702855445444584, -0.07512777298688889, 0.017153220251202583, -0.10211783647537231, 0.016333477571606636, 0.03186788037419319, -0.017240289598703384, 0.05764250084757805, -0.04482682794332504, 0.051629774272441864, 0.030071798712015152, 0.06572231650352478, -0.0034436483401805162, 0.040725115686655045, -0.06834372878074646, 0.05088014528155327, -0.08343630284070969, 0.16653282940387726, -0.0027951886877417564, 0.010795791633427143, -0.14493462443351746, -0.04779708385467529, 
-0.08497245609760284, 0.056106701493263245, 0.08588024228811264, 0.13766370713710785, -0.20872585475444794, -0.03729564696550369, 0.20763598382472992, -0.05755244567990303, -0.05826859548687935, 0.1042015552520752, -0.02992955408990383, 0.04973403736948967, 0.09063258767127991, 0.09483800828456879, 0.02318754605948925, -0.041998133063316345, -0.0465429313480854, -0.020480060949921608, 0.018936319276690483, 0.022761400789022446, 0.09690815210342407, -0.081988126039505, 0.09841570258140564, 0.008758842013776302, 0.04130103066563606, 0.0163494274020195, -0.03283758834004402, -0.028082026168704033, 0.016668831929564476, -0.031109893694519997, -0.03508957475423813, 0.008391818962991238, 0.021076634526252747, -0.06048854440450668, -0.0743877962231636, 0.05863284692168236, 0.08857657760381699, -0.06139030680060387, 0.02205292321741581, 0.017588578164577484, -0.049537137150764465, -0.1300763338804245, 0.020145131275057793, -0.15550218522548676, 0.009661471471190453, 0.02065255306661129, -0.03229217603802681, 0.0956992357969284, 0.029619695618748665, 0.04987354204058647, 0.0813920721411705, -0.05107278376817703, -0.016998307779431343, -0.0316985584795475, -0.015386244282126427, -0.09380070865154266, -0.10160581022500992, -0.023606661707162857, -0.01731874793767929, 0.00011815685866167769, -0.15027593076229095, 0.01264530885964632, -0.034235864877700806, 0.0734124556183815, 0.0033915406093001366, -0.021911777555942535, 0.016229679808020592, 0.07151663303375244, -0.0349542498588562, -0.04405679926276207, 0.03143607825040817, -0.007761992514133453, -0.03157641738653183, 0.08912240713834763, -0.13299252092838287, -0.0967836081981659, 0.10460691154003143, 0.026954995468258858, -0.09688035398721695, -0.004288095980882645, 0.0022382494062185287, -0.04945899173617363, -0.04848562553524971, -0.08390133827924728, 0.21301323175430298, 0.04492519795894623, 0.16276057064533234, -0.1171555295586586, -0.0507962740957737, 0.015890592709183693, -0.011379414238035679, -0.027526747435331345, 0.15789033472537994, 0.04022254794836044, -0.1727825105190277, 0.09906299412250519, 0.02346307784318924, -0.005512666888535023, 0.1256425529718399, 0.06473134458065033, -0.10957138240337372, -0.01729653589427471, 0.03595549985766411, 0.011921694502234459, 0.04413800686597824, -0.08898727595806122, -0.02709341235458851, 0.019193675369024277, 0.07755488902330399, 0.08063631504774094, -0.09513907134532928, 0.06582775712013245, 0.07533428072929382, -0.036195460706949234, 0.055245861411094666, -0.034307077527046204, -0.04883664846420288, 0.10404035449028015, 0.03362158685922623, -0.02379585988819599, -0.04259306192398071, -0.032021552324295044, -0.10436264425516129, 0.18761377036571503, -0.10157732665538788, -0.24701382219791412, -0.142492413520813, 0.019169125705957413, -0.039711613208055496, 0.03775019943714142, 0.04955452308058739, -0.05740595608949661, -0.03353530168533325, -0.07877899706363678, 0.11866559833288193, -0.10525262355804443, -0.057938139885663986, -0.0833190307021141, 0.05403885990381241, -0.02265973389148712, -0.1498686522245407, 0.02028266154229641, 0.011276372708380222, -0.036204129457473755, -0.007607065606862307, -0.05474109947681427, 0.12285344302654266, 0.15132993459701538, -0.012887118384242058, -0.03333309292793274, 0.0004895797464996576, 0.1209426149725914, -0.07075127214193344, 0.03611775115132332, 0.06919319927692413, 0.06491837650537491, 0.01115697156637907, 0.11907819658517838, 0.040533993393182755, -0.06672728061676025, 0.03460786119103432, 0.056555476039648056, -0.029528077691793442, 
-0.26979437470436096, -0.09513682126998901, -0.059460848569869995, -0.03199123591184616, 0.09810580313205719, 0.0490865483880043, -0.01925080642104149, 0.012142331339418888, -0.03631414845585823, 0.04217212274670601, 0.00244859023950994, 0.06280963122844696, 0.08408855646848679, -0.023210877552628517, 0.06958988308906555, -0.061124008148908615, -0.048715341836214066, 0.09879229217767715, 0.062389127910137177, 0.1750696450471878, -0.027731403708457947, 0.23445378243923187, 0.05355289950966835, 0.011783183552324772, -0.009363087825477123, 0.07284604012966156, -0.03863227367401123, 0.031413860619068146, -0.04056656360626221, -0.06104078143835068, -0.005495802499353886, 0.05701950564980507, -0.002531574107706547, 0.011697323992848396, -0.07208621501922607, -0.07124044001102448, 0.06720387935638428, 0.1978432834148407, 0.06982670724391937, -0.19797226786613464, -0.05917488783597946, 0.002999359043315053, -0.06788670271635056, -0.08356062322854996, 0.008354096673429012, 0.1322677880525589, -0.08107487112283707, 0.0007401679176837206, 0.02680855430662632, 0.13575106859207153, -0.12857529520988464, -0.017513269558548927, 0.006913397461175919, 0.02438805066049099, -0.017895547673106194, 0.11168934404850006, -0.2537771165370941, 0.17366187274456024, 0.029457563534379005, 0.059515342116355896, -0.03381098061800003, 0.016222935169935226, -0.05409373715519905, -0.02842174470424652, 0.0987899899482727, 0.018214264884591103, -0.028560776263475418, -0.09742067754268646, -0.09481720626354218, -0.02295760251581669, 0.04857640713453293, -0.06501971185207367, 0.09542351961135864, 0.06885304301977158, -0.00010027570533566177, -0.011538450606167316, 0.0577164962887764, -0.01895720884203911, -0.17614047229290009, -0.014154094271361828, -0.026873156428337097, -0.022319059818983078, -0.009091431275010109, -0.040680479258298874, -0.03976928070187569, 0.21058011054992676, -0.11619057506322861, -0.08628784120082855, -0.08087693899869919, 0.03159471973776817, 0.12173332273960114, -0.07116708159446716, 0.03635565936565399, 0.01702095754444599, 0.026020407676696777, -0.047159720212221146, -0.03926510363817215, 0.10103163123130798, -0.05626396834850311, -0.06011810526251793, -0.06615718454122543, 0.14658091962337494, 0.050349023193120956, 0.04457828775048256, -0.016975581645965576, 0.04197676479816437, 0.008379319682717323, -0.09332165867090225, -0.016110114753246307, 0.05098927766084671, 0.13415351510047913, 0.045329343527555466, -0.06644430756568909, -0.07116413861513138, -0.07117046415805817, -0.08476341515779495, 0.14489340782165527, 0.1607518196105957, -0.04848157986998558, 0.029519975185394287, 0.169125035405159, -0.12056778371334076, -0.15158109366893768, -0.04490647464990616, 0.07809371501207352, 0.07724562287330627, -0.03328753262758255, -0.19347970187664032, -0.008160666562616825, 0.13673220574855804, 0.00450691906735301, 0.06394515931606293, -0.3960123062133789, -0.13718019425868988, 0.02381417341530323, 0.03199964389204979, 0.00699633127078414, -0.1301308423280716, -0.06611911952495575, -0.08000288158655167, -0.09472909569740295, 0.10771209746599197, -0.031504228711128235, 0.09244459867477417, -0.005244607105851173, 0.028405064716935158, 0.0440969280898571, -0.02928433194756508, 0.14286603033542633, 0.005628644023090601, 0.0327383391559124, -0.04762641340494156, 0.035114701837301254, 0.012850682251155376, -0.023311635479331017, 0.14449425041675568, -0.03607717901468277, 0.049907386302948, -0.1569659411907196, -0.06907842308282852, -0.05734431743621826, 0.013233175501227379, -0.03980549797415733, 
-0.06855238229036331, -0.04275086522102356, 0.012462605722248554, 0.03785311430692673, -0.01917927712202072, 0.015056041069328785, -0.056894559413194656, 0.043866537511348724, 0.1673278510570526, 0.06401475518941879, 0.03282605856657028, -0.09735194593667984, 0.007575082592666149, 0.0018219264457002282, 0.06866876780986786, -0.14002689719200134, -0.0000722643599146977, 0.14591734111309052, 0.029764335602521896, 0.11462651193141937, -0.013730731792747974, -0.13556726276874542, 0.021296508610248566, 0.07146608084440231, -0.0631922259926796, -0.11442746222019196, -0.015861092135310173, 0.029639339074492455, -0.04463754594326019, -0.010898559354245663, 0.12079481035470963, -0.0640457347035408, -0.04177089408040047, -0.007467296905815601, 0.02068624645471573, -0.0625452920794487, 0.22669917345046997, 0.03346651792526245, 0.04536057636141777, -0.05586906522512436, 0.10824406892061234, 0.13141725957393646, -0.1349368840456009, 0.040562257170677185, 0.19470983743667603, -0.060238443315029144, -0.04925220087170601, 0.009939480572938919, 0.12010405957698822, -0.08475834876298904, -0.059095557779073715, -0.03171040862798691, -0.031881991773843765, 0.027391603216528893, 0.019070936366915703, 0.03163906931877136, 0.046568840742111206, -0.010292062535881996, -0.03157946839928627, -0.07676395773887634, 0.07269761711359024, 0.0523170530796051, 0.010352878831326962, -0.028019417077302933, 0.08670582622289658, 0.01414826326072216, -0.0052711511962115765, -0.013072129338979721, 0.002660008380189538, -0.0658445656299591, 0.01881181076169014, -0.09994786977767944, 0.0074350303038954735, -0.06030957028269768, -0.01883017085492611, -0.03657021000981331, 0.014978774823248386, -0.009588480927050114, -0.002705936785787344, -0.035481538623571396, -0.05388752743601799, -0.05735887214541435, 0.03576001897454262, -0.08709295094013214, -0.03881813585758209, -0.012157621793448925, -0.030607545748353004, 0.054964009672403336, 0.021168148145079613, 0.009994274005293846, 0.03011133149266243, -0.012263472191989422, 0.06474045664072037, 0.03243330866098404, 0.050898801535367966, 0.0193393062800169, -0.06433484703302383, 0.023739106953144073, 0.03433561325073242, -0.00890328362584114, -0.009643517434597015, 0.002282235538586974, -0.13841193914413452, -0.04694270342588425, -0.04964113608002663, -0.03623481094837189, -0.05850354582071304, 0.09368084371089935, 0.0774473324418068, 0.060954686254262924, 0.08006948977708817, -0.07362189143896103, 0.06884109973907471, -0.16419215500354767, -0.011529473587870598, 0.0036759155336767435, -0.012615359388291836, -0.01128808967769146, 0.0032858382910490036, 0.05356180667877197, -0.054633092135190964, 0.14364905655384064, 0.017646843567490578, 0.07566149532794952, 0.018171826377511024, -0.07695437222719193, 0.0030147661454975605, 0.016880659386515617, 0.12879185378551483, -0.0063191610388457775, -0.01960092782974243, -0.08710743486881256, 0.1030251681804657, 0.004088680725544691, 0.1294933408498764, 0.030015911906957626, 0.14377693831920624, 0.1367846578359604, 0.044028643518686295, 0.022662773728370667, -0.08884045481681824, -0.05738227069377899, 0.046713441610336304, 0.012765184976160526, 0.0478886216878891, -0.014933835715055466, 0.07138220220804214, 0.15597428381443024, -0.16112598776817322, 0.11884686350822449, 0.0055489628575742245, -0.06810984760522842, -0.05098465457558632, -0.09985566139221191, -0.047440823167562485, -0.049886852502822876, -0.04318070784211159, -0.11192181706428528, 0.0073556541465222836, 0.02122204378247261, 0.048510611057281494, -0.03663652017712593, 
0.10017644613981247, 0.02216416411101818, -0.12118437886238098, 0.05884207412600517, 0.017424626275897026, 0.09443248063325882, -0.007182068657130003, 0.05155259743332863, 0.05276135727763176, 0.013252128846943378, 0.04446428641676903, 0.061452943831682205, -0.01497770193964243, 0.010364740155637264, 0.002511500148102641, -0.04965962842106819, -0.03982746973633766, 0.037487857043743134, 0.06194650009274483, 0.2147592306137085, 0.05885511264204979, -0.09306090325117111, -0.01358893234282732, 0.18097466230392456, -0.04747331887483597, -0.08638689666986465, -0.0928143635392189, 0.2252025306224823, 0.05093199759721756, 0.04761071503162384, -0.006997659336775541, -0.11045471578836441, -0.014608667232096195, 0.10439125448465347, 0.19983191788196564, -0.027276581153273582, -0.039340682327747345, 0.020700370892882347, -0.00337385106831789, 0.019383026286959648, 0.02556169405579567, 0.05454093962907791, 0.3030664920806885, -0.0737893134355545, 0.04866824671626091, -0.05848897248506546, 0.028019923716783524, -0.0032367566600441933, 0.1473243236541748, -0.011418328620493412, 0.004687908571213484, -0.029580047354102135, 0.09862015396356583, 0.0037071702536195517, -0.1873829960823059, 0.0032993480563163757, -0.11134766042232513, -0.11442643404006958, 0.0043363370932638645, -0.0383162647485733, 0.038404714316129684, 0.07399196922779083, 0.017443154007196426, 0.034216899424791336, 0.03406555950641632, 0.017228977754712105, -0.11841212958097458, -0.11966490745544434, 0.016266431659460068, 0.0027727207634598017, 0.08116548508405685, -0.006312371697276831, 0.10922680050134659, 0.10251735895872116, 0.029450980946421623, -0.08654999732971191, 0.10524420440196991, 0.024873560294508934, 0.03820451349020004, 0.09270472079515457, 0.06583141535520554, -0.004094902891665697, 0.024839406833052635, 0.04046507179737091, -0.05836601182818413, 0.035442620515823364, -0.03158099576830864, -0.015642402693629265, -0.14275456964969635, 0.08578025549650192, -0.030782796442508698, 0.12697945535182953, 0.16252467036247253, -0.01738588698208332, 0.0063990396447479725, -0.046340350061655045, 0.02493039146065712, 0.006105614826083183, 0.1073707789182663, -0.019797014072537422, -0.2050424963235855, 0.02608310431241989, -0.04814527556300163, 0.027516378089785576, -0.2655383050441742, -0.031227068975567818, 0.03599749878048897, -0.0526987686753273, -0.02767517790198326, 0.07421620190143585, 0.057709261775016785, 0.042582809925079346, -0.03694538772106171, -0.08217812329530716, -0.01826334372162819, 0.10978208482265472, -0.11115316301584244, -0.10989931970834732 ]
null
null
transformers
# legal_t5_small_multitask_es_en model

Model for translating legal text from Spanish to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No separate pretraining is involved for the legal_t5_small_multitask_es_en model; instead, the unsupervised task is trained jointly with all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Spanish to English.

### How to use

Here is how to use this model to translate legal text from Spanish to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_en"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_en",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

es_text = "PPE-DE: 6', PSE: 6', ALDE: 5', Verts/ALE: 4', GUE/NGL: 4', IND/DEM:4', UEN: 4', NI: 4'"

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_en model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_en | 36.607|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
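The card above reports a single corpus-level BLEU score for the test set. As a rough illustration of how such a score can be reproduced, the sketch below scores model translations against reference sentences with the `sacrebleu` library. This is a minimal sketch under assumed conditions, not the authors' evaluation script: the source/reference pair, the decoding settings, and the BLEU configuration are all illustrative assumptions.

```python
# Minimal BLEU-evaluation sketch for SEBIS/legal_t5_small_multitask_es_en.
# The sentence pair and decoding settings below are illustrative assumptions;
# the authors' actual test set comes from JRC-ACQUIS, EUROPARL and DCEP.
import sacrebleu
from transformers import AutoModelWithLMHead, AutoTokenizer

model_name = "SEBIS/legal_t5_small_multitask_es_en"
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
model = AutoModelWithLMHead.from_pretrained(model_name)

# Hypothetical held-out pairs (Spanish source, English reference).
sources = ["Fecha del anuncio en el Pleno"]
references = ["Date announced in plenary"]

hypotheses = []
for src in sources:
    input_ids = tokenizer(src, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_length=512)  # greedy decoding
    hypotheses.append(tokenizer.decode(output_ids[0], skip_special_tokens=True))

# corpus_bleu takes the hypothesis list and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.3f}")
```

On the real test split, a procedure along these lines would be expected to land near the reported 36.607 BLEU, though the exact number depends on the BLEU variant and decoding used.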
{"language": "Spanish English", "tags": ["translation Spanish English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "PPE-DE: 6', PSE: 6', ALDE: 5', Verts/ALE: 4', GUE/NGL: 4', IND/DEM:4', UEN: 4', NI: 4'"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_en model
=========================================

Model for translating legal text from Spanish to English. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved for the legal\_t5\_small\_multitask\_es\_en model; instead, the unsupervised task is trained jointly with all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Spanish to English.

### How to use

Here is how to use this model to translate legal text from Spanish to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_es\_en model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
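The preprocessing step mentions a unigram vocabulary model trained on 88M lines of the parallel corpus. For illustration only, the sketch below shows how such a vocabulary could be built with the `sentencepiece` library; the corpus path, vocabulary size, and character coverage are assumptions, since the card does not state them.

```python
# Illustrative unigram-vocabulary training with SentencePiece. The input file,
# vocab_size and character_coverage are assumed values, not the authors' settings.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # hypothetical dump of all language pairs
    model_prefix="legal_t5_unigram",
    model_type="unigram",      # unigram LM segmentation, as named in the card
    vocab_size=32000,          # assumption: a common T5-style vocabulary size
    character_coverage=1.0,    # keep every character seen in the legal corpora
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Fecha del anuncio en el Pleno", out_type=str))
```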
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07432463020086288, 0.1403912603855133, -0.0044574858620762825, 0.07834015041589737, 0.06259554624557495, 0.011541692540049553, 0.01787453331053257, 0.1007794663310051, -0.08281531929969788, 0.0662832111120224, 0.04079531505703926, 0.026309926062822342, 0.06591172516345978, 0.01620672084391117, 0.03190949931740761, -0.18383142352104187, -0.006356650032103062, -0.02350141853094101, -0.029553160071372986, 0.101500503718853, 0.08292728662490845, -0.061914682388305664, 0.03736591711640358, -0.025738948956131935, -0.06334157288074493, 0.03279554471373558, -0.08860251307487488, -0.05753832310438156, 0.10509034246206284, 0.07389892637729645, 0.10013559460639954, 0.0060328347608447075, 0.06850738078355789, -0.18255619704723358, -0.011295254342257977, 0.06880627572536469, -0.005051400046795607, 0.050249382853507996, 0.11408903449773788, 0.0052118985913693905, 0.16714246571063995, -0.049767110496759415, 0.03524453565478325, 0.04762180894613266, -0.1265454739332199, -0.14601962268352509, -0.05368907377123833, -0.010719338431954384, 0.10306369513273239, 0.12883920967578888, -0.03844445198774338, 0.04881017655134201, -0.002776577603071928, 0.06842269748449326, 0.06682517379522324, -0.23130285739898682, -0.03105885162949562, 0.00539574446156621, 0.055177245289087296, 0.10017264634370804, -0.0315207801759243, -0.008841495029628277, 0.07680042833089828, 0.05653101205825806, 0.05505085736513138, -0.05331343412399292, -0.03881341218948364, -0.018432341516017914, -0.13111132383346558, -0.06511498242616653, 0.16194650530815125, 0.013199095614254475, -0.032116468995809555, -0.09650220721960068, -0.07007193565368652, -0.0555652417242527, 0.014357640407979488, -0.03676566481590271, 0.011849303729832172, -0.0008234083070419729, 0.04827859252691269, -0.045569829642772675, -0.11974608898162842, -0.060868069529533386, -0.06378237158060074, 0.07755973190069199, 0.05437881872057915, 0.01201082207262516, 0.02601798065006733, 0.088690385222435, -0.08760230243206024, -0.08639881759881973, 0.029093127697706223, 0.010479706339538097, -0.08540274202823639, 0.015689337626099586, 0.0020959109533578157, -0.21128791570663452, -0.011509528383612633, -0.05338183417916298, -0.09880007803440094, 0.03367874398827553, 0.052025362849235535, 0.039889752864837646, 0.06338468194007874, 0.11626727133989334, -0.07867954671382904, -0.09027526527643204, -0.04325040057301521, -0.007810782641172409, -0.02064504660665989, 0.02119145542383194, -0.06735683232545853, -0.027310892939567566, -0.016183290630578995, 0.04902246594429016, 0.013252029195427895, -0.003049300517886877, -0.0340200737118721, -0.023229938000440598, 0.09866200387477875, -0.08662140369415283, 0.020956024527549744, 0.013161590322852135, -0.09508001804351807, -0.037107281386852264, 0.05294230580329895, -0.022467033937573433, -0.1110878586769104, 0.039196256548166275, -0.0283283032476902, -0.023290500044822693, -0.11776888370513916, -0.18332181870937347, 0.004258495289832354, -0.011428057216107845, -0.05483579635620117, -0.09301362931728363, -0.10449108481407166, -0.08967559039592743, 0.026526572182774544, -0.06389893591403961, 0.008961847051978111, -0.1066770926117897, 0.018797053024172783, 0.025444569066166878, -0.01840861700475216, 0.07891581207513809, -0.04378464072942734, 0.05840446427464485, 0.035671062767505646, 0.06922798603773117, 0.01922345906496048, 0.032888662070035934, -0.09864236414432526, 0.03643166273832321, -0.07195588201284409, 0.15079081058502197, -0.0038557352963835, 0.001351258484646678, -0.14498355984687805, -0.055711131542921066, -0.08424358814954758, 
0.04326356574892998, 0.08030027896165848, 0.13398118317127228, -0.20699018239974976, -0.024799615144729614, 0.21852537989616394, -0.05575268343091011, -0.07088437676429749, 0.1190313771367073, -0.03137901797890663, 0.06423293054103851, 0.07064149528741837, 0.07060951739549637, 0.03797866776585579, -0.04171977937221527, -0.04550109803676605, -0.00626687565818429, 0.029922552406787872, 0.04011467471718788, 0.0938941091299057, -0.07715636491775513, 0.07620710134506226, -0.0009169562254101038, 0.031959738582372665, 0.025976140052080154, -0.040759503841400146, -0.03830224275588989, 0.011478186585009098, -0.034889984875917435, -0.00838347990065813, 0.026356881484389305, 0.018656795844435692, -0.06758599728345871, -0.0836036428809166, -0.008779156021773815, 0.08124244958162308, -0.06059522181749344, 0.027759836986660957, 0.010331494733691216, -0.05829187110066414, -0.12245912104845047, 0.008457520045340061, -0.15703095495700836, 0.007653908804059029, 0.03540707752108574, -0.020273763686418533, 0.09215258061885834, 0.040295738726854324, 0.051311198621988297, 0.09439703822135925, -0.040880825370550156, -0.038850948214530945, -0.009813754819333553, -0.021273095160722733, -0.09704754501581192, -0.12199974805116653, -0.019714230671525, -0.015801096335053444, 0.006172283086925745, -0.14748595654964447, 0.01491156592965126, -0.05072398856282234, 0.0844491496682167, 0.005011021625250578, -0.024124013260006905, 0.01838846318423748, 0.08844999969005585, -0.043460916727781296, -0.03413994237780571, 0.039272043853998184, -0.010564357973635197, -0.04897667095065117, 0.10423089563846588, -0.10735313594341278, -0.1307889074087143, 0.08845282346010208, -0.007650098763406277, -0.09322751313447952, 0.001304518198594451, -0.008571233600378036, -0.059174321591854095, -0.06179114431142807, -0.06662113219499588, 0.26157939434051514, 0.04236774519085884, 0.1644820272922516, -0.12515921890735626, -0.05009875074028969, 0.016992740333080292, -0.035127192735672, -0.02087624929845333, 0.16680656373500824, 0.059928108006715775, -0.16697974503040314, 0.09186399728059769, 0.039612166583538055, -0.00867905281484127, 0.10975714027881622, 0.0535224974155426, -0.10948832333087921, 0.0016504463274031878, 0.06168963015079498, 0.0013509126147255301, 0.037156324833631516, -0.09782171994447708, -0.018417581915855408, 0.024997303262352943, 0.06528395414352417, 0.07450418919324875, -0.11949067562818527, 0.0726432353258133, 0.0785570740699768, -0.03185950964689255, 0.0342358835041523, -0.05096723884344101, -0.034717101603746414, 0.09720566868782043, 0.006683602929115295, -0.03603321686387062, -0.03785804659128189, -0.035047583281993866, -0.11107777059078217, 0.18011371791362762, -0.09622864425182343, -0.2354767620563507, -0.1413317173719406, 0.005091242026537657, -0.0186099112033844, 0.02734820544719696, 0.04705867916345596, -0.059438079595565796, -0.04155775159597397, -0.07382175326347351, 0.08889896422624588, -0.10902726650238037, -0.06928034871816635, -0.09074908494949341, 0.0600707083940506, -0.009333246387541294, -0.14612847566604614, 0.03649824112653732, 0.02099865861237049, -0.028720909729599953, 0.0003601005591917783, -0.04198664054274559, 0.12355182319879532, 0.1396280974149704, -0.027027888223528862, -0.04068722575902939, -0.003630508203059435, 0.13634595274925232, -0.07224879413843155, 0.02905840426683426, 0.05742943659424782, 0.0377768948674202, 0.03595321625471115, 0.12583491206169128, 0.037683285772800446, -0.05322325602173805, 0.029406625777482986, 0.040618132799863815, -0.010715133510529995, -0.2747935652732849, 
-0.09217289835214615, -0.06048201397061348, -0.037541333585977554, 0.09367213398218155, 0.04072128236293793, -0.05834632366895676, 0.01820235885679722, -0.04291107505559921, 0.03726522624492645, -0.0018613089341670275, 0.057994987815618515, 0.041897062212228775, -0.020944083109498024, 0.06418436765670776, -0.05809541046619415, -0.06552968919277191, 0.0910600870847702, 0.04166539013385773, 0.19841282069683075, -0.04435204342007637, 0.23104792833328247, 0.058764997869729996, 0.06004982814192772, -0.004863607231527567, 0.06941693276166916, -0.04777522757649422, 0.03984222933650017, -0.03528043255209923, -0.057362865656614304, -0.007586205378174782, 0.05964161455631256, 0.010050936602056026, 0.02107636071741581, -0.06355965882539749, -0.055075109004974365, 0.06454448401927948, 0.19074267148971558, 0.07971125096082687, -0.2018592655658722, -0.040172696113586426, 0.006621477194130421, -0.07075774669647217, -0.08762313425540924, 0.011163131333887577, 0.1618754118680954, -0.07677534222602844, -0.005891441833227873, 0.020141728222370148, 0.13515035808086395, -0.12800785899162292, -0.022160202264785767, 0.007188925053924322, 0.022274117916822433, -0.012682090513408184, 0.13014332950115204, -0.24148240685462952, 0.18610574305057526, 0.020166991278529167, 0.064690962433815, -0.03796347975730896, 0.03007904812693596, -0.06973707675933838, -0.014984628185629845, 0.10181020945310593, 0.016789020970463753, -0.029419630765914917, -0.1081475242972374, -0.10004158318042755, -0.017175571992993355, 0.06108998879790306, -0.04369828850030899, 0.09674821048974991, 0.07227294147014618, 0.011930776759982109, -0.008842237293720245, 0.029897354543209076, -0.03571101278066635, -0.17201754450798035, -0.0024740872904658318, -0.008873057551681995, -0.02420240454375744, -0.011622034944593906, -0.03691042959690094, -0.06883765757083893, 0.2019454836845398, -0.1339826136827469, -0.08173760771751404, -0.06750853359699249, 0.008504852652549744, 0.1352405995130539, -0.06492625176906586, 0.013573376461863518, 0.023615024983882904, 0.017325742170214653, -0.033518821001052856, -0.016246367245912552, 0.08865174651145935, -0.06227010861039162, -0.06810391694307327, -0.08475761115550995, 0.1240992546081543, 0.05289194732904434, 0.04551162198185921, -0.019505368545651436, 0.02872208133339882, -0.008113500662147999, -0.08699002861976624, -0.0069017899222671986, 0.029139304533600807, 0.15936550498008728, 0.03722156584262848, -0.06816383451223373, -0.07687848806381226, -0.0835934653878212, -0.08243734389543533, 0.13965778052806854, 0.16819298267364502, -0.043946217745542526, 0.03663012012839317, 0.18829448521137238, -0.13005179166793823, -0.13607953488826752, -0.05770143121480942, 0.09241870045661926, 0.08724058419466019, -0.027191508561372757, -0.18849512934684753, -0.006902513559907675, 0.11130891740322113, 0.006146490573883057, 0.03362265229225159, -0.42206045985221863, -0.135611429810524, 0.01809117943048477, 0.04282435402274132, -0.001054624211974442, -0.11629894375801086, -0.06120048090815544, -0.05033603683114052, -0.10573164373636246, 0.08073161542415619, -0.013569542206823826, 0.09223857522010803, 0.007211120333522558, 0.021764567121863365, 0.04927334934473038, -0.03273040056228638, 0.1374145746231079, -0.011891119182109833, 0.025504857301712036, -0.048045579344034195, 0.0465678796172142, 0.016427351161837578, -0.010185124352574348, 0.1349209100008011, -0.04020455479621887, 0.054494161158800125, -0.17821256816387177, -0.04884554445743561, -0.04979480430483818, 0.0029622616712003946, -0.04000403359532356, -0.06274405121803284, 
-0.036571700125932693, 0.013671217486262321, 0.05418287590146065, -0.011103715747594833, 0.016560669988393784, -0.04173741862177849, 0.04045741260051727, 0.1692465841770172, 0.09718943387269974, 0.03387762978672981, -0.09910960495471954, 0.004615692887455225, 0.010190369561314583, 0.0664905235171318, -0.13359487056732178, 0.004007005598396063, 0.14901497960090637, 0.022326529026031494, 0.11286816000938416, -0.012072046287357807, -0.13593250513076782, 0.027550695464015007, 0.07587753236293793, -0.07967153191566467, -0.13776856660842896, -0.019030088558793068, 0.03624314069747925, -0.05180247873067856, -0.01485857367515564, 0.0945209413766861, -0.0710642859339714, -0.03936401754617691, -0.01312267780303955, 0.0230106208473444, -0.05326896160840988, 0.22274614870548248, 0.011670461855828762, 0.050634585320949554, -0.052802544087171555, 0.10648906975984573, 0.15972766280174255, -0.14463815093040466, 0.027714436873793602, 0.18279241025447845, -0.05940589681267738, -0.0406915657222271, 0.04624617472290993, 0.12281131744384766, -0.022663036361336708, -0.06538659334182739, -0.02613295055925846, -0.027474641799926758, 0.011715556494891644, 0.00435660919174552, 0.02214544638991356, 0.04007050395011902, -0.007006274536252022, -0.03591958060860634, -0.09336685389280319, 0.08180005103349686, 0.0727168470621109, 0.0154647221788764, -0.03243828937411308, 0.11876318603754044, 0.014729779213666916, -0.015126748941838741, -0.010064369067549706, 0.01187081541866064, -0.05644790455698967, 0.02899288572371006, -0.07307641953229904, 0.007492900360375643, -0.04975997656583786, -0.020715847611427307, -0.030507948249578476, 0.0014708059607073665, -0.006344636436551809, 0.0013628102606162429, -0.04127390682697296, -0.037023115903139114, -0.0386539027094841, 0.04367980360984802, -0.08160453289747238, -0.03796869143843651, 0.005448126699775457, -0.032213807106018066, 0.05380795896053314, 0.009133175015449524, -0.010011290200054646, 0.016931260004639626, -0.020610524341464043, 0.06544812023639679, 0.030151838436722755, 0.05209822580218315, 0.017452532425522804, -0.07850644737482071, 0.027073077857494354, 0.03308563306927681, -0.016927212476730347, -0.014497414231300354, 0.018301501870155334, -0.1352033019065857, -0.0393204391002655, -0.030179597437381744, -0.05060851201415062, -0.0622657835483551, 0.09692002087831497, 0.06962007284164429, 0.05993379279971123, 0.10823024809360504, -0.06747881323099136, 0.07090368866920471, -0.15387395024299622, -0.007370367180556059, 0.024949053302407265, -0.02184213139116764, -0.011092199943959713, -0.01072089932858944, 0.049110427498817444, -0.07232718914747238, 0.1291615068912506, 0.011930695734918118, 0.07759485393762589, 0.005409551318734884, -0.08431307971477509, -0.015076222829520702, 0.015856986865401268, 0.11070118844509125, -0.01900777779519558, -0.02053922973573208, -0.08636490255594254, 0.0930580422282219, 0.002420483622699976, 0.11309445649385452, 0.020420866087079048, 0.1261293888092041, 0.15213753283023834, 0.04771456494927406, 0.00821189396083355, -0.09227628260850906, -0.059631358832120895, 0.07966310530900955, 0.003980738110840321, 0.05399761721491814, -0.025212181732058525, 0.111758753657341, 0.14003096520900726, -0.1434181183576584, 0.11053130030632019, 0.006893767975270748, -0.08052753657102585, -0.05789661034941673, -0.09512577205896378, -0.04437614604830742, -0.046786319464445114, -0.035485103726387024, -0.11417469382286072, 0.012931417673826218, 0.0013184521812945604, 0.04233080521225929, -0.03756539151072502, 0.09665397554636002, 0.03446093946695328, 
-0.12026706337928772, 0.06264794617891312, 0.009190638549625874, 0.11983602494001389, -0.02345128357410431, 0.03644893690943718, 0.058043863624334335, 0.0036632081028074026, 0.050036970525979996, 0.060695845633745193, -0.01754390448331833, 0.014393686316907406, 0.004193483851850033, -0.05556691810488701, -0.0428471565246582, 0.03057064302265644, 0.07400981336832047, 0.1939995139837265, 0.061813950538635254, -0.059692393988370895, -0.028322715312242508, 0.18492072820663452, -0.0556410513818264, -0.07576049864292145, -0.10217299312353134, 0.20514808595180511, 0.03676484152674675, 0.03185953199863434, 0.023878825828433037, -0.1105014756321907, -0.011140182614326477, 0.1324799358844757, 0.17740291357040405, -0.02427821233868599, -0.04170849546790123, 0.026793524622917175, -0.0066869910806417465, 0.013120917603373528, 0.037450604140758514, 0.05346186086535454, 0.2793673872947693, -0.08398039638996124, 0.048074398189783096, -0.061558980494737625, 0.04488081857562065, -0.012101302854716778, 0.1497650295495987, -0.011947482824325562, 0.000830377044621855, -0.04048427939414978, 0.09592510014772415, 0.009449495002627373, -0.1589251011610031, -0.0048076617531478405, -0.09948914498090744, -0.11172644793987274, 0.014915610663592815, 0.020126523450016975, 0.03357744216918945, 0.07082874327898026, 0.012228614650666714, 0.02608393505215645, 0.05638924241065979, 0.013609226793050766, -0.09604313224554062, -0.10161720961332321, 0.006006527692079544, -0.07251080125570297, 0.10096156597137451, 0.007420773152261972, 0.1410597562789917, 0.09846952557563782, 0.021264351904392242, -0.08905375748872757, 0.10049029439687729, 0.023272115737199783, 0.03757430613040924, 0.09266941994428635, 0.07910618185997009, 0.00018136501603294164, 0.06310974806547165, 0.04076677933335304, -0.04918353632092476, 0.05462048947811127, -0.048289332538843155, -0.004674943163990974, -0.1250382661819458, 0.08762090653181076, -0.02824651263654232, 0.12582111358642578, 0.1744074672460556, -0.015787264332175255, 0.0017575203673914075, -0.045246269553899765, 0.01523719821125269, -0.01013217493891716, 0.11104869097471237, -0.020687447860836983, -0.20717936754226685, 0.02065163105726242, -0.028034035116434097, 0.04339887946844101, -0.24318856000900269, -0.019320251420140266, 0.03611408919095993, -0.05894458666443825, -0.037341367453336716, 0.0723092183470726, 0.07570493221282959, 0.03652142360806465, -0.029875488951802254, -0.09953246265649796, -0.01139156986027956, 0.10787878930568695, -0.09891867637634277, -0.09879106283187866 ]
null
null
transformers
# legal_t5_small_multitask_es_fr model

Model for translating legal text from Spanish to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No separate pretraining is involved for the legal_t5_small_multitask_es_fr model; instead, the unsupervised task is trained jointly with all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Spanish to French.

### How to use

Here is how to use this model to translate legal text from Spanish to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_fr"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_fr",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

es_text = "Fecha del anuncio en el Pleno"

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_fr model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_fr | 41.523|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
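The pipeline call in the card hides the tokenize/generate/decode steps. For finer control over decoding, the same checkpoint can be driven directly through `generate`, as in the sketch below; the beam size and other decoding parameters are illustrative assumptions, not values specified by the authors.

```python
# Explicit encode/generate/decode with the Spanish-to-French checkpoint.
# num_beams and early_stopping are assumed decoding choices for illustration.
from transformers import AutoModelWithLMHead, AutoTokenizer

model_name = "SEBIS/legal_t5_small_multitask_es_fr"
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
model = AutoModelWithLMHead.from_pretrained(model_name)

es_text = "Fecha del anuncio en el Pleno"
input_ids = tokenizer(es_text, return_tensors="pt").input_ids

output_ids = model.generate(
    input_ids,
    max_length=512,      # matches the training sequence length
    num_beams=4,         # assumed beam size
    early_stopping=True,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```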
{"language": "Spanish French", "tags": ["translation Spanish French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Fecha del anuncio en el Pleno"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_fr model
=========================================

Model for translating legal text from Spanish to French. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No separate pretraining is involved for the legal\_t5\_small\_multitask\_es\_fr model; instead, the unsupervised task is trained jointly with all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Spanish to French.

### How to use

Here is how to use this model to translate legal text from Spanish to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_es\_fr model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
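The training procedure names AdaFactor with an inverse square root learning rate schedule. As a sketch of what that schedule computes, the function below implements one common formulation; the peak rate and warm-up length are assumptions, since the card specifies neither.

```python
# Inverse-square-root learning-rate schedule sketch. peak_lr and warmup_steps
# are assumed values; the card only names the schedule, not its constants.
def inverse_sqrt_lr(step: int, peak_lr: float = 1e-2, warmup_steps: int = 10_000) -> float:
    """Linear warm-up to peak_lr, then decay proportional to 1/sqrt(step)."""
    step = max(step, 1)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * (warmup_steps ** 0.5) / (step ** 0.5)

for s in (1, 5_000, 10_000, 40_000, 250_000):
    print(s, round(inverse_sqrt_lr(s), 6))
```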
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07425962388515472, 0.1461554914712906, -0.0045255799777805805, 0.0842820256948471, 0.05293905735015869, -0.0011683838674798608, 0.021706338971853256, 0.09117957949638367, -0.08234007656574249, 0.06500212848186493, 0.04706244915723801, 0.011258231475949287, 0.05658871680498123, 0.02037157118320465, 0.040054768323898315, -0.17451968789100647, -0.002811599988490343, -0.02600925788283348, -0.03970181196928024, 0.10181780904531479, 0.08876221626996994, -0.05102318525314331, 0.04373447597026825, -0.014103181660175323, -0.06107742711901665, 0.045002613216638565, -0.08758977800607681, -0.05559860169887543, 0.10982254147529602, 0.07802607864141464, 0.09761122614145279, 0.008439866825938225, 0.06999363005161285, -0.18036292493343353, -0.012463115155696869, 0.0700138658285141, -0.015462327748537064, 0.04891803488135338, 0.12089352309703827, -0.004100889898836613, 0.16415657103061676, -0.05507420748472214, 0.033003635704517365, 0.0483129508793354, -0.1179029569029808, -0.14363324642181396, -0.05864379554986954, -0.01506822369992733, 0.08533833175897598, 0.1321777105331421, -0.03295838087797165, 0.04268846660852432, -0.0166554544121027, 0.06443515419960022, 0.07098645716905594, -0.22879065573215485, -0.03329595550894737, -0.0026966596487909555, 0.058088235557079315, 0.09356389939785004, -0.03637479618191719, -0.0010423343628644943, 0.07158668339252472, 0.05129416659474373, 0.05942786484956741, -0.058902204036712646, -0.03434114158153534, -0.02205074392259121, -0.12947486340999603, -0.06513381004333496, 0.16127973794937134, 0.01445454265922308, -0.03749929741024971, -0.09602560847997665, -0.0686582699418068, -0.055635519325733185, -0.0006408597691915929, -0.045271120965480804, 0.024996750056743622, -0.0032903675455600023, 0.0517694316804409, -0.042452551424503326, -0.111326202750206, -0.06352446973323822, -0.06268846243619919, 0.06753537803888321, 0.05043772980570793, 0.00970455538481474, 0.037800684571266174, 0.09006217122077942, -0.1100679412484169, -0.07434983551502228, 0.0290816780179739, 0.009750694036483765, -0.08830181509256363, 0.017041433602571487, 0.005522132385522127, -0.192335844039917, -0.003998269326984882, -0.03562844172120094, -0.11664944142103195, 0.03242756426334381, 0.04624893143773079, 0.03933684155344963, 0.056004367768764496, 0.11745185405015945, -0.08017181605100632, -0.1012202724814415, -0.048326026648283005, -0.005834650713950396, -0.03405676782131195, 0.03231796249747276, -0.06260667741298676, -0.03717809543013573, -0.023321976885199547, 0.04463635012507439, 0.013391701504588127, -0.015343696810305119, -0.02881467528641224, -0.01992424950003624, 0.1086282804608345, -0.08613813668489456, 0.028268128633499146, 0.01138242892920971, -0.1001635268330574, -0.030199509114027023, 0.04843943938612938, -0.01993434689939022, -0.1152389869093895, 0.036908626556396484, -0.0275957603007555, -0.024052424356341362, -0.12018837779760361, -0.1832491010427475, -0.0015527601353824139, 0.005410973448306322, -0.0648336336016655, -0.08798252791166306, -0.09188336133956909, -0.08978965878486633, 0.039119936525821686, -0.0708857923746109, 0.01363447867333889, -0.11337781697511673, 0.01198833528906107, 0.03285394236445427, -0.0061606960371136665, 0.07344407588243484, -0.04917971417307854, 0.043348755687475204, 0.020306603983044624, 0.06693995743989944, 0.013447007164359093, 0.03187505155801773, -0.0878949910402298, 0.042696770280599594, -0.08199971914291382, 0.1583864539861679, -0.009138082154095173, -0.022931847721338272, -0.1520867943763733, -0.06541480869054794, -0.0918329507112503, 
0.03859938308596611, 0.08906856179237366, 0.14206963777542114, -0.20599468052387238, -0.03222617134451866, 0.2315138280391693, -0.05698475241661072, -0.06685219705104828, 0.14250721037387848, -0.03287447243928909, 0.048837486654520035, 0.06289850920438766, 0.07124835252761841, 0.04536928981542587, -0.04764290526509285, -0.04566449671983719, 0.00933073926717043, 0.03337591141462326, 0.040485966950654984, 0.10600250959396362, -0.0773589089512825, 0.06740272790193558, -0.010809896513819695, 0.029137803241610527, 0.021124981343746185, -0.04389404132962227, -0.038145676255226135, 0.010768014006316662, -0.027215249836444855, 0.008628074079751968, 0.028010990470647812, 0.01759744994342327, -0.06692571192979813, -0.08908912539482117, -0.025509575381875038, 0.08141987770795822, -0.06263256072998047, 0.024230482056736946, 0.019785679876804352, -0.055207010358572006, -0.1056341752409935, 0.005492161959409714, -0.15098342299461365, 0.001607410260476172, 0.03671044111251831, -0.03264034166932106, 0.08846589177846909, 0.03981412574648857, 0.05635158717632294, 0.10398811101913452, -0.04498346149921417, -0.03655719757080078, -0.023118114098906517, -0.02690069004893303, -0.08456292003393173, -0.1279183179140091, -0.010679081082344055, -0.01560080423951149, 0.01847977377474308, -0.15107090771198273, 0.013877207413315773, -0.04694872349500656, 0.08310423046350479, 0.0013553912285715342, -0.016834961250424385, -0.0034660224337130785, 0.08549600839614868, -0.04171179234981537, -0.0295143723487854, 0.038475167006254196, -0.01723065972328186, -0.0275193452835083, 0.10545319318771362, -0.08797049522399902, -0.1197335496544838, 0.08964864164590836, -0.005465569440275431, -0.10163244605064392, -0.005866158753633499, -0.014684933237731457, -0.05402383580803871, -0.055771127343177795, -0.05977366119623184, 0.258756160736084, 0.04671025648713112, 0.17462024092674255, -0.1272633820772171, -0.052956122905015945, 0.024103162810206413, -0.029328834265470505, -0.02490422874689102, 0.17127731442451477, 0.06847888976335526, -0.15278272330760956, 0.09509381651878357, 0.044476818293333054, -0.0077727907337248325, 0.10803377628326416, 0.05263076722621918, -0.10888297855854034, 0.00023983034770935774, 0.07444227486848831, 0.004403510596603155, 0.038975004106760025, -0.09488987922668457, -0.021780820563435555, 0.021426569670438766, 0.06062505021691322, 0.07265821844339371, -0.12082504481077194, 0.07862118631601334, 0.07116792351007462, -0.038468435406684875, 0.037898506969213486, -0.043887797743082047, -0.038578737527132034, 0.10538832098245621, 0.015804952010512352, -0.061681464314460754, -0.043629903346300125, -0.03702142834663391, -0.10879296064376831, 0.18789994716644287, -0.09592302143573761, -0.23434588313102722, -0.13326042890548706, 0.009329600259661674, -0.03907090052962303, 0.027748247608542442, 0.045711249113082886, -0.05863320827484131, -0.03795459493994713, -0.07809314876794815, 0.07121101021766663, -0.11108993738889694, -0.0652858316898346, -0.09538990259170532, 0.061774611473083496, -0.015811558812856674, -0.15232284367084503, 0.031260427087545395, 0.01828998140990734, -0.033500246703624725, -0.013246290385723114, -0.04351891949772835, 0.1250750571489334, 0.13644930720329285, -0.0402558259665966, -0.04102536290884018, 0.0015408063773065805, 0.14315709471702576, -0.07323555648326874, 0.02614753693342209, 0.050783514976501465, 0.05719379335641861, 0.04249458387494087, 0.12600065767765045, 0.04095059260725975, -0.04512961208820343, 0.02707386203110218, 0.04904207959771156, -0.009653424844145775, -0.2625739574432373, 
-0.10692383348941803, -0.06189657002687454, -0.03658030182123184, 0.09516683220863342, 0.04050960764288902, -0.047553613781929016, 0.013757998123764992, -0.04521381855010986, 0.037955086678266525, 0.007389778271317482, 0.05730801075696945, 0.04841359332203865, -0.01751480996608734, 0.06386261433362961, -0.06122738495469093, -0.07866223901510239, 0.09732063114643097, 0.038040850311517715, 0.19176435470581055, -0.04828476160764694, 0.2284690886735916, 0.06026386469602585, 0.05970732122659683, -0.0128621282055974, 0.06959111988544464, -0.04165724664926529, 0.03847702220082283, -0.03751615807414055, -0.0590842068195343, 0.0038853308651596308, 0.05490949749946594, 0.004343674052506685, 0.009530998766422272, -0.07577169686555862, -0.0557606965303421, 0.07242736220359802, 0.19120962917804718, 0.07411882281303406, -0.20333077013492584, -0.03612298145890236, -0.007290031295269728, -0.06914228945970535, -0.09139609336853027, 0.01675848476588726, 0.16901427507400513, -0.08684884756803513, -0.011669457890093327, 0.021154403686523438, 0.1334923803806305, -0.11359330266714096, -0.018684187904000282, 0.02293403260409832, 0.02981926128268242, -0.013272112235426903, 0.12675251066684723, -0.2308105230331421, 0.1899685561656952, 0.017628677189350128, 0.05296427384018898, -0.03580803424119949, 0.03404764458537102, -0.06524036079645157, 0.004161701072007418, 0.11671032011508942, 0.023071164265275, -0.026818640530109406, -0.0908888503909111, -0.091337189078331, -0.022864943370223045, 0.06660468876361847, -0.04711807146668434, 0.08644662797451019, 0.06883753836154938, -0.0022980542853474617, -0.012395248748362064, 0.02696213871240616, -0.041958510875701904, -0.18077069520950317, 0.0021233540028333664, -0.010451673530042171, -0.03092140331864357, -0.010293594561517239, -0.037031568586826324, -0.07332165539264679, 0.21458344161510468, -0.1220911517739296, -0.06767638027667999, -0.06510560214519501, -0.004603060893714428, 0.137265145778656, -0.0644562691450119, 0.03045490011572838, 0.01721964403986931, 0.02453427202999592, -0.0416550450026989, -0.013273595832288265, 0.08885267376899719, -0.07637424021959305, -0.0504186674952507, -0.084379643201828, 0.12673445045948029, 0.053042974323034286, 0.04123824089765549, -0.011523121036589146, 0.02776332013309002, -0.01251521147787571, -0.09908038377761841, -0.017094800248742104, 0.006631126161664724, 0.15712112188339233, 0.03867742419242859, -0.07404257357120514, -0.08713579177856445, -0.07828318327665329, -0.0671529620885849, 0.14476703107357025, 0.16797879338264465, -0.053170718252658844, 0.04490435868501663, 0.18910546600818634, -0.12278497219085693, -0.1483789086341858, -0.050209809094667435, 0.10441193729639053, 0.07397187501192093, -0.03898957744240761, -0.1903974711894989, -0.01208527758717537, 0.10528364777565002, 0.006717002019286156, 0.03179183602333069, -0.4288448393344879, -0.13345322012901306, 0.0063537899404764175, 0.03439011424779892, 0.008334350772202015, -0.1030401736497879, -0.05116508528590202, -0.055022381246089935, -0.0999431163072586, 0.0867924913764, -0.018308348953723907, 0.08325478434562683, 0.010911158286035061, 0.0015934276161715388, 0.04578494653105736, -0.031019773334264755, 0.14158251881599426, -0.0003460595617070794, 0.0277912188321352, -0.03621848672628403, 0.05312572419643402, 0.022003639489412308, -0.00618164986371994, 0.13244906067848206, -0.030824147164821625, 0.05403279513120651, -0.17622727155685425, -0.043483175337314606, -0.05064335837960243, 0.010578575544059277, -0.043802954256534576, -0.055706653743982315, -0.03791559860110283, 
0.017664948478341103, 0.06656870245933533, -0.007105309050530195, -0.013085709884762764, -0.030357180163264275, 0.03316895291209221, 0.18030448257923126, 0.0889124870300293, 0.040356624871492386, -0.10712192952632904, 0.023431995883584023, 0.016233330592513084, 0.06270527839660645, -0.12621141970157623, 0.012826432473957539, 0.1479317843914032, 0.01780281960964203, 0.11170978099107742, -0.010281173512339592, -0.12968949973583221, 0.010448171757161617, 0.0751304104924202, -0.08436047285795212, -0.1406000405550003, -0.026449618861079216, 0.032570477575063705, -0.04213164746761322, -0.019862500950694084, 0.08861242234706879, -0.07298294454813004, -0.0393257774412632, -0.01660357415676117, 0.017976421862840652, -0.06134193390607834, 0.2142636626958847, 0.0013667098246514797, 0.05354969576001167, -0.053861308842897415, 0.09266988933086395, 0.15419311821460724, -0.1543716937303543, 0.02182486280798912, 0.18743006885051727, -0.058268699795007706, -0.04192614182829857, 0.052755530923604965, 0.12722186744213104, -0.03165595605969429, -0.0679057240486145, -0.032821692526340485, -0.025409284979104996, 0.015948399901390076, 0.008952250704169273, 0.02396482229232788, 0.03254232928156853, 0.00257012783549726, -0.04730386286973953, -0.09614371508359909, 0.08687789738178253, 0.08017277717590332, 0.009203276596963406, -0.01269213855266571, 0.10203083604574203, 0.010023375041782856, -0.008950669318437576, -0.010239437222480774, 0.01405679527670145, -0.05680049955844879, 0.020001649856567383, -0.06652463972568512, 0.007956549525260925, -0.0407404825091362, -0.012652620673179626, -0.033843591809272766, 0.0037929052487015724, -0.004238502588123083, 0.0014767521061003208, -0.045899584889411926, -0.037442151457071304, -0.03332512825727463, 0.048009153455495834, -0.08881115913391113, -0.029745128005743027, 0.011434604413807392, -0.03763594105839729, 0.05419837683439255, 0.010029376484453678, -0.0037633220199495554, 0.01038083154708147, -0.031071649864315987, 0.052653003484010696, 0.00944918766617775, 0.05761769786477089, 0.015230344608426094, -0.08541984856128693, 0.03887123242020607, 0.03372152894735336, -0.013052847236394882, -0.01807400956749916, 0.0137106254696846, -0.12744060158729553, -0.037881892174482346, -0.03730563819408417, -0.061477284878492355, -0.06342089921236038, 0.11308424174785614, 0.06395617872476578, 0.06693333387374878, 0.09992171823978424, -0.06822492182254791, 0.06435178965330124, -0.14194196462631226, -0.01186503004282713, 0.025853484869003296, -0.022480806335806847, -0.006870640441775322, -0.015892649069428444, 0.044039398431777954, -0.06430720537900925, 0.13698768615722656, 0.0322079174220562, 0.09354154765605927, 0.006719330791383982, -0.09957248717546463, -0.02333701029419899, 0.021249132230877876, 0.09077935665845871, -0.02076248824596405, -0.017010990530252457, -0.0916299968957901, 0.09420619159936905, 0.008457018062472343, 0.12541601061820984, 0.018730441108345985, 0.1285548061132431, 0.14782598614692688, 0.04637308418750763, -0.0011290733236819506, -0.09703431278467178, -0.056410253047943115, 0.08689950406551361, 0.008851715363562107, 0.05328318476676941, -0.04272448644042015, 0.10799004882574081, 0.12730450928211212, -0.1478881984949112, 0.12158437818288803, 0.01604033075273037, -0.08326173573732376, -0.06325575709342957, -0.08151963353157043, -0.04309365898370743, -0.052513059228658676, -0.03804827481508255, -0.10924863815307617, 0.027564890682697296, -0.006012313067913055, 0.054407596588134766, -0.029149049893021584, 0.09561450034379959, 0.0035071715246886015, 
-0.11951757967472076, 0.07032275199890137, 0.009766606613993645, 0.1260414719581604, -0.025092964991927147, 0.03968125209212303, 0.05366625636816025, -0.002275318605825305, 0.04814474284648895, 0.06348368525505066, -0.010062841698527336, 0.014785680919885635, 0.008034861646592617, -0.0481514036655426, -0.04130459204316139, 0.025503411889076233, 0.07684475183486938, 0.20065981149673462, 0.06096245348453522, -0.07284979522228241, -0.023546766489744186, 0.1837432086467743, -0.05653010681271553, -0.0762503445148468, -0.11010181903839111, 0.2054847776889801, 0.03851645439863205, 0.03939487412571907, 0.01716584898531437, -0.10100480914115906, -0.013238837011158466, 0.12461214512586594, 0.18469145894050598, -0.01954771764576435, -0.041278693825006485, 0.03279389068484306, -0.006401582155376673, 0.030447084456682205, 0.026624646037817, 0.05643920227885246, 0.2862200140953064, -0.09386713057756424, 0.046302955597639084, -0.06306204199790955, 0.04955439642071724, -0.011963151395320892, 0.15828180313110352, -0.008857506327331066, -0.0012004149612039328, -0.04044009745121002, 0.09604261070489883, 0.02060565911233425, -0.16481851041316986, 0.007466810289770365, -0.10128536820411682, -0.10733809322118759, 0.014916660264134407, 0.002499345922842622, 0.037071309983730316, 0.07349438220262527, 0.012455196119844913, 0.016437089070677757, 0.060074999928474426, 0.01457421388477087, -0.09093974530696869, -0.10914880782365799, -0.0019914566073566675, -0.06731916964054108, 0.11863468587398529, 0.006663376931101084, 0.14666904509067535, 0.09954041987657547, 0.012054650112986565, -0.08325105160474777, 0.08137363195419312, 0.028525304049253464, 0.03874535113573074, 0.0860484167933464, 0.07358190417289734, -0.006219610571861267, 0.06091630458831787, 0.02314024232327938, -0.04217783734202385, 0.061931490898132324, -0.058847058564424515, -0.012319204397499561, -0.12416012585163116, 0.09786801040172577, -0.03334064036607742, 0.12159846723079681, 0.18031203746795654, -0.009336482733488083, 0.015033450908958912, -0.05333250015974045, 0.016065623611211777, -0.012097342871129513, 0.1023927703499794, -0.018155725672841072, -0.20268943905830383, 0.037765391170978546, -0.03042590618133545, 0.04545575752854347, -0.24021433293819427, -0.015346121042966843, 0.03435352072119713, -0.046426717191934586, -0.031288404017686844, 0.06965820491313934, 0.07193296402692795, 0.030136337503790855, -0.031266335397958755, -0.10716122388839722, -0.015628119930624962, 0.10292063653469086, -0.07815924286842346, -0.09360214322805405 ]
null
null
transformers
# legal_t5_small_multitask_es_it model

Model for translating legal text from Spanish to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_es_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Spanish to Italian.

### How to use

Here is how to use this model to translate legal text from Spanish to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_it", do_lower_case=False, skip_special_tokens=True),
    device=0
)

es_text = "Por el Parlamento Europeo Por el Consejo"

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_it model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_it | 37.386|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
{"language": "Spanish Italian", "tags": ["translation Spanish Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Por el Parlamento Europeo Por el Consejo"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_it model
=========================================

Model for translating legal text from Spanish to Italian. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_es\_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Spanish to Italian.

### How to use

Here is how to use this model to translate legal text from Spanish to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_es\_it model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
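The card above reports a BLEU score of 37.386 for this model but, with the table stripped, stops at "Test results :". As a hedged sketch of how such a score could be reproduced with the `sacrebleu` package — the test-set file names (`test.es`, `test.it`) are hypothetical placeholders, since the exact evaluation split is not published here:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline
import sacrebleu  # assumption: installed via `pip install sacrebleu`

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_it"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_es_it", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

# Hypothetical files: one Spanish source sentence per line, plus its Italian reference.
with open("test.es", encoding="utf-8") as f:
    sources = [line.strip() for line in f]
with open("test.it", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# Translate and score; the figure reported for this model is 37.386 BLEU.
hypotheses = [out["translation_text"] for out in pipeline(sources, max_length=512)]
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
```

Whether the original evaluation used sacrebleu's default tokenization is not stated, so small deviations from the reported figure would be expected.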
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08355816453695297, 0.15160785615444183, -0.004137046169489622, 0.08439094573259354, 0.06506606936454773, 0.014622630551457405, 0.02885596640408039, 0.11068720370531082, -0.06982318311929703, 0.07734876871109009, 0.03261934593319893, 0.019503969699144363, 0.07130620628595352, 0.026131771504878998, 0.022346461191773415, -0.18433408439159393, 0.0013725744793191552, -0.02546551823616028, -0.028321783989667892, 0.10606251657009125, 0.08691814541816711, -0.06446821987628937, 0.03723951056599617, -0.03380432724952698, -0.07827426493167877, 0.040331270545721054, -0.0880681574344635, -0.049733396619558334, 0.10661780089139938, 0.07826404273509979, 0.1003640741109848, 0.005617537070065737, 0.07688958942890167, -0.17219267785549164, -0.01184528972953558, 0.0772026777267456, -0.006931825540959835, 0.04154568910598755, 0.12503327429294586, 0.005805881228297949, 0.1685844361782074, -0.03821288049221039, 0.02467062696814537, 0.04514237120747566, -0.11634832620620728, -0.13251537084579468, -0.053410373628139496, -0.011376623995602131, 0.09192735701799393, 0.133601114153862, -0.037438321858644485, 0.05932790786027908, -0.015635544434189796, 0.06632822006940842, 0.0680982768535614, -0.21806617081165314, -0.03836393356323242, -0.006177166942507029, 0.04626065120100975, 0.1065651923418045, -0.02641446888446808, -0.001606734818778932, 0.07182851433753967, 0.049622487276792526, 0.037970807403326035, -0.04347638413310051, -0.026928840205073357, -0.026476198807358742, -0.1317533701658249, -0.062497057020664215, 0.16819757223129272, 0.008118094876408577, -0.03723722696304321, -0.09412936121225357, -0.06845294684171677, -0.04873805493116379, 0.011479087173938751, -0.041834406554698944, 0.007591858971863985, 0.0030755021143704653, 0.06165396794676781, -0.02974126674234867, -0.11823286861181259, -0.0680067166686058, -0.05261581018567085, 0.09892676025629044, 0.06112438812851906, 0.008070315234363079, 0.03485432639718056, 0.08897478133440018, -0.09633361548185349, -0.08082230389118195, 0.030297204852104187, 0.017388595268130302, -0.08259139209985733, 0.011557201854884624, -0.00102342595346272, -0.21281544864177704, -0.006230681668967009, -0.045741088688373566, -0.08173098415136337, 0.025116141885519028, 0.049178145825862885, 0.04545269533991814, 0.055927127599716187, 0.11005020141601562, -0.09036420285701752, -0.09840647131204605, -0.04526752978563309, -0.0076669612899422646, -0.021083444356918335, 0.019433528184890747, -0.075981505215168, -0.04103456437587738, -0.02592393010854721, 0.05244027078151703, 0.02243494801223278, -0.005694459192454815, -0.03942057117819786, -0.030039973556995392, 0.09020750224590302, -0.09615934640169144, 0.028065688908100128, 0.013820910826325417, -0.1044669970870018, -0.020484570413827896, 0.03651944547891617, -0.027257433161139488, -0.12098836153745651, 0.04776361212134361, -0.029864253476262093, -0.025646423920989037, -0.12728917598724365, -0.18199802935123444, 0.009342162869870663, -0.018130531534552574, -0.05980623885989189, -0.09180576354265213, -0.09586561471223831, -0.08653574436903, 0.027558600530028343, -0.07375030964612961, 0.021720105782151222, -0.10392580181360245, 0.024359984323382378, 0.02032681182026863, -0.01267368532717228, 0.07163286954164505, -0.04478644207119942, 0.0583416223526001, 0.02617829293012619, 0.06879866123199463, 0.009017536416649818, 0.03370918333530426, -0.10014098137617111, 0.034591738134622574, -0.07093586772680283, 0.14436842501163483, -0.005884442012757063, -0.0076890792697668076, -0.14268679916858673, -0.05731150507926941, -0.0891999676823616, 
0.03717522323131561, 0.09343356639146805, 0.15041495859622955, -0.21528102457523346, -0.023950276896357536, 0.21938328444957733, -0.06867821514606476, -0.07003834843635559, 0.1374707967042923, -0.025204036384820938, 0.06978275626897812, 0.06936556100845337, 0.07728925347328186, 0.039679091423749924, -0.049553610384464264, -0.051847897469997406, -0.009022608399391174, 0.025039687752723694, 0.03437260538339615, 0.0980423167347908, -0.06690842658281326, 0.0982171893119812, -0.001565140439197421, 0.016139501705765724, 0.018540510907769203, -0.04722488299012184, -0.040345996618270874, 0.005557869095355272, -0.043004442006349564, -0.008238349109888077, 0.02977103926241398, 0.019299369305372238, -0.06918172538280487, -0.0929606705904007, -0.04095258563756943, 0.08924950659275055, -0.07069084048271179, 0.02848386950790882, 0.006880188826471567, -0.029667096212506294, -0.10625898092985153, 0.00847602915018797, -0.14885392785072327, 0.0019029404502362013, 0.04678950086236, -0.021350162103772163, 0.08381063491106033, 0.03405402973294258, 0.0558505542576313, 0.09815162420272827, -0.0445498451590538, -0.03786807507276535, -0.013828886672854424, -0.025732532143592834, -0.09198125451803207, -0.13052865862846375, -0.02774311974644661, -0.009531336836516857, 0.016025936231017113, -0.14872662723064423, 0.018929079174995422, -0.03452488034963608, 0.09457359462976456, -0.0037385160103440285, -0.023065907880663872, -0.005766891408711672, 0.08043809235095978, -0.0498807318508625, -0.038241513073444366, 0.03648693859577179, -0.012191788293421268, -0.038846924901008606, 0.09458274394273758, -0.09838317334651947, -0.113472580909729, 0.08533283323049545, -0.01264704205095768, -0.09317918866872787, -0.009777910076081753, -0.010140910744667053, -0.061330489814281464, -0.05933157354593277, -0.06796016544103622, 0.23950207233428955, 0.042566195130348206, 0.1521756947040558, -0.1277073621749878, -0.03761133924126625, 0.02645348571240902, -0.030666129663586617, -0.030941464006900787, 0.15831062197685242, 0.0747169628739357, -0.16533589363098145, 0.09829544275999069, 0.044704873114824295, -0.017069127410650253, 0.11288324743509293, 0.06105255335569382, -0.11933295428752899, 0.00711636571213603, 0.07632192969322205, 0.006294316612184048, 0.034104086458683014, -0.10752102732658386, -0.009627385064959526, 0.020689696073532104, 0.06250627338886261, 0.08022917062044144, -0.12008456140756607, 0.0675019770860672, 0.06863841414451599, -0.030624937266111374, 0.032533418387174606, -0.05889509618282318, -0.04861192777752876, 0.10913306474685669, 0.014198705554008484, -0.05823775380849838, -0.036597542464733124, -0.035879455506801605, -0.10331744700670242, 0.17730797827243805, -0.08974198251962662, -0.23506009578704834, -0.13735279440879822, 0.008621319197118282, -0.026609333232045174, 0.03409828245639801, 0.036921076476573944, -0.05380961298942566, -0.03876256197690964, -0.0667421817779541, 0.07579104602336884, -0.1082720011472702, -0.06626671552658081, -0.09864524006843567, 0.06821897625923157, -0.02286868914961815, -0.13794565200805664, 0.034594204276800156, 0.017609108239412308, -0.03850327059626579, -0.010098663158714771, -0.0596490353345871, 0.1387357860803604, 0.1459571123123169, -0.03545995429158211, -0.03056660108268261, 0.0010523441014811397, 0.11695456504821777, -0.08656452596187592, 0.03400940075516701, 0.06521124392747879, 0.03689730912446976, 0.03523610904812813, 0.1276741474866867, 0.03715388476848602, -0.05346264690160751, 0.02292250096797943, 0.04409638047218323, -0.009541919454932213, -0.26608383655548096, 
-0.10037465393543243, -0.06440022587776184, -0.03242633119225502, 0.08752970397472382, 0.038493819534778595, -0.05797232314944267, 0.021516727283596992, -0.04861941188573837, 0.014670858159661293, 0.009564495645463467, 0.056004106998443604, 0.0277622751891613, -0.02099529840052128, 0.06682801991701126, -0.05767838656902313, -0.07110655307769775, 0.10031677782535553, 0.05543748661875725, 0.20848050713539124, -0.04812762513756752, 0.23862148821353912, 0.0511881560087204, 0.07019860297441483, -0.007808386348187923, 0.07025711238384247, -0.034847188740968704, 0.030616607517004013, -0.03132157027721405, -0.06152373179793358, -0.0024201746564358473, 0.061461132019758224, 0.0067977034486830235, 0.003526875749230385, -0.06501853466033936, -0.04971031844615936, 0.07287172973155975, 0.20668724179267883, 0.07131616026163101, -0.2107698619365692, -0.03502630814909935, -0.003891328815370798, -0.05511217191815376, -0.08932377398014069, 0.004913084674626589, 0.1699676811695099, -0.07267769426107407, -0.014495490118861198, 0.021245721727609634, 0.13498780131340027, -0.13347391784191132, -0.024580180644989014, 0.016238534823060036, 0.030837150290608406, -0.01634247601032257, 0.1388842612504959, -0.23272134363651276, 0.19020241498947144, 0.020421292632818222, 0.07512170821428299, -0.05324501171708107, 0.03086654469370842, -0.06819156557321548, 0.008679086342453957, 0.11346110701560974, 0.0163874588906765, -0.02690945565700531, -0.11444000899791718, -0.10589201748371124, -0.0228299368172884, 0.08356986194849014, -0.03852654621005058, 0.08929429203271866, 0.06362292915582657, 0.008603246882557869, -0.006373053882271051, 0.04042878374457359, -0.023712143301963806, -0.1750071495771408, 0.011999819427728653, 0.0040380023419857025, -0.03940291330218315, -0.007975234650075436, -0.04592743515968323, -0.0576496385037899, 0.21803973615169525, -0.12377963960170746, -0.06635843217372894, -0.0714322030544281, 0.016119087114930153, 0.1281132847070694, -0.06403648108243942, 0.01411387324333191, 0.0193711556494236, 0.027005622163414955, -0.038509905338287354, -0.00901463720947504, 0.09760896116495132, -0.06764093041419983, -0.06226707994937897, -0.0872209444642067, 0.11476434022188187, 0.05677379295229912, 0.03918962925672531, -0.008944147266447544, 0.0244512427598238, -0.00457408744841814, -0.08439833670854568, -0.006348494440317154, 0.01650877110660076, 0.15663966536521912, 0.04872124269604683, -0.08195760101079941, -0.08184026926755905, -0.08141939342021942, -0.07807714492082596, 0.13598108291625977, 0.16413003206253052, -0.04981944337487221, 0.020734189078211784, 0.1827099323272705, -0.1288948804140091, -0.15021885931491852, -0.03458067402243614, 0.09434535354375839, 0.0836080014705658, -0.037614744156599045, -0.1876991093158722, -0.01307439710944891, 0.12588396668434143, 0.001673070597462356, 0.043299295008182526, -0.4295373857021332, -0.12490098178386688, 0.00019630248425528407, 0.0451798178255558, 0.005281368270516396, -0.1059146448969841, -0.047786980867385864, -0.05145725607872009, -0.09599975496530533, 0.06278058141469955, -0.006614038720726967, 0.09035415202379227, 0.012574773281812668, -0.002101803896948695, 0.0531855933368206, -0.03696095198392868, 0.13459080457687378, -0.0032465490512549877, 0.02408827841281891, -0.04593726620078087, 0.04811736196279526, 0.021436695009469986, 0.0004310471413191408, 0.1324840784072876, -0.034180182963609695, 0.04144349321722984, -0.1616269201040268, -0.05223274603486061, -0.046225517988204956, 0.013633177615702152, -0.041012074798345566, -0.06262215226888657, 
-0.030784575268626213, 0.02353345975279808, 0.05599969998002052, 0.0009028022177517414, -0.0008925448055379093, -0.047945793718099594, 0.035445813089609146, 0.1753222644329071, 0.09044256806373596, 0.033273179084062576, -0.09450138360261917, 0.006334968842566013, 0.016249144449830055, 0.06011989712715149, -0.10465332120656967, 0.010267320089042187, 0.1516098976135254, 0.015596390701830387, 0.10618595778942108, -0.006788210012018681, -0.13566337525844574, 0.018658095970749855, 0.08460278064012527, -0.08015790581703186, -0.1386052370071411, -0.024382397532463074, 0.03641420230269432, -0.049350861459970474, 0.0022732059005647898, 0.09072136133909225, -0.07420186698436737, -0.03466295823454857, -0.02058550715446472, 0.029838774353265762, -0.05847729369997978, 0.22995783388614655, 0.005625942721962929, 0.043286826461553574, -0.053315598517656326, 0.11714036017656326, 0.15100876986980438, -0.14979787170886993, 0.023507017642259598, 0.1848844587802887, -0.05872218683362007, -0.041127268224954605, 0.05337478965520859, 0.11359664797782898, -0.028590340167284012, -0.07499399036169052, -0.04485210031270981, -0.023183956742286682, 0.009256413206458092, -0.009311865083873272, 0.027017539367079735, 0.038641300052404404, -0.012626164592802525, -0.04122980684041977, -0.10857801884412766, 0.0824662372469902, 0.08762244880199432, 0.01314907893538475, -0.02715408056974411, 0.12001504749059677, 0.017981259152293205, -0.03273996338248253, -0.010550644248723984, 0.007524221669882536, -0.052522704005241394, 0.030513959005475044, -0.057979051023721695, -0.0042156074196100235, -0.04169521853327751, -0.019323980435729027, -0.03902289643883705, 0.00010068169649457559, -0.012627694755792618, 0.0027755165938287973, -0.05082147940993309, -0.03742377087473869, -0.039536647498607635, 0.03632097318768501, -0.08446329832077026, -0.0291584599763155, 0.0030933916568756104, -0.03441977500915527, 0.057863663882017136, 0.013576340861618519, -0.007477422244846821, 0.010242079384624958, -0.031062785536050797, 0.06721818447113037, 0.013945021666586399, 0.05561140179634094, 0.01339676696807146, -0.07079745084047318, 0.038780707865953445, 0.044824812561273575, -0.023439640179276466, -0.013358492404222488, 0.01804676465690136, -0.1245238333940506, -0.024380624294281006, -0.031221551820635796, -0.05887257307767868, -0.060198139399290085, 0.11652388423681259, 0.0715511366724968, 0.06549816578626633, 0.089073047041893, -0.05954405292868614, 0.07187367230653763, -0.14389513432979584, -0.0057768067345023155, 0.03323592245578766, -0.03216143697500229, -0.007643751800060272, -0.00915202870965004, 0.05101426690816879, -0.06666188687086105, 0.12005356699228287, 0.024088304489850998, 0.08204210549592972, 0.00507670734077692, -0.08178544789552689, -0.010067055001854897, 0.017760194838047028, 0.10696422308683395, -0.023838823661208153, -0.016197143122553825, -0.08422042429447174, 0.09308288246393204, -0.0037230313755571842, 0.12829884886741638, 0.01575503684580326, 0.1365361511707306, 0.14742250740528107, 0.04986720159649849, 0.019251428544521332, -0.09901057928800583, -0.07027816027402878, 0.08500753343105316, -0.0062268818728625774, 0.05745631828904152, -0.03402906656265259, 0.11538746953010559, 0.1348656564950943, -0.14719101786613464, 0.10210146009922028, 0.004629760980606079, -0.0885324627161026, -0.05630666762590408, -0.11094190180301666, -0.041064273566007614, -0.051850575953722, -0.03369907662272453, -0.11715762317180634, 0.025014566257596016, -0.003694049082696438, 0.04208649694919586, -0.04908674955368042, 0.10439713299274445, 
0.023297101259231567, -0.11490921676158905, 0.07482302188873291, 0.00922924280166626, 0.1285407394170761, -0.024853341281414032, 0.015367994084954262, 0.047518469393253326, -0.003613665932789445, 0.05305789038538933, 0.056624311953783035, -0.006064009387046099, 0.0067552365362644196, 0.007020717021077871, -0.049048323184251785, -0.04236006736755371, 0.01949731633067131, 0.07327528297901154, 0.1953088343143463, 0.048064082860946655, -0.06151048466563225, -0.029060475528240204, 0.18061064183712006, -0.05220939591526985, -0.06299176812171936, -0.10439179837703705, 0.19933772087097168, 0.03735974058508873, 0.03729298338294029, 0.017910348251461983, -0.10809478163719177, -0.009121948853135109, 0.1296175867319107, 0.1737831085920334, -0.03058769553899765, -0.046358805149793625, 0.03253437951207161, -0.004226807504892349, 0.01838514767587185, 0.04094155877828598, 0.04605359956622124, 0.27241742610931396, -0.08849956840276718, 0.06267237663269043, -0.06001183018088341, 0.05357527732849121, -0.012376301921904087, 0.15695850551128387, -0.008905472233891487, 0.0038898272905498743, -0.055629853159189224, 0.10068824142217636, 0.029933203011751175, -0.1523352563381195, 0.002665681531652808, -0.1102648377418518, -0.11867807805538177, 0.016282523050904274, 0.011266294866800308, 0.02452673763036728, 0.08952904492616653, 0.003734699683263898, 0.022730093449354172, 0.05248628184199333, 0.016262993216514587, -0.09278053045272827, -0.1261889636516571, 0.0008354504243470728, -0.07781951129436493, 0.11829643696546555, 0.0023307118099182844, 0.14213594794273376, 0.10192637145519257, 0.0243068840354681, -0.08897940814495087, 0.09220024943351746, 0.024258093908429146, 0.028597459197044373, 0.08462722599506378, 0.07232489436864853, -0.01984425075352192, 0.06906122714281082, 0.038385067135095596, -0.05458459258079529, 0.06260635703802109, -0.06551284343004227, -0.007767170667648315, -0.12324583530426025, 0.08793962001800537, -0.03250284865498543, 0.13087104260921478, 0.17962326109409332, -0.012613865546882153, -0.00027746587875299156, -0.049929242581129074, 0.022690845653414726, -0.0020602906588464975, 0.1189587414264679, -0.01737508922815323, -0.2254580706357956, 0.017654765397310257, -0.030007416382431984, 0.04679618403315544, -0.22156435251235962, -0.01917070522904396, 0.03262168914079666, -0.056405991315841675, -0.04172264039516449, 0.06600341200828552, 0.07990363985300064, 0.030942462384700775, -0.027813691645860672, -0.12366039305925369, -0.009466095827519894, 0.10355281829833984, -0.0860002189874649, -0.09685772657394409 ]
null
null
transformers
# legal_t5_small_multitask_es_sv model

Model for translating legal text from Spanish to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_es_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Spanish to Swedish.

### How to use

Here is how to use this model to translate legal text from Spanish to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_es_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_es_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

es_text = "Tiempo de uso de la palabra ( artículo 149 del Reglamento PE)"

pipeline([es_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_es_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_es_sv | 37.975|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
{"language": "Spanish Swedish", "tags": ["translation Spanish Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Tiempo de uso de la palabra ( art\u00edculo 149 del Reglamento PE)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_es_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Spanish Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_es\_sv model
=========================================

Model for translating legal text from Spanish to Swedish. It was first released in this repository. The model is trained in parallel on three parallel corpora (JRC-Acquis, Europarl and DCEP) covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_es\_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Spanish to Swedish.

### How to use

Here is how to use this model to translate legal text from Spanish to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_es\_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
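The preprocessing paragraph above compresses a lot into one sentence, and it names both a unigram model and byte pair encoding, which SentencePiece treats as distinct model types. A minimal sketch of training the unigram variant follows; the input file name and the 32k vocabulary size are assumptions, since the cards state the 88M input lines but not the vocabulary size:

```python
import sentencepiece as spm

# Hypothetical input: 88M lines sampled from the parallel corpus of all language pairs.
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # assumed file name
    model_prefix="legal_t5_small",
    model_type="unigram",      # the cards name a unigram vocabulary model
    vocab_size=32000,          # assumption: typical T5-style size, not stated in the cards
    character_coverage=1.0,    # legal text spans many scripts and diacritics
)

# Load the resulting vocabulary and tokenize a sample sentence from the widget.
sp = spm.SentencePieceProcessor(model_file="legal_t5_small.model")
print(sp.encode("Tiempo de uso de la palabra", out_type=str))
```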
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Spanish Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Spanish to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_es\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06756756454706192, 0.11890660226345062, -0.003991520032286644, 0.08834561705589294, 0.05689753219485283, -0.00866197980940342, 0.011824684217572212, 0.09284720569849014, -0.08067986369132996, 0.06992091238498688, 0.05060974508523941, 0.01914338581264019, 0.07409464567899704, 0.01247334387153387, 0.029641203582286835, -0.2045460045337677, 0.0021990551613271236, -0.03574516624212265, -0.028656765818595886, 0.09790574759244919, 0.0903279036283493, -0.05325808376073837, 0.03158913552761078, -0.030434319749474525, -0.042019739747047424, 0.031620681285858154, -0.08766142278909683, -0.04717177152633667, 0.10460740327835083, 0.07480558007955551, 0.09497921913862228, 0.011725018732249737, 0.07690125703811646, -0.1692596971988678, -0.015857448801398277, 0.04669378697872162, -0.007999459281563759, 0.03723707050085068, 0.1134008914232254, 0.026593320071697235, 0.1843547523021698, -0.05221496522426605, 0.02830461971461773, 0.036612506955862045, -0.09750061482191086, -0.15400227904319763, -0.05359992757439613, -0.017840120941400528, 0.09466317296028137, 0.13060840964317322, -0.04614786431193352, 0.04392445087432861, -0.010921342298388481, 0.08137045800685883, 0.06567700207233429, -0.22726500034332275, -0.03604152053594589, 0.03375235199928284, 0.06586457043886185, 0.11398400366306305, -0.03221334517002106, 0.011160491034388542, 0.07938223332166672, 0.06737586855888367, 0.06950922310352325, -0.05687044933438301, -0.05723214149475098, -0.023291265591979027, -0.13679258525371552, -0.04760606214404106, 0.17032313346862793, 0.008436383679509163, -0.02977166511118412, -0.10193562507629395, -0.056869782507419586, -0.04401295259594917, 0.020470669493079185, -0.04427257925271988, 0.014781945385038853, -0.01082610059529543, 0.05488387495279312, -0.059893298894166946, -0.12833155691623688, -0.04991127550601959, -0.04626769945025444, 0.05947190150618553, 0.03962331637740135, 0.010993973352015018, 0.05286390334367752, 0.07037165015935898, -0.1122518926858902, -0.0871540755033493, 0.017312010750174522, 0.006621432490646839, -0.0904044583439827, 0.011669308878481388, -0.002521547954529524, -0.2532122731208801, -0.003720405977219343, -0.05118940770626068, -0.09091285616159439, 0.025629742071032524, 0.05510084331035614, 0.04400878772139549, 0.058170855045318604, 0.12577813863754272, -0.08258481323719025, -0.10575023293495178, -0.04785378649830818, -0.02931230701506138, -0.008268237113952637, 0.012168142944574356, -0.07391484826803207, -0.04015002027153969, -0.0045508467592298985, 0.037463657557964325, 0.0047597154043614864, 0.0010055616730824113, -0.02256612665951252, -0.015537434257566929, 0.07567335665225983, -0.09066746383905411, 0.01305600255727768, 0.003224149579182267, -0.09096615016460419, -0.04229448363184929, 0.054416149854660034, -0.014804787933826447, -0.11632236838340759, 0.06356701254844666, -0.01086411438882351, -0.015033629722893238, -0.10346003621816635, -0.19772467017173767, 0.016064902767539024, -0.02651698887348175, -0.05099528282880783, -0.08831589668989182, -0.08858687430620193, -0.0993257537484169, 0.03622350096702576, -0.06762093305587769, 0.004946403205394745, -0.11677145957946777, 0.005883980076760054, 0.03191351890563965, -0.03185058385133743, 0.08556758612394333, -0.04892915114760399, 0.04292862489819527, -0.0030903483275324106, 0.0740751326084137, 0.009187750518321991, 0.025416793301701546, -0.10694357752799988, 0.029929323121905327, -0.09108339250087738, 0.15329888463020325, -0.03908352181315422, -0.004219603259116411, -0.1359487622976303, -0.060819704085588455, -0.06964701414108276, 
0.048527248203754425, 0.08192206919193268, 0.1355280727148056, -0.21744103729724884, -0.025669844821095467, 0.21215032041072845, -0.07377925515174866, -0.06421361863613129, 0.12426325678825378, -0.017881814390420914, 0.05867001414299011, 0.07958204299211502, 0.09810856729745865, 0.03797075152397156, -0.04720810428261757, -0.05106937885284424, 0.01893950067460537, 0.018894247710704803, 0.025698553770780563, 0.09886311739683151, -0.07913775742053986, 0.09211189299821854, 0.01105376984924078, 0.028041772544384003, 0.009982414543628693, -0.025788182392716408, -0.03719066083431244, 0.012746026739478111, -0.030373720452189445, -0.022756170481443405, 0.032129283994436264, 0.013896607793867588, -0.07972443103790283, -0.07290438562631607, -0.0020882091484963894, 0.06231033429503441, -0.06670350581407547, 0.03912228345870972, 0.03871101140975952, -0.05850493535399437, -0.12268173694610596, 0.007081350311636925, -0.1454482525587082, -0.012169684283435345, 0.023576781153678894, -0.010875931940972805, 0.0870266854763031, 0.06928880512714386, 0.06297735124826431, 0.09782250225543976, -0.04473977908492088, -0.029672835022211075, -0.004460278432816267, -0.024691609665751457, -0.09554634988307953, -0.13099493086338043, -0.015019132755696774, -0.019325897097587585, 0.0014002290554344654, -0.14177484810352325, 0.003851764602586627, -0.03863442316651344, 0.0834139809012413, 0.0024486174806952477, -0.020851757377386093, 0.02477283962070942, 0.08891381323337555, -0.03782232105731964, -0.03358050063252449, 0.042447835206985474, -0.01587657816708088, -0.07120297849178314, 0.12304417788982391, -0.07590686529874802, -0.13463543355464935, 0.07989997416734695, -0.0029259994626045227, -0.0944681167602539, -0.006041497923433781, -0.006677779369056225, -0.05871514230966568, -0.06425537914037704, -0.061774421483278275, 0.24010175466537476, 0.05218975245952606, 0.1510140597820282, -0.12833192944526672, -0.04047025740146637, 0.021919310092926025, -0.05512329563498497, -0.025989672169089317, 0.18328140676021576, 0.05315527319908142, -0.1776786595582962, 0.09627509862184525, 0.006690832786262035, -0.016207661479711533, 0.15569467842578888, 0.059248682111501694, -0.10921484231948853, 0.013490927405655384, 0.06295820325613022, 0.0016070595011115074, 0.050698474049568176, -0.08039463311433792, -0.007958976551890373, 0.0300238486379385, 0.06248892471194267, 0.0722656175494194, -0.10842956602573395, 0.06297945231199265, 0.06675666570663452, -0.04431978985667229, 0.04560105502605438, -0.04138575494289398, -0.04892939701676369, 0.08731972426176071, 0.005154498852789402, -0.050268933176994324, -0.03922899439930916, -0.03743324428796768, -0.11493334174156189, 0.18496614694595337, -0.09612036496400833, -0.24535754323005676, -0.15378546714782715, 0.021179527044296265, -0.04218980297446251, 0.032596200704574585, 0.05756504461169243, -0.06298387050628662, -0.05221693962812424, -0.08213189989328384, 0.09565974026918411, -0.08318228274583817, -0.07354171574115753, -0.10754474997520447, 0.059788040816783905, -0.007899481803178787, -0.14106851816177368, 0.03227260336279869, 0.005133100785315037, -0.019306108355522156, 0.001456411904655397, -0.040800049901008606, 0.1273452788591385, 0.12026169151067734, -0.022989485412836075, -0.040330614894628525, 0.003893995424732566, 0.13502217829227448, -0.06655171513557434, 0.040181584656238556, 0.04367430880665779, 0.024174483492970467, 0.037932220846414566, 0.1427846997976303, 0.03724634647369385, -0.04265964403748512, 0.019364481791853905, 0.04977209120988846, -0.023764047771692276, 
-0.272341251373291, -0.09893681108951569, -0.057965271174907684, -0.021046757698059082, 0.08553823828697205, 0.041871801018714905, -0.08306216448545456, 0.029511064291000366, -0.04116163030266762, 0.006078870501369238, 0.00934123620390892, 0.05249420553445816, 0.023002276197075844, -0.026630260050296783, 0.07022052258253098, -0.05396125093102455, -0.055015020072460175, 0.09327004104852676, 0.03409399092197418, 0.18781481683254242, -0.055284567177295685, 0.20917335152626038, 0.05416003614664078, 0.05678560957312584, -0.011852511204779148, 0.07232284545898438, -0.0455610491335392, 0.028044739738106728, -0.0203706081956625, -0.06474379450082779, -0.01775938831269741, 0.061304301023483276, 0.005593558773398399, 0.00980541855096817, -0.051577091217041016, -0.03679610416293144, 0.06756352633237839, 0.19662924110889435, 0.08044655621051788, -0.178557887673378, -0.05990336090326309, 0.008219264447689056, -0.06597772240638733, -0.08391008526086807, 0.0009155611041933298, 0.17820274829864502, -0.08723977208137512, 0.01585531048476696, 0.007409689016640186, 0.13474613428115845, -0.11430028825998306, -0.018789315596222878, 0.007708487566560507, 0.02635718323290348, -0.01350375171750784, 0.13472610712051392, -0.22885650396347046, 0.18690364062786102, 0.01744101382791996, 0.06877082586288452, -0.04800654947757721, 0.031138546764850616, -0.07678931951522827, 0.0002308899856870994, 0.12349996715784073, 0.033796150237321854, -0.06728824973106384, -0.09939814358949661, -0.09774056822061539, -0.01653098501265049, 0.05475953593850136, -0.04416162520647049, 0.09287235140800476, 0.07522869110107422, 0.020045895129442215, -0.023982500657439232, 0.02156832441687584, -0.019266625866293907, -0.1601935625076294, -0.0009551177499815822, -0.018388869240880013, -0.033902499824762344, -0.0074087344110012054, -0.04359445348381996, -0.09429456293582916, 0.21935026347637177, -0.13575594127178192, -0.10496756434440613, -0.07583760470151901, 0.016684025526046753, 0.12571999430656433, -0.06684204190969467, 0.008226200938224792, 0.030399849638342857, 0.02443493716418743, -0.05551956221461296, -0.005124591290950775, 0.08466459810733795, -0.06078658998012543, -0.06171935051679611, -0.05691445991396904, 0.12539881467819214, 0.060635149478912354, 0.037507422268390656, -0.011181927286088467, 0.04125892370939255, -0.002291984623298049, -0.09721881151199341, -0.005884131882339716, 0.03223099932074547, 0.1607304960489273, 0.02966836281120777, -0.05982464551925659, -0.07601210474967957, -0.06928166002035141, -0.08424726873636246, 0.1510404795408249, 0.1657370626926422, -0.049100033938884735, 0.060249075293540955, 0.1920485496520996, -0.12337450683116913, -0.16992270946502686, -0.05687972158193588, 0.10986866056919098, 0.09503854811191559, -0.011444328352808952, -0.17085638642311096, 0.002430032240226865, 0.11011737585067749, 0.003650411730632186, 0.011706002056598663, -0.3940145671367645, -0.14121119678020477, 0.004326576367020607, 0.0401562824845314, 0.006338679231703281, -0.08425991982221603, -0.05915755033493042, -0.052314091473817825, -0.09382201731204987, 0.06475888937711716, -0.01578766107559204, 0.10023137927055359, 0.01586786098778248, 0.024976449087262154, 0.058678556233644485, -0.03924744948744774, 0.14350008964538574, -0.016378236934542656, 0.013772827573120594, -0.06836426258087158, 0.07907763123512268, 0.016026340425014496, -0.012291274033486843, 0.16005828976631165, -0.05534597858786583, 0.041876960545778275, -0.16555924713611603, -0.053310099989175797, -0.054146330803632736, 0.029034556820988655, -0.03950987756252289, 
-0.07281024008989334, -0.04131905734539032, 0.024246837943792343, 0.06966365873813629, -0.017406800761818886, 0.040915340185165405, -0.06928082555532455, 0.034318309277296066, 0.1786993443965912, 0.10618607699871063, 0.03041284717619419, -0.09294971823692322, 0.014437621459364891, 0.008833427913486958, 0.06381554901599884, -0.11750342696905136, 0.0057603647001087666, 0.15391018986701965, 0.014689494855701923, 0.10036733001470566, -0.03188692405819893, -0.13308292627334595, 0.02143005095422268, 0.07316245138645172, -0.09329912066459656, -0.1328243464231491, -0.022693447768688202, -0.007801946718245745, -0.04013841971755028, -0.014495358802378178, 0.10740555077791214, -0.08833683282136917, -0.024802055209875107, -0.015875237062573433, 0.032486919313669205, -0.05835682898759842, 0.22607137262821198, 0.012515361420810223, 0.05618671327829361, -0.059480227530002594, 0.11834952235221863, 0.13485708832740784, -0.13577944040298462, 0.04261384904384613, 0.18059466779232025, -0.06669826805591583, -0.040350109338760376, 0.0617792047560215, 0.13407056033611298, -0.03718792647123337, -0.06953512132167816, -0.027028711512684822, -0.037113480269908905, 0.0032596788369119167, -0.006833699997514486, 0.032308563590049744, 0.02853056974709034, -0.0023503347765654325, -0.04539250582456589, -0.08614196628332138, 0.09286535531282425, 0.0696018636226654, 0.0075691863894462585, -0.03477220609784126, 0.11382537335157394, -0.001369306119158864, -0.016948148608207703, -0.018518032506108284, 0.03130694106221199, -0.04866402968764305, 0.018172612413764, -0.06935077905654907, 0.006042651832103729, -0.04752575233578682, -0.008993903174996376, -0.03793824091553688, 0.0004908214905299246, -0.0051928297616541386, 0.005949697457253933, -0.04520609229803085, -0.0308214258402586, -0.05588836595416069, 0.029460228979587555, -0.0842326432466507, -0.045745253562927246, -0.006463667843490839, -0.017920859158039093, 0.04933089017868042, 0.00510836485773325, -0.013751192949712276, 0.035245902836322784, -0.0207594633102417, 0.07939309626817703, 0.032801639288663864, 0.04597703740000725, 0.023941047489643097, -0.05234759673476219, 0.005493753124028444, 0.03869374468922615, -0.0195942185819149, -0.010804843157529831, 0.014301559887826443, -0.12865838408470154, -0.04541419818997383, -0.026288975030183792, -0.053257133811712265, -0.06024184450507164, 0.11808426678180695, 0.06663598120212555, 0.07058945298194885, 0.12458712607622147, -0.07900569587945938, 0.07540463656187057, -0.1441613733768463, -0.005635139532387257, 0.03618145361542702, -0.03678559511899948, -0.011600401252508163, -0.008911400102078915, 0.04507454112172127, -0.08539628982543945, 0.12635625898838043, 0.0346330851316452, 0.08450213074684143, 0.007258344907313585, -0.08206046372652054, 0.0007562233367934823, 0.01775108464062214, 0.10903897881507874, -0.03297087922692299, -0.011615962721407413, -0.09032119810581207, 0.09301077574491501, -0.0010759824654087424, 0.10639795660972595, 0.031026484444737434, 0.1094760000705719, 0.1480565071105957, 0.058169394731521606, 0.016383318230509758, -0.07512064278125763, -0.07307600229978561, 0.0891563817858696, 0.014207334257662296, 0.06304299831390381, -0.018101876601576805, 0.09296540170907974, 0.14482955634593964, -0.15686513483524323, 0.1117570698261261, 0.009862753562629223, -0.08893689513206482, -0.071647509932518, -0.13233232498168945, -0.061328258365392685, -0.036641675978899, -0.028802836313843727, -0.12496764212846756, 0.015036789700388908, 0.000031380794098367915, 0.052160874009132385, -0.0323604941368103, 
0.10085969418287277, 0.013755335472524166, -0.1230473741889, 0.06655153632164001, 0.009366374462842941, 0.11532644182443619, -0.015343382954597473, 0.028514962643384933, 0.06374362111091614, 0.01649687997996807, 0.030598394572734833, 0.05348815396428108, -0.004053997807204723, -0.004258125089108944, 0.003848638851195574, -0.05794394016265869, -0.04146670922636986, 0.03572922572493553, 0.08621694147586823, 0.16978001594543457, 0.06337571889162064, -0.06931272894144058, -0.03705224394798279, 0.19006545841693878, -0.05423211306333542, -0.08090312778949738, -0.109959177672863, 0.19553466141223907, 0.0194662157446146, 0.054121848195791245, 0.016539188101887703, -0.10531392693519592, -0.000685148115735501, 0.1207834854722023, 0.1828763782978058, -0.007691974751651287, -0.0370912067592144, 0.004510614089667797, -0.011074140667915344, 0.013272919692099094, 0.04246695712208748, 0.02866247668862343, 0.259793221950531, -0.08039543032646179, 0.05719057098031044, -0.06681371480226517, 0.04335356503725052, -0.02016381174325943, 0.14949820935726166, -0.004549035336822271, 0.0005142817972227931, -0.04669469594955444, 0.11033754795789719, 0.003920691553503275, -0.15714718401432037, -0.0031860419549047947, -0.08514164388179779, -0.12929368019104004, 0.014179338701069355, 0.038720615208148956, 0.03824598714709282, 0.06970426440238953, 0.016354098916053772, 0.04047667235136032, 0.03328938037157059, 0.019056614488363266, -0.09009213745594025, -0.09811443835496902, -0.006918220780789852, -0.055224109441041946, 0.0894821286201477, 0.017437227070331573, 0.13137976825237274, 0.09429971128702164, 0.019363487139344215, -0.0769115462899208, 0.10065075755119324, 0.02836110256612301, 0.01957368664443493, 0.09706411510705948, 0.09549929946660995, -0.01089116744697094, 0.07351695746183395, 0.03792930394411087, -0.07233775407075882, 0.04119563847780228, -0.032470155507326126, -0.010789342224597931, -0.09899967163801193, 0.10276997089385986, -0.03881233185529709, 0.1261083334684372, 0.19071993231773376, -0.004889973439276218, -0.0033875061199069023, -0.06154493987560272, 0.028628727421164513, -0.01955263689160347, 0.09776002168655396, -0.009710256941616535, -0.2097330242395401, 0.010911351069808006, -0.018642999231815338, 0.03293052315711975, -0.2110617607831955, -0.01561411190778017, 0.028678175061941147, -0.04956531152129173, -0.02809114195406437, 0.07097020000219345, 0.06798192858695984, 0.021388355642557144, -0.021023226901888847, -0.08708278089761734, 0.008596846833825111, 0.10203637927770615, -0.09370426833629608, -0.10157197713851929 ]
null
null
transformers
# legal_t5_small_multitask_fr_cs model

Model for translating legal text from French to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_fr_cs model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from French to Czech.

### How to use

Here is how to use this model to translate legal text from French to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_cs"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_fr_cs", do_lower_case=False, skip_special_tokens=True),
    device=0
)

fr_text = "BUDG – Décision: aucun avis"

pipeline([fr_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_fr_cs model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_fr_cs | 44.499 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
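The training procedure above pairs AdaFactor with an inverse square root learning-rate schedule. The following is a minimal sketch of what such a schedule looks like; the warmup length and peak rate are illustrative assumptions, not values reported for this model.

```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, peak_lr: float = 1e-2) -> float:
    """Inverse square root schedule of the kind commonly paired with AdaFactor.

    warmup_steps and peak_lr are assumed for illustration; the model card
    does not report the actual values used.
    """
    # Hold the rate at its peak until warmup ends, then decay as 1/sqrt(step).
    return peak_lr * math.sqrt(warmup_steps) / math.sqrt(max(step, warmup_steps))

# The rate stays at 1e-2 through step 10k, then decays smoothly toward zero.
for step in (1, 10_000, 40_000, 250_000):
    print(step, round(inverse_sqrt_lr(step), 5))
```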
{"language": "French Cszech", "tags": ["translation French Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "BUDG \u2013 D\u00e9cision: aucun avis"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_fr_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation French Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation French Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_fr\_cs model
=========================================

Model for translating legal text from French to Czech. It was first released in this repository. The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_fr\_cs model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from French to Czech.

### How to use

Here is how to use this model to translate legal text from French to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_fr\_cs model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from French to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from French to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from French to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07491308450698853, 0.14238448441028595, -0.003390583209693432, 0.08577672392129898, 0.059895120561122894, -0.002696229610592127, 0.01718750223517418, 0.1001519188284874, -0.03887433186173439, 0.07670965045690536, 0.061417434364557266, -0.0028556999750435352, 0.04131775349378586, 0.038334835320711136, 0.053843338042497635, -0.182728111743927, -0.0037316144444048405, -0.026047907769680023, -0.04887779429554939, 0.11043243110179901, 0.08752693235874176, -0.04710431024432182, 0.057114481925964355, -0.022891690954566002, -0.07780299335718155, 0.03326791524887085, -0.0811956450343132, -0.040295373648405075, 0.10652127861976624, 0.07490217685699463, 0.07550196349620819, -0.013195445761084557, 0.06477981060743332, -0.19092577695846558, -0.0023082171101123095, 0.060180582106113434, -0.02376691997051239, 0.05108514055609703, 0.09959696233272552, -0.013681074604392052, 0.1901112198829651, -0.06808362156152725, 0.027282048016786575, 0.05280536040663719, -0.10065849870443344, -0.0991259440779686, -0.0761624425649643, 0.03151426836848259, 0.0774889588356018, 0.13811446726322174, -0.024896208196878433, 0.029557377099990845, -0.02776148170232773, 0.07359927892684937, 0.12089132517576218, -0.25663289427757263, -0.023398427292704582, 0.03392424061894417, 0.05802877992391586, 0.07994524389505386, -0.05478020757436752, 0.018914978951215744, 0.04943666607141495, 0.05599312484264374, 0.07542961835861206, -0.05235576257109642, 0.00746179698035121, -0.01157103106379509, -0.12425552308559418, -0.07292735576629639, 0.1643933653831482, 0.03467874974012375, -0.0316588468849659, -0.10214296728372574, -0.05932585522532463, -0.08125662803649902, -0.012709559872746468, -0.02827710658311844, 0.02040969207882881, -0.00972103513777256, 0.03316137194633484, -0.060271695256233215, -0.10797017067670822, -0.07270005345344543, -0.06930013000965118, 0.0590004026889801, 0.024500099942088127, 0.0164496973156929, 0.04631281644105911, 0.07985083013772964, -0.12995094060897827, -0.06442172080278397, 0.0079915476962924, 0.01894475892186165, -0.09580658376216888, 0.019803134724497795, 0.0007524445536546409, -0.15276579558849335, 0.001733478158712387, -0.007891412824392319, -0.08741134405136108, 0.02515648864209652, 0.0397447869181633, 0.03502899408340454, 0.042726822197437286, 0.12181401252746582, -0.10228314995765686, -0.12007517367601395, -0.033679381012916565, -0.005833824630826712, -0.020446976646780968, 0.021833976730704308, -0.062305789440870285, -0.039702288806438446, 0.0075902617536485195, 0.05622386187314987, 0.0003781427221838385, -0.004041983745992184, -0.0011288300156593323, -0.025309178978204727, 0.13545061647891998, -0.09314535558223724, 0.018318161368370056, 0.005375402048230171, -0.09465006738901138, -0.01825673133134842, 0.058049336075782776, -0.031587012112140656, -0.10433809459209442, 0.047543201595544815, -0.04642575606703758, -0.008510385639965534, -0.10254886746406555, -0.17267394065856934, -0.0009453017846681178, 0.013401159085333347, -0.07041129469871521, -0.08803075551986694, -0.11512633413076401, -0.07205550372600555, 0.029822178184986115, -0.05771991237998009, 0.013592014089226723, -0.09864015877246857, -0.004972572438418865, 0.04193883389234543, 0.00013282327563501894, 0.061475325375795364, -0.05017396807670593, 0.032745759934186935, -0.010298134759068489, 0.055805716663599014, 0.0003858034615404904, 0.02997351810336113, -0.06413614004850388, 0.052046965807676315, -0.08161100000143051, 0.14988954365253448, -0.014038939960300922, -0.01593621075153351, -0.14462533593177795, -0.06230616569519043, 
-0.08677713572978973, 0.03429635986685753, 0.08246475458145142, 0.13858461380004883, -0.1976996213197708, -0.04098353162407875, 0.22240431606769562, -0.06350062787532806, -0.06724251806735992, 0.1347050964832306, -0.030520759522914886, 0.020413009449839592, 0.06946808099746704, 0.08712736517190933, 0.031558044254779816, -0.04912138730287552, -0.06269912421703339, 0.004574260674417019, 0.023899998515844345, 0.01724918745458126, 0.11018527299165726, -0.07266411930322647, 0.08057618141174316, -0.002116908086463809, 0.04498875141143799, 0.013127324171364307, -0.04840126633644104, -0.03668125346302986, -0.0015978877199813724, -0.03269077464938164, -0.010413271375000477, 0.008482895791530609, 0.01943536289036274, -0.06317752599716187, -0.08469051122665405, 0.017476452514529228, 0.10240919888019562, -0.06950752437114716, 0.027944674715399742, 0.02146332338452339, -0.04365459829568863, -0.10479510575532913, 0.013842843472957611, -0.15487472712993622, 0.0028389759827405214, 0.03047914057970047, -0.03466540202498436, 0.09818195551633835, 0.022141601890325546, 0.05748364329338074, 0.0959170013666153, -0.05450867488980293, -0.01732717640697956, -0.030271373689174652, -0.02307850681245327, -0.09806735068559647, -0.10658778250217438, -0.03083617612719536, -0.010406175628304482, 0.03897621110081673, -0.15903007984161377, 0.013217335566878319, -0.02676089107990265, 0.08307559788227081, 0.0044249240309000015, -0.028887078166007996, 0.006627977825701237, 0.05474928021430969, -0.02864128351211548, -0.02382764406502247, 0.031231539323925972, -0.012951886281371117, -0.024848319590091705, 0.09957029670476913, -0.11061099916696548, -0.09319360554218292, 0.10684753954410553, 0.027981525287032127, -0.09755706787109375, -0.018667591735720634, -0.014290188439190388, -0.04523211345076561, -0.04122988134622574, -0.06586620956659317, 0.19420912861824036, 0.05462845042347908, 0.1690692901611328, -0.10966470092535019, -0.06262676417827606, 0.02259654551744461, -0.014631994068622589, -0.015635071322321892, 0.16133800148963928, 0.045480407774448395, -0.16196846961975098, 0.09130586683750153, 0.03785610944032669, -0.011424585245549679, 0.12932080030441284, 0.05674781650304794, -0.1076178252696991, -0.02333260513842106, 0.04608217626810074, 0.004040989559143782, 0.05171952396631241, -0.08611844480037689, -0.038100551813840866, 0.02592400833964348, 0.06389706581830978, 0.06547272205352783, -0.10240927338600159, 0.0754178911447525, 0.0629805400967598, -0.042695432901382446, 0.06783370673656464, -0.03238363936543465, -0.0484861321747303, 0.11307503283023834, 0.02464372105896473, -0.039100389927625656, -0.05223844572901726, -0.041582174599170685, -0.1097836047410965, 0.19712962210178375, -0.09610707312822342, -0.22475598752498627, -0.13272398710250854, 0.02048165164887905, -0.060746531933546066, 0.016683833673596382, 0.03406204283237457, -0.04469428211450577, -0.034638065844774246, -0.11050206422805786, 0.06523492187261581, -0.10847029089927673, -0.04266206920146942, -0.08735993504524231, 0.04721403121948242, -0.025356408208608627, -0.14790670573711395, 0.015396115370094776, -0.0004312071541789919, -0.04956042021512985, -0.012412832118570805, -0.049904290586709976, 0.11585857719182968, 0.1532810479402542, -0.03744339197874069, -0.023361604660749435, 0.005364013370126486, 0.12689951062202454, -0.06360366195440292, 0.037123993039131165, 0.0432196706533432, 0.07178299129009247, 0.024935489520430565, 0.10772810131311417, 0.045799050480127335, -0.03995101898908615, 0.03753185272216797, 0.05818329006433487, -0.02735130302608013, 
-0.24652625620365143, -0.11532924324274063, -0.06552073359489441, -0.017461445182561874, 0.09561721980571747, 0.05148864537477493, -0.021939551457762718, 0.003468214999884367, -0.04420725256204605, 0.021947700530290604, 0.012323874048888683, 0.06312623620033264, 0.05350523069500923, -0.01357752550393343, 0.0773596540093422, -0.06411506235599518, -0.039225984364748, 0.09493499249219894, 0.03618527203798294, 0.17725856602191925, -0.03679697960615158, 0.2466099113225937, 0.05600035935640335, 0.04213956370949745, 0.001704893889836967, 0.0797412320971489, -0.03743742033839226, 0.02901810221374035, -0.027684712782502174, -0.06537703424692154, 0.0020970755722373724, 0.05809065327048302, 0.007567510940134525, 0.012019932270050049, -0.07509467750787735, -0.06533827632665634, 0.08275730907917023, 0.2197173833847046, 0.06817492097616196, -0.18347607553005219, -0.058452729135751724, -0.012328047305345535, -0.07880713045597076, -0.07484710216522217, 0.015216869302093983, 0.1565864235162735, -0.10166621208190918, -0.019093014299869537, 0.025596467778086662, 0.1315503567457199, -0.1063932329416275, -0.02335871011018753, 0.021798938512802124, 0.01774921640753746, -0.02699757181107998, 0.11074595898389816, -0.2287798523902893, 0.18608154356479645, 0.02615375816822052, 0.04337293282151222, -0.03968720883131027, 0.012288644909858704, -0.037157993763685226, -0.00392363453283906, 0.1182328313589096, 0.02950611151754856, -0.03587547689676285, -0.09028870612382889, -0.10027635097503662, -0.02733430825173855, 0.07053893059492111, -0.06829334050416946, 0.09230728447437286, 0.05831481143832207, -0.02112935669720173, -0.015727050602436066, 0.061123356223106384, -0.02368570677936077, -0.17555664479732513, -0.006108365952968597, -0.0061014918610453606, -0.04678516462445259, -0.011823201552033424, -0.04472031071782112, -0.03978387266397476, 0.2336329221725464, -0.1113293319940567, -0.0625610277056694, -0.07703308016061783, 0.0035527264699339867, 0.12555237114429474, -0.07594622671604156, 0.05051784589886665, 0.003813980845734477, 0.04873957857489586, -0.055384207516908646, -0.03524269536137581, 0.08293180912733078, -0.06753100454807281, -0.05755581334233284, -0.0564357154071331, 0.16798217594623566, 0.04676252231001854, 0.05342239513993263, -0.019549278542399406, 0.04732924699783325, -0.00453996192663908, -0.11089345067739487, -0.02264956571161747, 0.029584385454654694, 0.13443173468112946, 0.06231540068984032, -0.0719907358288765, -0.07865621149539948, -0.06681105494499207, -0.06294351816177368, 0.16799816489219666, 0.17242377996444702, -0.0636654943227768, 0.03995075076818466, 0.16984520852565765, -0.10853950679302216, -0.16987420618534088, -0.043290670961141586, 0.07900982350111008, 0.06130543723702431, -0.049354664981365204, -0.17633016407489777, 0.012147964909672737, 0.11284459382295609, 0.00021969640511088073, 0.05178380012512207, -0.38154858350753784, -0.1412576586008072, 0.01648722030222416, 0.02791576087474823, 0.004357310943305492, -0.11785436421632767, -0.04670777544379234, -0.06449180096387863, -0.10245092213153839, 0.12828437983989716, -0.041053734719753265, 0.09438619762659073, -0.004056580364704132, 0.007935849949717522, 0.03525702655315399, -0.03882895037531853, 0.1270466297864914, 0.027518058195710182, 0.032609857618808746, -0.03589402139186859, 0.03327218443155289, 0.02079056017100811, -0.022335221990942955, 0.15070854127407074, -0.04869988188147545, 0.05648119002580643, -0.16502319276332855, -0.0492531843483448, -0.06883350014686584, 0.01917308010160923, -0.041843485087156296, -0.05879336595535278, 
-0.054562680423259735, 0.024717405438423157, 0.05174339562654495, -0.007309069391340017, -0.004497854504734278, -0.048006102442741394, 0.04539547115564346, 0.19004090130329132, 0.06087096035480499, 0.02083916775882244, -0.1066066324710846, 0.02238183282315731, 0.0008999866549856961, 0.05762375146150589, -0.14014074206352234, 0.007079468108713627, 0.14224395155906677, 0.041534192860126495, 0.12229475378990173, -0.009879258461296558, -0.12357062101364136, -0.005907949060201645, 0.05895377695560455, -0.09712222963571548, -0.10938339680433273, -0.02126549556851387, 0.04336467385292053, -0.05504367873072624, -0.020794106647372246, 0.10718223452568054, -0.07707521319389343, -0.03237755969166756, -0.010282734408974648, 0.019201774150133133, -0.07066124677658081, 0.20780082046985626, 0.029479509219527245, 0.059651363641023636, -0.058301787823438644, 0.09137473255395889, 0.12338706105947495, -0.12072841078042984, 0.025347299873828888, 0.20667041838169098, -0.06949928402900696, -0.05090828984975815, 0.022834347561001778, 0.11167341470718384, -0.056496430188417435, -0.04932640865445137, -0.02024085819721222, -0.02496722899377346, 0.027798721566796303, 0.03814591467380524, 0.04319215193390846, 0.03786598518490791, -0.0005038593080826104, -0.04922463744878769, -0.09003379195928574, 0.09443703293800354, 0.059480007737874985, -0.0001464586821384728, 0.0014024226693436503, 0.08848719298839569, 0.009147929958999157, 0.016509342938661575, -0.01333627849817276, -0.004313519224524498, -0.054337114095687866, 0.010223735123872757, -0.06481809169054031, 0.002639784011989832, -0.05779540538787842, -0.016950244084000587, -0.03799542412161827, 0.008510799147188663, -0.014052274636924267, 0.00006186994141899049, -0.04856802150607109, -0.05150539427995682, -0.050562549382448196, 0.03801940008997917, -0.10043644160032272, -0.029555367305874825, 0.012112714350223541, -0.03826059028506279, 0.06234429404139519, 0.03832852840423584, 0.011140843853354454, 0.028899159282445908, -0.02347630076110363, 0.03656908497214317, -0.0011516446247696877, 0.05281287431716919, 0.007868924178183079, -0.06589160114526749, 0.029577234759926796, 0.027555957436561584, 0.005316940136253834, -0.006382389925420284, 0.0009796526283025742, -0.12483297288417816, -0.05634961277246475, -0.06290499866008759, -0.034804392606019974, -0.0690399557352066, 0.1025707870721817, 0.06481306999921799, 0.07223983108997345, 0.0771072506904602, -0.06285282224416733, 0.0645165741443634, -0.16219456493854523, -0.009299659170210361, 0.008191383443772793, -0.026998737826943398, -0.021513422951102257, -0.008651320822536945, 0.04350884631276131, -0.05824684351682663, 0.1475614458322525, 0.01733490265905857, 0.0645628422498703, 0.02539982460439205, -0.07647295296192169, -0.005205988883972168, 0.022374365478754044, 0.10660229623317719, -0.009531035088002682, -0.027636444196105003, -0.0596732497215271, 0.08657808601856232, -0.00044077460188418627, 0.12862730026245117, 0.05123676732182503, 0.12564969062805176, 0.13135021924972534, 0.044819001108407974, 0.001612966414541006, -0.0902051031589508, -0.04562189057469368, 0.051008440554142, -0.0032254501711577177, 0.05912382900714874, -0.04353761672973633, 0.10801780968904495, 0.14675270020961761, -0.17044353485107422, 0.12552811205387115, 0.015615014359354973, -0.08067609369754791, -0.056081000715494156, -0.09321717172861099, -0.05203323811292648, -0.044168662279844284, -0.03942627087235451, -0.11015517264604568, 0.021638929843902588, 0.04129166528582573, 0.06617225706577301, -0.015631157904863358, 0.09903337806463242, 
-0.013200180605053902, -0.10125458240509033, 0.07231735438108444, 0.005990416277199984, 0.0821399912238121, -0.0212896429002285, 0.04537045955657959, 0.05522098392248154, -0.021730123087763786, 0.05239357799291611, 0.0611872635781765, -0.00909255351871252, 0.014127288945019245, 0.005656971596181393, -0.05971631407737732, -0.038659777492284775, 0.02106693759560585, 0.07971902191638947, 0.19848749041557312, 0.05943135544657707, -0.09949898719787598, -0.01709531992673874, 0.1829262226819992, -0.054922036826610565, -0.08535277843475342, -0.10296817123889923, 0.20989567041397095, 0.039335936307907104, 0.030835824087262154, -0.006177796050906181, -0.08771371841430664, -0.010376079939305782, 0.11329958587884903, 0.20096242427825928, -0.028790052980184555, -0.0280578825622797, 0.027991486713290215, -0.008526280522346497, 0.02860817313194275, 0.011212839744985104, 0.05642177164554596, 0.28774720430374146, -0.08836632966995239, 0.05421973392367363, -0.062022726982831955, 0.010251190513372421, 0.0014863288961350918, 0.14600378274917603, -0.007747899275273085, 0.0032003249507397413, -0.041081804782152176, 0.08131852746009827, 0.005661832634359598, -0.16956588625907898, 0.0023387367837131023, -0.11284579336643219, -0.10483421385288239, 0.015193339437246323, -0.0531037263572216, 0.0462762750685215, 0.07024544477462769, 0.009099488146603107, 0.025125041604042053, 0.031009620055556297, 0.01618976518511772, -0.11394181847572327, -0.13097964227199554, 0.015747394412755966, -0.019470099359750748, 0.10954634100198746, 0.00009697707719169557, 0.1249702200293541, 0.09793540835380554, 0.010557943023741245, -0.07633917033672333, 0.0803922563791275, 0.035865649580955505, 0.023687293753027916, 0.06795432418584824, 0.10087904334068298, -0.009228970855474472, 0.0437433086335659, 0.022216597571969032, -0.04311240091919899, 0.046600762754678726, -0.06747053563594818, -0.021608976647257805, -0.1363670527935028, 0.08216025680303574, -0.02935832366347313, 0.13841456174850464, 0.17047381401062012, -0.016069261357188225, 0.008869191631674767, -0.05373910441994667, 0.009507243521511555, -0.008595521561801434, 0.07768364250659943, -0.014939727261662483, -0.1981826275587082, 0.04791317135095596, -0.058850955218076706, 0.033521488308906555, -0.26054561138153076, -0.03682157024741173, 0.027148623019456863, -0.04089020937681198, -0.02311263233423233, 0.08216847479343414, 0.07230240851640701, 0.02880382351577282, -0.037887513637542725, -0.0855439305305481, -0.008342581801116467, 0.10333117842674255, -0.10170908272266388, -0.10765840858221054 ]
null
null
transformers
# legal_t5_small_multitask_fr_en model

Model for translating legal text from French to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_fr_en model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from French to English.

### How to use

Here is how to use this model to translate legal text from French to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_en"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_fr_en", do_lower_case=False, skip_special_tokens=True),
    device=0
)

fr_text = "Raül Romeva i Rueda (Verts/ALE)"

pipeline([fr_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_fr_en model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_fr_en | 39.123 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
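The preprocessing step above builds a unigram vocabulary from the pooled parallel corpus. Below is a minimal SentencePiece sketch of that step; the corpus file name, model prefix, and vocabulary size are assumptions, since the card reports only that 88M lines of text were used.

```python
import sentencepiece as spm

# Train a unigram subword model over the pooled parallel corpus.
# "parallel_corpus.txt", the model prefix, and vocab_size are illustrative
# assumptions; the card states only that 88M lines of text were used.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_small",
    vocab_size=32000,
    model_type="unigram",
)

# Load the resulting model and segment a sample sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="legal_t5_small.model")
print(sp.encode("Raül Romeva i Rueda (Verts/ALE)", out_type=str))
```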
{"language": "French English", "tags": ["translation French English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Ra\u00fcl Romeva i Rueda (Verts/ALE)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_fr_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation French English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation French English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_fr\_en model
=========================================

Model for translating legal text from French to English. It was first released in this repository. The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_fr\_en model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from French to English.

### How to use

Here is how to use this model to translate legal text from French to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_fr\_en model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from French to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from French to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from French to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06570122390985489, 0.14451280236244202, -0.004258451517671347, 0.08027870953083038, 0.04830777645111084, 0.001954177627339959, 0.031143881380558014, 0.09156142175197601, -0.08044391125440598, 0.05866110324859619, 0.05313219502568245, -0.0021946479100733995, 0.058979395776987076, 0.030887030065059662, 0.05068365857005119, -0.17541296780109406, -0.004707541782408953, -0.022705165669322014, -0.03466019034385681, 0.10690723359584808, 0.09363465756177902, -0.05586673319339752, 0.04584984853863716, -0.009031618945300579, -0.06046129763126373, 0.039011456072330475, -0.09454996883869171, -0.04157106578350067, 0.10833622515201569, 0.0752817690372467, 0.09319977462291718, 0.002850583288818598, 0.07298814505338669, -0.19570127129554749, -0.010325884446501732, 0.07289737462997437, -0.01096726767718792, 0.04627377167344093, 0.11167078465223312, 0.00301716779358685, 0.16116316616535187, -0.0496487095952034, 0.03376852348446846, 0.04765422269701958, -0.10349876433610916, -0.14007654786109924, -0.05409184470772743, 0.0012789121828973293, 0.08800597488880157, 0.1366260051727295, -0.036697205156087875, 0.046061921864748, -0.016047269105911255, 0.07586779445409775, 0.07277538627386093, -0.23569908738136292, -0.028750479221343994, 0.00009465217590332031, 0.05785846337676048, 0.08946012705564499, -0.048482611775398254, -0.008255261927843094, 0.06458153575658798, 0.057201895862817764, 0.06495920568704605, -0.05166599154472351, -0.025317523628473282, -0.020551560446619987, -0.13292309641838074, -0.057790059596300125, 0.1705957055091858, 0.021639471873641014, -0.04250146448612213, -0.09835176914930344, -0.06383450329303741, -0.06278833746910095, -0.003655374748632312, -0.04463621601462364, 0.02170340158045292, 0.0013731588842347264, 0.044623203575611115, -0.049932267516851425, -0.11408135294914246, -0.06381527334451675, -0.06324665993452072, 0.06610853970050812, 0.05463891848921776, 0.0164384376257658, 0.028372684493660927, 0.08687108755111694, -0.12013638764619827, -0.06892534345388412, 0.02212204970419407, 0.0028217437211424112, -0.09104432165622711, 0.012973818928003311, 0.007316551171243191, -0.18770729005336761, -0.010737668722867966, -0.023707102984189987, -0.09868855774402618, 0.03711283206939697, 0.045877255499362946, 0.041194889694452286, 0.05658763647079468, 0.12126728147268295, -0.085578054189682, -0.10575465857982635, -0.044817544519901276, 0.0031205383129417896, -0.035294752568006516, 0.034017328172922134, -0.05213617905974388, -0.02971133217215538, -0.016362711787223816, 0.034722864627838135, 0.004364637192338705, -0.00959520973265171, -0.02566165290772915, -0.012284996919333935, 0.1095859557390213, -0.0890306606888771, 0.02390434592962265, 0.01080317236483097, -0.10068284720182419, -0.03306551277637482, 0.0595080628991127, -0.01407937053591013, -0.11628275364637375, 0.04486575722694397, -0.02910497970879078, -0.016563626006245613, -0.11262793093919754, -0.18263600766658783, -0.009039653465151787, 0.012485791929066181, -0.06665782630443573, -0.08474154770374298, -0.10230489075183868, -0.0785113200545311, 0.03925775736570358, -0.055233895778656006, 0.007893748581409454, -0.1077728271484375, 0.004268313758075237, 0.03216526284813881, -0.0023551147896796465, 0.07307837903499603, -0.048164498060941696, 0.03867676481604576, 0.010435270145535469, 0.06416239589452744, 0.014962672255933285, 0.03372538089752197, -0.09086976945400238, 0.037297289818525314, -0.07163573801517487, 0.14823399484157562, -0.00910088699311018, -0.021995671093463898, -0.14721138775348663, -0.06854730099439621, 
-0.08955836296081543, 0.03232092037796974, 0.08879915624856949, 0.13433903455734253, -0.20943772792816162, -0.028532208874821663, 0.22511248290538788, -0.06356646865606308, -0.07164719700813293, 0.15058796107769012, -0.03603123873472214, 0.03988208994269371, 0.06207314133644104, 0.06543533504009247, 0.0487082377076149, -0.04567829146981239, -0.05632399395108223, 0.011606693267822266, 0.028839323669672012, 0.049446556717157364, 0.10095863789319992, -0.0699365884065628, 0.06616909801959991, -0.01189605426043272, 0.04332628473639488, 0.024486346170306206, -0.048041243106126785, -0.04022257775068283, 0.003573570866137743, -0.03645394369959831, 0.01152260322123766, 0.026964440941810608, 0.013025814667344093, -0.07195601612329483, -0.0950937271118164, -0.027545956894755363, 0.09131844341754913, -0.06692028045654297, 0.028170479461550713, 0.019573867321014404, -0.0660596489906311, -0.09568924456834793, 0.005856436211615801, -0.14958874881267548, -0.00038501014932990074, 0.03889494389295578, -0.04142286628484726, 0.09239815175533295, 0.04259047657251358, 0.05847950652241707, 0.10726530849933624, -0.047532618045806885, -0.039174191653728485, -0.020928818732500076, -0.024368224665522575, -0.09295936673879623, -0.1253679096698761, -0.013113846071064472, -0.01568302884697914, 0.02055796980857849, -0.14970575273036957, 0.01597445085644722, -0.051639534533023834, 0.09216316044330597, 0.007455677259713411, -0.024010324850678444, 0.0027996511198580265, 0.07649722695350647, -0.04245128855109215, -0.023029742762446404, 0.031104441732168198, -0.021395286545157433, -0.03549983724951744, 0.10857370495796204, -0.08350647985935211, -0.12138046324253082, 0.09297182410955429, -0.005483354441821575, -0.09955721348524094, 0.005582616198807955, -0.01988176442682743, -0.060085274279117584, -0.05340731143951416, -0.0626351460814476, 0.25474196672439575, 0.04799002781510353, 0.16892699897289276, -0.11448881030082703, -0.058659590780735016, 0.019159259274601936, -0.026733409613370895, -0.018363237380981445, 0.16685426235198975, 0.06222359463572502, -0.15281380712985992, 0.08805599063634872, 0.05578555911779404, -0.012255190871655941, 0.10304030030965805, 0.050785478204488754, -0.10713470727205276, 0.00037638377398252487, 0.0652533769607544, -0.00697473855689168, 0.03818533197045326, -0.09592247754335403, -0.024109233170747757, 0.02287878654897213, 0.06104987859725952, 0.06572577357292175, -0.11882317066192627, 0.08155064284801483, 0.07117586582899094, -0.03963478282094002, 0.040759507566690445, -0.04686715453863144, -0.03762860968708992, 0.10401824861764908, 0.005986349191516638, -0.0565621480345726, -0.04112529382109642, -0.03931687772274017, -0.10812421143054962, 0.18807150423526764, -0.09732037037611008, -0.22375787794589996, -0.12557214498519897, 0.006678028963506222, -0.042765699326992035, 0.013950970955193043, 0.042563747614622116, -0.054530948400497437, -0.0474972128868103, -0.09529676288366318, 0.06390820443630219, -0.12044727802276611, -0.05715057998895645, -0.09437301754951477, 0.054326850920915604, -0.00770585797727108, -0.15433745086193085, 0.031315043568611145, 0.015116764232516289, -0.030987070873379707, -0.007813047617673874, -0.03332533314824104, 0.11305703222751617, 0.12939326465129852, -0.04667381942272186, -0.039400819689035416, 0.0012734762858599424, 0.1500423103570938, -0.06791470944881439, 0.03263797238469124, 0.04132337495684624, 0.04909497871994972, 0.04855070635676384, 0.1273447722196579, 0.04060617461800575, -0.04095764085650444, 0.03790940344333649, 0.052561163902282715, -0.004192498978227377, 
-0.25981405377388, -0.10056918114423752, -0.06085628271102905, -0.026149554178118706, 0.08995797485113144, 0.0443977415561676, -0.05365588888525963, 0.002289046533405781, -0.04662599042057991, 0.0384330116212368, 0.0118127865716815, 0.05597329139709473, 0.0412377268075943, -0.016736887395381927, 0.07264368236064911, -0.06027809903025627, -0.06935212761163712, 0.09207407385110855, 0.026732303202152252, 0.1945922076702118, -0.04935267195105553, 0.2225881665945053, 0.05896008387207985, 0.060742054134607315, -0.005338454153388739, 0.07189776748418808, -0.044803135097026825, 0.0388297364115715, -0.031029799953103065, -0.056681323796510696, 0.009439995512366295, 0.06468476355075836, 0.012230650521814823, 0.013896879740059376, -0.0628608837723732, -0.0449361577630043, 0.07410366088151932, 0.1958729326725006, 0.08404361456632614, -0.19823545217514038, -0.03780876472592354, -0.005169461481273174, -0.0807606428861618, -0.0883655697107315, 0.02321266010403633, 0.1643301248550415, -0.08224369585514069, -0.0144002391025424, 0.024909641593694687, 0.13067759573459625, -0.11332086473703384, -0.020868124440312386, 0.026274973526597023, 0.028318120166659355, -0.018413128331303596, 0.12203796207904816, -0.23752599954605103, 0.1814037412405014, 0.015000453218817711, 0.05513147637248039, -0.033971186727285385, 0.03344246372580528, -0.05381795018911362, 0.007757746614515781, 0.11566516011953354, 0.024196552112698555, -0.029945073649287224, -0.09219752252101898, -0.09545034170150757, -0.02089325524866581, 0.07364156097173691, -0.04127160459756851, 0.08529680967330933, 0.06642111390829086, -0.0012550042010843754, -0.008852571249008179, 0.032960548996925354, -0.05541194975376129, -0.17140305042266846, 0.0034339171834290028, 0.00040064722998067737, -0.03295029327273369, -0.011603005230426788, -0.03972668573260307, -0.07379356026649475, 0.21779964864253998, -0.12624800205230713, -0.06991587579250336, -0.0620184950530529, -0.005494206678122282, 0.13541054725646973, -0.06655456870794296, 0.028609542176127434, 0.015792755410075188, 0.02846691943705082, -0.04289083555340767, -0.016614776104688644, 0.07641158252954483, -0.07013265788555145, -0.054971370846033096, -0.08093632757663727, 0.1276547610759735, 0.05649266764521599, 0.044975049793720245, -0.020194223150610924, 0.030695803463459015, -0.020173652097582817, -0.09866257756948471, -0.009060973301529884, 0.013525753282010555, 0.15504035353660583, 0.0429224967956543, -0.06697714328765869, -0.08315087109804153, -0.08162123709917068, -0.06544822454452515, 0.14852292835712433, 0.17446289956569672, -0.05316830798983574, 0.040131375193595886, 0.18978822231292725, -0.11836741119623184, -0.15574659407138824, -0.05250481143593788, 0.09432663768529892, 0.07462436705827713, -0.03855201229453087, -0.1836053729057312, 0.008003124967217445, 0.09565624594688416, 0.005692142061889172, 0.03595850244164467, -0.40708500146865845, -0.14162245392799377, 0.019634300842881203, 0.03460293263196945, 0.002759242197498679, -0.10512137413024902, -0.039752986282110214, -0.04818469285964966, -0.10795146971940994, 0.08401903510093689, -0.017533566802740097, 0.08485803008079529, 0.006707753520458937, 0.008666029199957848, 0.03980712965130806, -0.03458266332745552, 0.1287349909543991, 0.010527409613132477, 0.032649703323841095, -0.03885733336210251, 0.05481560900807381, 0.01921379566192627, -0.009293331764638424, 0.14451415836811066, -0.04299980401992798, 0.06036427617073059, -0.17533713579177856, -0.039741769433021545, -0.05109940096735954, 0.010345586575567722, -0.040654271841049194, 
-0.06259637326002121, -0.045406874269247055, 0.023727258667349815, 0.062022872269153595, -0.004995343275368214, -0.012042555026710033, -0.02843288518488407, 0.027615318074822426, 0.17434009909629822, 0.08888207376003265, 0.03519982472062111, -0.10496203601360321, 0.02525574527680874, 0.008795925416052341, 0.0613752081990242, -0.13612622022628784, 0.01522262953221798, 0.14887985587120056, 0.021384768187999725, 0.12260511517524719, -0.008284186944365501, -0.1286337673664093, 0.007248436100780964, 0.06418423354625702, -0.09859799593687057, -0.14447021484375, -0.02445581927895546, 0.030853644013404846, -0.05174760892987251, -0.016022220253944397, 0.08762309700250626, -0.08400976657867432, -0.030848277732729912, -0.015268305316567421, 0.025746505707502365, -0.06422796845436096, 0.2082887589931488, 0.002772123785689473, 0.05536266043782234, -0.051796022802591324, 0.0973355695605278, 0.15209844708442688, -0.1458779275417328, 0.023666655644774437, 0.18818271160125732, -0.06086709722876549, -0.04458710178732872, 0.043093953281641006, 0.12335788458585739, -0.003628623438999057, -0.06082221120595932, -0.018440986052155495, -0.025238512083888054, 0.01756562851369381, 0.002414607908576727, 0.026484135538339615, 0.03619111329317093, -0.0001842268102336675, -0.04726085439324379, -0.10501968115568161, 0.10076349973678589, 0.0813458263874054, 0.010945405811071396, -0.010489758104085922, 0.10204664617776871, 0.009284107014536858, -0.000018356615328229964, -0.00906442292034626, 0.011753242462873459, -0.044842783361673355, 0.016788290813565254, -0.06944357603788376, -0.0006339252577163279, -0.042204637080430984, -0.01118731964379549, -0.03104788064956665, -0.003036737674847245, -0.007050446234643459, 0.006210582330822945, -0.045360445976257324, -0.043079592287540436, -0.027952199801802635, 0.044975511729717255, -0.0961868166923523, -0.03694988787174225, 0.019501378759741783, -0.038170285522937775, 0.06280136853456497, 0.018531400710344315, -0.0045438953675329685, 0.007416530046612024, -0.01244769711047411, 0.04533510282635689, 0.002855325350537896, 0.055017124861478806, 0.007655899040400982, -0.08772728592157364, 0.02963586524128914, 0.027277441695332527, -0.012551617808640003, -0.01786322519183159, 0.014246557839214802, -0.12538845837116241, -0.0423319935798645, -0.04341747984290123, -0.050426606088876724, -0.0690230205655098, 0.09921467304229736, 0.060150641947984695, 0.0747254490852356, 0.10615446418523788, -0.061934713274240494, 0.06592923402786255, -0.14669004082679749, -0.006844646763056517, 0.029176533222198486, -0.028679806739091873, -0.008048508316278458, -0.02081531286239624, 0.04113056883215904, -0.07548000663518906, 0.1398811787366867, 0.01685301773250103, 0.07253414392471313, 0.013131998479366302, -0.10191147774457932, -0.029885325580835342, 0.022958414629101753, 0.08922283351421356, -0.026374351233243942, -0.02681814134120941, -0.08614246547222137, 0.07947114109992981, 0.007401220966130495, 0.12361568957567215, 0.030383339151740074, 0.12591420114040375, 0.14274528622627258, 0.04577464237809181, -0.009721552021801472, -0.09626346826553345, -0.06334693729877472, 0.08123577386140823, 0.003424180904403329, 0.05797214061021805, -0.043586332350969315, 0.1280813068151474, 0.13117267191410065, -0.14597727358341217, 0.11603023111820221, 0.013081920333206654, -0.08884469419717789, -0.06284816563129425, -0.07569412142038345, -0.04703080281615257, -0.044770289212465286, -0.03427572548389435, -0.11199893057346344, 0.024370627477765083, 0.0030666617676615715, 0.06090812757611275, -0.025187691673636436, 
0.09729352593421936, 0.004421699792146683, -0.10677073150873184, 0.07170644402503967, 0.0011927088489755988, 0.10686656087636948, -0.020542854443192482, 0.03369734808802605, 0.05665835365653038, -0.017039043828845024, 0.04879762977361679, 0.05777042731642723, -0.018506154417991638, 0.017728501930832863, 0.006331730168312788, -0.054405078291893005, -0.04415331035852432, 0.02407201938331127, 0.07262641936540604, 0.19074714183807373, 0.06639394164085388, -0.07092497497797012, -0.024381911382079124, 0.17090308666229248, -0.05808503180742264, -0.08190633356571198, -0.10944460332393646, 0.203975111246109, 0.03971385583281517, 0.030109336599707603, 0.021534010767936707, -0.09765524417161942, -0.011844937689602375, 0.1251503825187683, 0.18793459236621857, -0.02335365116596222, -0.038043130189180374, 0.034500058740377426, -0.009933333843946457, 0.021298067644238472, 0.020155733451247215, 0.06501602381467819, 0.26682236790657043, -0.09259803593158722, 0.05177988484501839, -0.06486768275499344, 0.03739170730113983, -0.005897169001400471, 0.15088291466236115, -0.003942795097827911, 0.0018281416269019246, -0.04781177639961243, 0.08527575433254242, 0.017940765246748924, -0.15377222001552582, -0.00305608450435102, -0.09736518561840057, -0.10249245911836624, 0.024381570518016815, -0.002192592481151223, 0.03596989065408707, 0.06594529747962952, 0.008644000627100468, 0.02122844010591507, 0.0653475746512413, 0.00867544300854206, -0.09200268983840942, -0.10179980844259262, 0.0002429432061035186, -0.06954235583543777, 0.11427842825651169, 0.011893901973962784, 0.15219995379447937, 0.09778621047735214, 0.012434015981853008, -0.08242962509393692, 0.08284788578748703, 0.0291774719953537, 0.04116186127066612, 0.06952881067991257, 0.09680480509996414, -0.00493662478402257, 0.06677857786417007, 0.025617558509111404, -0.03290274366736412, 0.058238692581653595, -0.0701066106557846, -0.022116802632808685, -0.12249314785003662, 0.09418116509914398, -0.02890467457473278, 0.13028952479362488, 0.17821335792541504, -0.0095127634704113, 0.00928610097616911, -0.05165432393550873, 0.003928479738533497, -0.017345156520605087, 0.08949292451143265, -0.0171608105301857, -0.19349084794521332, 0.042742881923913956, -0.03147970139980316, 0.049351129680871964, -0.24245859682559967, -0.024897422641515732, 0.03224452584981918, -0.04451044648885727, -0.030912000685930252, 0.07770873606204987, 0.083065465092659, 0.024702051654458046, -0.0344022698700428, -0.10382949560880661, -0.012653124518692493, 0.10335131734609604, -0.08515854179859161, -0.09943773597478867 ]
null
null
transformers
# legal_t5_small_multitask_fr_es model

Model for translating legal text from French to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_fr_es model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from French to Spanish.

### How to use

Here is how to use this model to translate legal text from French to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_es"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_fr_es", do_lower_case=False, skip_special_tokens=True),
    device=0
)

fr_text = "+ lettre autorités suédoises"

pipeline([fr_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_fr_es model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_fr_es | 43.807 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
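The BLEU score above can be reproduced in outline with sacrebleu once system outputs and gold references are in hand. The sketch below uses a single hypothetical sentence pair as a stand-in for the held-out test set; the Spanish strings are invented placeholders.

```python
import sacrebleu

# Hypothetical system output and gold reference; in practice these would be
# the model's translations of the held-out test set and its references.
hypotheses = ["+ carta de las autoridades suecas"]
references = [["+ carta de las autoridades suecas"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```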
{"language": "French Spanish", "tags": ["translation French Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "+ lettre autorit\u00e9s su\u00e9doises"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_fr_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation French Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation French Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_fr\_es model
=========================================

Model for translating legal text from French to Spanish. It was first released in this repository. The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl and dcep) covering 42 language pairs, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_fr\_es model; instead, the unsupervised task is added to all of the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from French to Spanish.

### How to use

Here is how to use this model to translate legal text from French to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_fr\_es model (combining the supervised task, which involved only the corresponding language pair, with the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses an encoder-decoder architecture. The optimizer is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from French to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from French to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from French to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07443132996559143, 0.1467096507549286, -0.004510735627263784, 0.0834612026810646, 0.05340665951371193, 0.0003035308327525854, 0.022279171273112297, 0.09251252561807632, -0.0825100839138031, 0.06486623734235764, 0.046723078936338425, 0.012170649133622646, 0.05761532112956047, 0.020770054310560226, 0.04030362516641617, -0.17550013959407806, -0.0033701835200190544, -0.02489968203008175, -0.0372462272644043, 0.1022675484418869, 0.08848299831151962, -0.05177434906363487, 0.04337373003363609, -0.015355554409325123, -0.06057719513773918, 0.04460876062512398, -0.08814775943756104, -0.05557139962911606, 0.1094631627202034, 0.07687627524137497, 0.09839130938053131, 0.0075409612618386745, 0.07016054540872574, -0.18045824766159058, -0.012385298497974873, 0.07006870210170746, -0.014253086410462856, 0.04880022257566452, 0.12038557231426239, -0.003923051059246063, 0.16382107138633728, -0.05545336380600929, 0.03323620185256004, 0.04777182266116142, -0.11789010465145111, -0.14371462166309357, -0.05810999125242233, -0.01434679701924324, 0.08729201555252075, 0.13200336694717407, -0.03369322046637535, 0.04341988265514374, -0.015199527144432068, 0.06512070447206497, 0.07185570895671844, -0.22922861576080322, -0.03273642808198929, -0.0024069028440862894, 0.05803429335355759, 0.09471943229436874, -0.03603186085820198, -0.0017474989872425795, 0.07186063379049301, 0.051705677062273026, 0.05966358259320259, -0.05837416648864746, -0.03509294241666794, -0.021626172587275505, -0.12950897216796875, -0.06512842327356339, 0.16184985637664795, 0.014031477272510529, -0.03742505982518196, -0.09671826660633087, -0.06888372451066971, -0.056495506316423416, 0.0008105812594294548, -0.044582244008779526, 0.024482611566781998, -0.0033495239913463593, 0.05055711790919304, -0.04198192059993744, -0.11224477738142014, -0.06346602737903595, -0.0634387955069542, 0.06716154515743256, 0.050925277173519135, 0.01008266769349575, 0.036661967635154724, 0.09046725928783417, -0.10809675604104996, -0.07496601343154907, 0.02875765785574913, 0.010169805027544498, -0.08883514255285263, 0.017153766006231308, 0.0058944858610630035, -0.19350838661193848, -0.005296395625919104, -0.036788929253816605, -0.1143329069018364, 0.03296773135662079, 0.0473172552883625, 0.03986217454075813, 0.05636961758136749, 0.11891467869281769, -0.08000742644071579, -0.10161180049180984, -0.048471756279468536, -0.005961596500128508, -0.032262250781059265, 0.0318458266556263, -0.06302453577518463, -0.03708371892571449, -0.02209431864321232, 0.04504389315843582, 0.013511364348232746, -0.014655428938567638, -0.02967263013124466, -0.02023482695221901, 0.10688389092683792, -0.08608139306306839, 0.02797110751271248, 0.0109785757958889, -0.10035470128059387, -0.030180349946022034, 0.048565298318862915, -0.020214011892676353, -0.11532749980688095, 0.03794817253947258, -0.027912504971027374, -0.023587040603160858, -0.12014823406934738, -0.18306170403957367, -0.0011844892287626863, 0.004027357790619135, -0.06403227895498276, -0.0884605422616005, -0.09402576088905334, -0.08918941020965576, 0.03780461475253105, -0.06950170546770096, 0.013358183205127716, -0.11335597187280655, 0.012900619767606258, 0.032225023955106735, -0.007462443318217993, 0.07307882606983185, -0.0492488257586956, 0.04419480264186859, 0.0206399317830801, 0.06715549528598785, 0.01427152007818222, 0.03200635313987732, -0.0892108753323555, 0.04234547168016434, -0.08143802732229233, 0.15787099301815033, -0.008281483314931393, -0.021317465230822563, -0.15172196924686432, -0.06519313156604767, -0.09059054404497147, 
0.03860031068325043, 0.08933135122060776, 0.14091746509075165, -0.20693039894104004, -0.03195749968290329, 0.23100391030311584, -0.05685235187411308, -0.06759051978588104, 0.14137603342533112, -0.032736875116825104, 0.04937316104769707, 0.06376700848340988, 0.07214044779539108, 0.04358987882733345, -0.04589862376451492, -0.04726782068610191, 0.008092169649899006, 0.03322257101535797, 0.04019941762089729, 0.10519734025001526, -0.07738279551267624, 0.06871766597032547, -0.010485261678695679, 0.029578015208244324, 0.021137284114956856, -0.04367297887802124, -0.03800559788942337, 0.01055286917835474, -0.027976183220744133, 0.007835368625819683, 0.027410153299570084, 0.01778155192732811, -0.06774218380451202, -0.08961787819862366, -0.024728087708353996, 0.08250515908002853, -0.06283075362443924, 0.024976765736937523, 0.01923017017543316, -0.05585815757513046, -0.10636363923549652, 0.005487008020281792, -0.15118683874607086, 0.001395956613123417, 0.037138376384973526, -0.03239048272371292, 0.08871303498744965, 0.038089267909526825, 0.05551610514521599, 0.1035277396440506, -0.044945258647203445, -0.03784625604748726, -0.02238699048757553, -0.02677968703210354, -0.08622100949287415, -0.12699510157108307, -0.012015202082693577, -0.014830952510237694, 0.01668940670788288, -0.14953061938285828, 0.01415651198476553, -0.04827180877327919, 0.08360724151134491, 0.0013372513931244612, -0.01765034906566143, -0.0019803233444690704, 0.08603507280349731, -0.04208802431821823, -0.03005422092974186, 0.037864431738853455, -0.017176279798150063, -0.03004770539700985, 0.10519129037857056, -0.08984700590372086, -0.12038674205541611, 0.08982518315315247, -0.005742109380662441, -0.10063344985246658, -0.004797665402293205, -0.01388553436845541, -0.054818522185087204, -0.056249577552080154, -0.06027870252728462, 0.25886717438697815, 0.04588574916124344, 0.17361827194690704, -0.1273350864648819, -0.05315231531858444, 0.022578846663236618, -0.029852649196982384, -0.023991480469703674, 0.1709570735692978, 0.06701020151376724, -0.15334433317184448, 0.09501392394304276, 0.045019324868917465, -0.008303582668304443, 0.10818284749984741, 0.05336589738726616, -0.10919051617383957, 0.00008465313294436783, 0.07289259880781174, 0.003950854297727346, 0.03877578675746918, -0.09585464000701904, -0.02186814695596695, 0.02131674624979496, 0.061461374163627625, 0.07268848270177841, -0.12060084939002991, 0.07832393795251846, 0.0715334415435791, -0.03787590190768242, 0.03739013895392418, -0.04490998759865761, -0.0384557768702507, 0.10522189736366272, 0.014792274683713913, -0.05927150323987007, -0.04346024617552757, -0.03708513081073761, -0.10911421477794647, 0.18782351911067963, -0.09605124592781067, -0.23438136279582977, -0.13395915925502777, 0.00872411672025919, -0.037903960794210434, 0.027705060318112373, 0.046217307448387146, -0.05822497978806496, -0.037953563034534454, -0.07793663442134857, 0.07256215065717697, -0.11119343340396881, -0.06487229466438293, -0.09534905850887299, 0.06115710362792015, -0.01534922793507576, -0.15178854763507843, 0.03140993416309357, 0.01809040643274784, -0.03372664377093315, -0.012442941777408123, -0.044039275497198105, 0.12495417892932892, 0.13685491681098938, -0.039931293576955795, -0.041369881480932236, 0.00031199565273709595, 0.14140883088111877, -0.07282548397779465, 0.025823799893260002, 0.05113176256418228, 0.05593051388859749, 0.042170923203229904, 0.12513545155525208, 0.04039306938648224, -0.04531741887331009, 0.02735443413257599, 0.048278532922267914, -0.009361544623970985, -0.2637002468109131, 
-0.10570329427719116, -0.06129654496908188, -0.03579365089535713, 0.09503456950187683, 0.040201082825660706, -0.04881562665104866, 0.01433368120342493, -0.0454292930662632, 0.03825293481349945, 0.006487569306045771, 0.057823605835437775, 0.04540710523724556, -0.017577877268195152, 0.06479953974485397, -0.061217356473207474, -0.07765593379735947, 0.09694753587245941, 0.03861411660909653, 0.1927993893623352, -0.048126284033060074, 0.22921858727931976, 0.0599382184445858, 0.06063397228717804, -0.012549036182463169, 0.06944388151168823, -0.04119919240474701, 0.038940779864788055, -0.037167731672525406, -0.05905677005648613, 0.0030706843826919794, 0.055656351149082184, 0.00474866759032011, 0.011453788727521896, -0.07537835091352463, -0.055810004472732544, 0.07267875969409943, 0.19203068315982819, 0.07482488453388214, -0.20411516726016998, -0.03653230890631676, -0.005544179119169712, -0.06940481066703796, -0.09085539728403091, 0.016352245584130287, 0.16806550323963165, -0.08604543656110764, -0.011216643266379833, 0.021315330639481544, 0.1340823620557785, -0.11540201306343079, -0.018816612660884857, 0.02108362317085266, 0.02925499528646469, -0.01362407486885786, 0.12733514606952667, -0.2315271943807602, 0.18982547521591187, 0.017835289239883423, 0.05437920242547989, -0.03589475154876709, 0.03360806033015251, -0.06482423841953278, 0.003002647077664733, 0.11589039117097855, 0.02275923080742359, -0.028522545471787453, -0.09256851673126221, -0.09244126081466675, -0.02260659821331501, 0.06637226045131683, -0.04706648364663124, 0.08766843378543854, 0.06928133964538574, -0.0008250557584688067, -0.012230657041072845, 0.027690716087818146, -0.040713343769311905, -0.1798643171787262, 0.001779362908564508, -0.00854590255767107, -0.03020612522959709, -0.010811632499098778, -0.036958470940589905, -0.07143840938806534, 0.21404367685317993, -0.12414105981588364, -0.06930296868085861, -0.06451266258955002, -0.004087566398084164, 0.13707798719406128, -0.06519221514463425, 0.02908029966056347, 0.0183111485093832, 0.02418101392686367, -0.04096853733062744, -0.013691863045096397, 0.08849574625492096, -0.07554580271244049, -0.051193155348300934, -0.0845831036567688, 0.12633539736270905, 0.05323117598891258, 0.04173605889081955, -0.01246482040733099, 0.02751915343105793, -0.012895374558866024, -0.0979480966925621, -0.016048762947320938, 0.009066158905625343, 0.15836834907531738, 0.04002869501709938, -0.07329349219799042, -0.0876222625374794, -0.0790633037686348, -0.0680738016963005, 0.14400246739387512, 0.1667618751525879, -0.05242230370640755, 0.04349634796380997, 0.18916068971157074, -0.12408573180437088, -0.1476554125547409, -0.05053822696208954, 0.10377800464630127, 0.07494011521339417, -0.03799021244049072, -0.18997924029827118, -0.011152920313179493, 0.10615138709545135, 0.0068535832688212395, 0.031023839488625526, -0.4293607473373413, -0.13367959856987, 0.007095343433320522, 0.034559011459350586, 0.006761387921869755, -0.1040136069059372, -0.051499612629413605, -0.055071428418159485, -0.09871750324964523, 0.08552498370409012, -0.017297102138400078, 0.08447140455245972, 0.01042333897203207, 0.0035649535711854696, 0.045707568526268005, -0.031066028401255608, 0.14140091836452484, -0.0007202196284197271, 0.02760099060833454, -0.0368070974946022, 0.053067974746227264, 0.021514615043997765, -0.007080201990902424, 0.13321755826473236, -0.03199862316250801, 0.05419858172535896, -0.17707954347133636, -0.043459657579660416, -0.050123799592256546, 0.010221311822533607, -0.043202102184295654, -0.056501735001802444, 
-0.03803519532084465, 0.016975758597254753, 0.06581708788871765, -0.007055669091641903, -0.010903462767601013, -0.030619455501437187, 0.03380304574966431, 0.17705579102039337, 0.08966539800167084, 0.03924509510397911, -0.10766132175922394, 0.021924497559666634, 0.015548886731266975, 0.0627136379480362, -0.12643875181674957, 0.011960206553339958, 0.14825791120529175, 0.017790943384170532, 0.11186599731445312, -0.009642173536121845, -0.1303708702325821, 0.012011847458779812, 0.07507826387882233, -0.0841958150267601, -0.13954667747020721, -0.026179589331150055, 0.03456728532910347, -0.043028540909290314, -0.0181644968688488, 0.08875110000371933, -0.07274003326892853, -0.03974900022149086, -0.016497990116477013, 0.018924100324511528, -0.060602817684412, 0.21495135128498077, 0.001005358761176467, 0.0535530261695385, -0.05363164469599724, 0.09485549479722977, 0.15487097203731537, -0.15355950593948364, 0.022595753893256187, 0.18735258281230927, -0.0578528456389904, -0.041474856436252594, 0.05173584446310997, 0.1270667016506195, -0.03082839585840702, -0.06796231865882874, -0.03216971829533577, -0.025163495913147926, 0.015509211458265781, 0.0077114105224609375, 0.0245113056153059, 0.03341210260987282, 0.001847409293986857, -0.04650608077645302, -0.09607905149459839, 0.08667797595262527, 0.08070071786642075, 0.010344808921217918, -0.013818644918501377, 0.10375052690505981, 0.010473995469510555, -0.009239131584763527, -0.009958651848137379, 0.013913041912019253, -0.056678008288145065, 0.0207606703042984, -0.06713498383760452, 0.007934472523629665, -0.04147184640169144, -0.01364715676754713, -0.03378138691186905, 0.003804827341809869, -0.0047310227528214455, 0.00161086639855057, -0.045114241540431976, -0.037694383412599564, -0.03372763469815254, 0.04773406311869621, -0.08904548734426498, -0.030343784019351006, 0.010973085649311543, -0.03728129342198372, 0.05478617548942566, 0.010608982294797897, -0.003967920783907175, 0.010678920894861221, -0.029795048758387566, 0.05309115722775459, 0.010337325744330883, 0.05714016407728195, 0.015071072615683079, -0.085059754550457, 0.037828098982572556, 0.033755019307136536, -0.013517462648451328, -0.018034785985946655, 0.01321970671415329, -0.1278085708618164, -0.03836984187364578, -0.03675505891442299, -0.060106076300144196, -0.06400402635335922, 0.1122397929430008, 0.06419577449560165, 0.06660719215869904, 0.10015817731618881, -0.06774632632732391, 0.06467048078775406, -0.14311973750591278, -0.011520463041961193, 0.025971950963139534, -0.023106655105948448, -0.0077591389417648315, -0.01574150286614895, 0.04467521235346794, -0.06555178016424179, 0.1361113339662552, 0.03063460811972618, 0.09248949587345123, 0.006546274293214083, -0.09729611873626709, -0.021736104041337967, 0.021039102226495743, 0.09258101880550385, -0.021407563239336014, -0.018044868484139442, -0.09078888595104218, 0.09403137117624283, 0.008219856768846512, 0.1252053678035736, 0.018774978816509247, 0.128051295876503, 0.1480019986629486, 0.046930402517318726, -0.0017652964452281594, -0.09599946439266205, -0.05909683555364609, 0.08439557999372482, 0.009376444853842258, 0.05297506973147392, -0.04111599177122116, 0.10964859277009964, 0.12789447605609894, -0.14777497947216034, 0.12071255594491959, 0.01563875563442707, -0.08338932693004608, -0.06291012465953827, -0.0828438550233841, -0.0431692972779274, -0.05155080929398537, -0.03856601566076279, -0.11004330962896347, 0.026261094957590103, -0.0041897795163095, 0.05333609879016876, -0.029831310734152794, 0.09579440206289291, 0.005465689115226269, 
-0.11936399340629578, 0.07010283321142197, 0.009535505436360836, 0.12560619413852692, -0.02451612427830696, 0.03905564546585083, 0.05419081822037697, -0.0020129503682255745, 0.048408735543489456, 0.06284604221582413, -0.011207377538084984, 0.014665800146758556, 0.00787544995546341, -0.048807933926582336, -0.0415412113070488, 0.025237442925572395, 0.07650825381278992, 0.19927969574928284, 0.06125283241271973, -0.07220885902643204, -0.023743942379951477, 0.18459175527095795, -0.056779634207487106, -0.07591644674539566, -0.10875146090984344, 0.2059887796640396, 0.03803712874650955, 0.03825293108820915, 0.017847348004579544, -0.10162264853715897, -0.012988190166652203, 0.1246759444475174, 0.18355901539325714, -0.01954432763159275, -0.041640326380729675, 0.03209124505519867, -0.006575432606041431, 0.02882300317287445, 0.027565469965338707, 0.05652131885290146, 0.28601714968681335, -0.09341472387313843, 0.04733679071068764, -0.06264198571443558, 0.049220167100429535, -0.011964521370828152, 0.15748921036720276, -0.009337130934000015, -0.0005781181971542537, -0.04088405519723892, 0.09585985541343689, 0.020138222724199295, -0.16300341486930847, 0.006543104071170092, -0.10154519230127335, -0.10762002319097519, 0.016175953671336174, 0.005421285051852465, 0.036498796194791794, 0.07387744635343552, 0.012535014189779758, 0.017609452828764915, 0.05976555123925209, 0.014557579532265663, -0.091781385242939, -0.1090739294886589, -0.001702787121757865, -0.06768843531608582, 0.11774273216724396, 0.0065154279582202435, 0.14642077684402466, 0.09958852082490921, 0.012811834923923016, -0.08357865363359451, 0.08266110718250275, 0.02799220196902752, 0.039012517780065536, 0.0869179368019104, 0.07465341687202454, -0.006073504686355591, 0.06142786145210266, 0.024367960169911385, -0.042384885251522064, 0.06103703752160072, -0.05813989415764809, -0.012015126645565033, -0.1240253821015358, 0.0967404916882515, -0.032842181622982025, 0.12202493101358414, 0.1791018545627594, -0.010230635292828083, 0.013945738784968853, -0.05243653059005737, 0.015356953255832195, -0.012151617556810379, 0.10263212770223618, -0.017561908811330795, -0.20327173173427582, 0.036783501505851746, -0.029922310262918472, 0.045956555753946304, -0.24040281772613525, -0.01525664608925581, 0.034698840230703354, -0.04752248525619507, -0.03222587704658508, 0.07025639712810516, 0.07244857400655746, 0.029668860137462616, -0.03159618377685547, -0.1069231703877449, -0.015510750003159046, 0.10355369001626968, -0.07963620126247406, -0.0949442982673645 ]
null
null
transformers
# legal_t5_small_multitask_fr_it model

Model for translating legal text from French to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from JRC-Acquis, Europarl and DCEP, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved for the legal_t5_small_multitask_fr_it model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from French to Italian.

### How to use

Here is how to use this model to translate legal text from French to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Note: AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for T5-style models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_it"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_fr_it",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 to run on CPU
)

fr_text = "Situation humanitaire au Soudan"

pipeline([fr_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_fr_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task had access to the data of all language pairs.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining stage was used; see the model description above.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:----------:|
| legal_t5_small_multitask_fr_it | 41.140 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
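Since the Training procedure section of the card above names AdaFactor with an inverse square root learning rate schedule, a minimal PyTorch sketch of that optimizer setup follows. The warm-up length and peak learning rate are assumptions for illustration; the card does not specify them.

```python
# Sketch of the optimizer described above: AdaFactor plus an external
# inverse square root learning-rate schedule. Hyperparameters are assumed.
import torch
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_it")

optimizer = Adafactor(
    model.parameters(),
    lr=1e-3,              # assumed peak learning rate
    relative_step=False,  # disable AdaFactor's internal schedule
    scale_parameter=False,
    warmup_init=False,
)

WARMUP_STEPS = 10_000     # assumed

def inverse_sqrt(step: int) -> float:
    # Linear warm-up, then decay proportional to 1/sqrt(step).
    step = max(step, 1)
    if step < WARMUP_STEPS:
        return step / WARMUP_STEPS
    return (WARMUP_STEPS / step) ** 0.5

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=inverse_sqrt)
# In a training loop: loss.backward(); optimizer.step(); scheduler.step()
```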
{"language": "French Italian", "tags": ["translation French Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Situation humanitaire au Soudan"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_fr_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation French Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation French Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_fr\_it model
=========================================

Model for translating legal text from French to Italian. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from JRC-Acquis, Europarl and DCEP, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_fr\_it model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from French to Italian.

### How to use

Here is how to use this model to translate legal text from French to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_fr\_it model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task had access to the data of all language pairs.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining stage was used; see the model description above.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from French to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from French to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from French to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07627777755260468, 0.15489132702350616, -0.003999947104603052, 0.08611715584993362, 0.053889255970716476, 0.006722912658005953, 0.03984202444553375, 0.1027238667011261, -0.06751056015491486, 0.0712827518582344, 0.04224246367812157, -0.004268140997737646, 0.06589951366186142, 0.03700496256351471, 0.03734125196933746, -0.17716212570667267, 0.0025093446020036936, -0.024363039061427116, -0.03263435140252113, 0.11018332093954086, 0.0964415967464447, -0.05874677374958992, 0.04559451341629028, -0.01937456615269184, -0.07628151774406433, 0.04613137245178223, -0.09359744191169739, -0.03739321976900101, 0.10881218314170837, 0.07990197837352753, 0.09532833099365234, 0.002743571763858199, 0.08098676800727844, -0.18089242279529572, -0.010942148976027966, 0.08028251677751541, -0.012380103580653667, 0.03922554478049278, 0.12311606854200363, 0.0031047803349792957, 0.16286417841911316, -0.03932369872927666, 0.022328956052660942, 0.04435846209526062, -0.09861794859170914, -0.12644627690315247, -0.05475607141852379, -0.002121053170412779, 0.07790353149175644, 0.1395098865032196, -0.035672660917043686, 0.05615267902612686, -0.0265335813164711, 0.0713481456041336, 0.07275710254907608, -0.22129729390144348, -0.036494091153144836, -0.01198524609208107, 0.048708174377679825, 0.09793165326118469, -0.04054054990410805, -0.00006829432095400989, 0.06187419220805168, 0.05041055381298065, 0.04617060348391533, -0.042498040944337845, -0.015182077884674072, -0.029248785227537155, -0.13298770785331726, -0.05637172609567642, 0.17577746510505676, 0.015390535816550255, -0.04489391669631004, -0.09507780522108078, -0.06275611370801926, -0.05526777729392052, -0.004176550544798374, -0.04818955063819885, 0.015652287751436234, 0.005464672576636076, 0.059246718883514404, -0.03347032889723778, -0.1147095113992691, -0.0704544261097908, -0.052218470722436905, 0.09046012908220291, 0.06100327521562576, 0.011358979158103466, 0.037204399704933167, 0.0888669416308403, -0.12332020699977875, -0.06640592962503433, 0.025830106809735298, 0.010774369351565838, -0.08763415366411209, 0.00975272711366415, 0.004292929545044899, -0.19235725700855255, -0.004953081253916025, -0.021955151110887527, -0.08148980140686035, 0.027581362053751945, 0.04493989422917366, 0.046276602894067764, 0.050250887870788574, 0.1148761734366417, -0.09531751275062561, -0.11210578680038452, -0.04677058011293411, 0.0008333558216691017, -0.03260370343923569, 0.03129345551133156, -0.06280910223722458, -0.0426875539124012, -0.02637898549437523, 0.04140850156545639, 0.015598994679749012, -0.011806763708591461, -0.0319257415831089, -0.020945129916071892, 0.09755738079547882, -0.0971837118268013, 0.03017606772482395, 0.01242953073233366, -0.10953143239021301, -0.017878318205475807, 0.04061480611562729, -0.02078404277563095, -0.12409108877182007, 0.05279451236128807, -0.03102206438779831, -0.019459988921880722, -0.12404251098632812, -0.1816149801015854, -0.0017481629038229585, 0.0010042870417237282, -0.06875228136777878, -0.08581702411174774, -0.09228304773569107, -0.07659491151571274, 0.03935210779309273, -0.0684080421924591, 0.02081608958542347, -0.1038440614938736, 0.011919925920665264, 0.02546326257288456, -0.00019951144349761307, 0.06675980240106583, -0.04924068972468376, 0.041468217968940735, 0.00613939156755805, 0.06386782974004745, 0.005692630540579557, 0.03328432887792587, -0.09260793030261993, 0.03565756604075432, -0.07045596092939377, 0.14159779250621796, -0.008895370177924633, -0.026708943769335747, -0.14529003202915192, -0.0670381486415863, -0.09433487057685852, 
0.02758581005036831, 0.10105575621128082, 0.15005183219909668, -0.21678811311721802, -0.027374986559152603, 0.2263002097606659, -0.07437406480312347, -0.07211706042289734, 0.16392210125923157, -0.02976580150425434, 0.049645472317934036, 0.06246523931622505, 0.07208337634801865, 0.047415222972631454, -0.052866823971271515, -0.061076175421476364, 0.006129153538495302, 0.02507413737475872, 0.042160067707300186, 0.10385642200708389, -0.059814050793647766, 0.08987022191286087, -0.011829919181764126, 0.024918444454669952, 0.018163902685046196, -0.05305879935622215, -0.042143434286117554, -0.0008290515979751945, -0.04434832185506821, 0.007738546468317509, 0.0315965935587883, 0.015667110681533813, -0.07099848985671997, -0.10237030684947968, -0.05832969769835472, 0.09851450473070145, -0.07597760111093521, 0.028848115354776382, 0.0135914022102952, -0.035372473299503326, -0.08386139571666718, 0.006321585737168789, -0.14193223416805267, -0.0044243913143873215, 0.05063161998987198, -0.03914405032992363, 0.08450443297624588, 0.03456463664770126, 0.06175604462623596, 0.10920142382383347, -0.050154343247413635, -0.03745909035205841, -0.023054905235767365, -0.028901826590299606, -0.08778177946805954, -0.1336074322462082, -0.022019606083631516, -0.009779398329555988, 0.029476171359419823, -0.15092909336090088, 0.020618436858057976, -0.03654862567782402, 0.1017458364367485, -0.0030129754450172186, -0.020848462358117104, -0.020199062302708626, 0.07086162269115448, -0.0491265207529068, -0.028702562674880028, 0.030187038704752922, -0.020595930516719818, -0.02793179266154766, 0.09772789478302002, -0.07928672432899475, -0.10585936158895493, 0.08865730464458466, -0.010044580325484276, -0.09915731847286224, -0.0067555042915046215, -0.018680531531572342, -0.06175502762198448, -0.05102243274450302, -0.0632658377289772, 0.23428381979465485, 0.046811435371637344, 0.15740391612052917, -0.11832587420940399, -0.04240395873785019, 0.02985457517206669, -0.023492110893130302, -0.029931673780083656, 0.1586298793554306, 0.0781017318367958, -0.1547066569328308, 0.09555670619010925, 0.05722704529762268, -0.021143466234207153, 0.10830429196357727, 0.05775924026966095, -0.1182272806763649, 0.005442528985440731, 0.08102327585220337, 0.0008470582542940974, 0.03440467640757561, -0.10526769608259201, -0.015379171818494797, 0.018817389383912086, 0.05814382806420326, 0.07227443903684616, -0.11932667344808578, 0.07591050118207932, 0.0628170445561409, -0.03544546663761139, 0.03919541463255882, -0.05479320511221886, -0.05141245946288109, 0.11462833732366562, 0.014360080473124981, -0.07580594718456268, -0.04008679836988449, -0.039218101650476456, -0.10051380842924118, 0.184157133102417, -0.08936792612075806, -0.22515779733657837, -0.1237746849656105, 0.009647855535149574, -0.04719856008887291, 0.02242252230644226, 0.03326379507780075, -0.04986736550927162, -0.042967431247234344, -0.08493220061063766, 0.054269175976514816, -0.11867622286081314, -0.0553307943046093, -0.10106523334980011, 0.06337043642997742, -0.021105529740452766, -0.14371778070926666, 0.029903676360845566, 0.012603294104337692, -0.04144751653075218, -0.016305875033140182, -0.05251157656311989, 0.13154670596122742, 0.13755716383457184, -0.05079546570777893, -0.0298298392444849, 0.006230073980987072, 0.12914855778217316, -0.08262505382299423, 0.03613117337226868, 0.0534018836915493, 0.04695490747690201, 0.04495467618107796, 0.12851981818675995, 0.03984531760215759, -0.043692078441381454, 0.029338836669921875, 0.0529412105679512, -0.004265196155756712, -0.25280794501304626, 
-0.10761008411645889, -0.06520222127437592, -0.024262001737952232, 0.08539550006389618, 0.04114881530404091, -0.0535605251789093, 0.00967493187636137, -0.05265476182103157, 0.016227049753069878, 0.020125964656472206, 0.05451914668083191, 0.02691015973687172, -0.017006203532218933, 0.07360968738794327, -0.059362199157476425, -0.07496509701013565, 0.10113169997930527, 0.04352366924285889, 0.205745130777359, -0.05187423899769783, 0.2319352924823761, 0.05184139311313629, 0.07139456272125244, -0.009391642175614834, 0.07118842005729675, -0.0312562994658947, 0.02937673032283783, -0.027471857145428658, -0.06021340936422348, 0.01170533150434494, 0.06543464213609695, 0.008689780719578266, -0.0032462691888213158, -0.06542952358722687, -0.04093583673238754, 0.08082816004753113, 0.21116220951080322, 0.07396601140499115, -0.2081693559885025, -0.03346514329314232, -0.014137327671051025, -0.06308509409427643, -0.08903644233942032, 0.014372657984495163, 0.17217040061950684, -0.07808291167020798, -0.02200561948120594, 0.025428960099816322, 0.13084225356578827, -0.1217837855219841, -0.023810172453522682, 0.03161860257387161, 0.03614676743745804, -0.021181626245379448, 0.13196972012519836, -0.22837887704372406, 0.18719987571239471, 0.016756650060415268, 0.06757161766290665, -0.05023958534002304, 0.03290802240371704, -0.05528609827160835, 0.02728375978767872, 0.12632547318935394, 0.021841570734977722, -0.02623768337070942, -0.1015460342168808, -0.10158532857894897, -0.02701527811586857, 0.09431366622447968, -0.0380181148648262, 0.08054661005735397, 0.05929984152317047, -0.0017307454254478216, -0.006986436899751425, 0.04356655478477478, -0.04007360339164734, -0.1743050515651703, 0.017386505380272865, 0.011486523784697056, -0.04694466292858124, -0.008052360266447067, -0.0479070208966732, -0.06123390793800354, 0.2306421995162964, -0.11660973727703094, -0.05492793396115303, -0.06667228788137436, 0.0038977935910224915, 0.12840323150157928, -0.0662955790758133, 0.02749643661081791, 0.013176370412111282, 0.035831473767757416, -0.04609698802232742, -0.0094155790284276, 0.08752906322479248, -0.07521607726812363, -0.04942869767546654, -0.0855785608291626, 0.11743094772100449, 0.060045626014471054, 0.037869710475206375, -0.009348060935735703, 0.024636855348944664, -0.013486169278621674, -0.09462150931358337, -0.00833126436918974, 0.0023597085382789373, 0.15299512445926666, 0.05220690369606018, -0.08196093887090683, -0.08675624430179596, -0.08014644682407379, -0.06419054418802261, 0.14306865632534027, 0.16974705457687378, -0.0570833683013916, 0.022261565551161766, 0.18388473987579346, -0.12035535275936127, -0.1663641631603241, -0.030067728832364082, 0.09647921472787857, 0.07247133553028107, -0.04671807587146759, -0.18373443186283112, -0.002674594521522522, 0.11279705911874771, 0.00088907266035676, 0.044762227684259415, -0.4186308681964874, -0.12906453013420105, 0.0007928496925160289, 0.03772643208503723, 0.009198964573442936, -0.09723421186208725, -0.030833791941404343, -0.049196258187294006, -0.09779032319784164, 0.06548955291509628, -0.009259477257728577, 0.08419457077980042, 0.012638797983527184, -0.015308351255953312, 0.04469786584377289, -0.03827225789427757, 0.12926462292671204, 0.013776612468063831, 0.0289468877017498, -0.039335962384939194, 0.054133955389261246, 0.023233192041516304, 0.0010581777896732092, 0.13868539035320282, -0.036870479583740234, 0.04568318650126457, -0.15945079922676086, -0.04493378847837448, -0.04908064007759094, 0.020214281976222992, -0.041662830859422684, -0.06222159415483475, 
-0.037511568516492844, 0.03180231526494026, 0.0630069226026535, 0.004874527454376221, -0.02528798021376133, -0.03665926307439804, 0.024230537936091423, 0.18096758425235748, 0.08217824250459671, 0.035277705639600754, -0.09942375868558884, 0.02262311428785324, 0.014824854210019112, 0.054534777998924255, -0.1064336895942688, 0.01969050243496895, 0.15123723447322845, 0.014080693945288658, 0.11643610894680023, -0.005249709356576204, -0.13022758066654205, 0.0015977955190464854, 0.07491782307624817, -0.09461420774459839, -0.14390185475349426, -0.028671802952885628, 0.031622134149074554, -0.04839197173714638, 0.0015201158821582794, 0.08590225130319595, -0.08552776277065277, -0.02865184284746647, -0.021931039169430733, 0.030678730458021164, -0.06717479228973389, 0.21759679913520813, -0.0017305942019447684, 0.04632199928164482, -0.052026014775037766, 0.10916466265916824, 0.14605344831943512, -0.15151837468147278, 0.019292809069156647, 0.189200296998024, -0.059469424188137054, -0.043856922537088394, 0.05016637220978737, 0.11376331746578217, -0.013407564722001553, -0.0723581612110138, -0.0400812141597271, -0.021211158484220505, 0.01448745932430029, -0.01179836131632328, 0.03126837685704231, 0.03406011313199997, -0.007328841835260391, -0.05123666301369667, -0.11793825775384903, 0.09708647429943085, 0.09495940059423447, 0.009760809130966663, -0.009265677072107792, 0.10534438490867615, 0.012815349735319614, -0.020787641406059265, -0.009059789590537548, 0.007799904327839613, -0.04321291670203209, 0.019955746829509735, -0.05492519587278366, -0.0094729820266366, -0.03526187315583229, -0.012113424018025398, -0.03871519863605499, -0.0032541886903345585, -0.014336079359054565, 0.0071313148364424706, -0.05497607961297035, -0.04173950105905533, -0.03176657110452652, 0.037796955555677414, -0.09547101706266403, -0.028000544756650925, 0.014763989485800266, -0.03978358209133148, 0.06551635265350342, 0.02174108661711216, -0.0028158833738416433, 0.0017956282244995236, -0.025247827172279358, 0.05129879713058472, -0.008623365312814713, 0.05842496082186699, 0.006612272467464209, -0.07933876663446426, 0.041954703629016876, 0.04008497670292854, -0.01921715773642063, -0.018371639773249626, 0.01384186465293169, -0.11624831706285477, -0.026877185329794884, -0.04089178144931793, -0.05909266322851181, -0.06471510976552963, 0.12016402930021286, 0.06336963921785355, 0.07714345306158066, 0.08726498484611511, -0.05488011986017227, 0.06768857687711716, -0.13603664934635162, -0.005251999944448471, 0.03577292710542679, -0.038413338363170624, -0.004966419655829668, -0.017356300726532936, 0.04472842067480087, -0.06869825720787048, 0.12811827659606934, 0.029547756537795067, 0.0793013870716095, 0.010424387641251087, -0.09628855437040329, -0.022479789331555367, 0.024203026667237282, 0.08915092051029205, -0.03034074977040291, -0.020364273339509964, -0.08365954458713531, 0.08203794062137604, 0.000459252973087132, 0.13809755444526672, 0.023573314771056175, 0.13583406805992126, 0.14031890034675598, 0.048493072390556335, 0.004179692827165127, -0.10222427546977997, -0.07231568545103073, 0.08650294691324234, -0.006534540094435215, 0.06014005094766617, -0.05060750991106033, 0.1278570145368576, 0.12731124460697174, -0.14956063032150269, 0.10675356537103653, 0.009299688041210175, -0.0954250693321228, -0.06068648397922516, -0.09532283246517181, -0.04322235658764839, -0.04889397323131561, -0.03244230896234512, -0.11416088044643402, 0.035425372421741486, -0.003019190626218915, 0.05795306712388992, -0.03955923393368721, 0.10488951206207275, 
-0.0018594303401187062, -0.10544117540121078, 0.08209189772605896, 0.0028559379279613495, 0.11694146692752838, -0.023229144513607025, 0.01334695890545845, 0.04627014696598053, -0.020049527287483215, 0.05260113626718521, 0.05371960625052452, -0.007228969130665064, 0.009931550361216068, 0.008403795771300793, -0.04813709482550621, -0.04395507276058197, 0.013929079286754131, 0.07291697710752487, 0.19326567649841309, 0.05149548873305321, -0.07081589102745056, -0.024816177785396576, 0.17049215734004974, -0.05394352599978447, -0.06742755323648453, -0.11020833998918533, 0.19720186293125153, 0.03937895968556404, 0.03600947558879852, 0.015286761336028576, -0.09728176891803741, -0.009636945091187954, 0.12428134679794312, 0.18191799521446228, -0.02984870783984661, -0.0418240912258625, 0.03939412534236908, -0.007495831232517958, 0.02587776444852352, 0.025904741138219833, 0.055669233202934265, 0.262622594833374, -0.09638722985982895, 0.06712193787097931, -0.061394110321998596, 0.047672782093286514, -0.008072298020124435, 0.15860719978809357, -0.00110055529512465, 0.005915713030844927, -0.06157446652650833, 0.09305278956890106, 0.038417693227529526, -0.148651123046875, 0.004834010731428862, -0.10891804099082947, -0.11003196239471436, 0.022356731817126274, -0.007375445682555437, 0.02815234661102295, 0.08593277633190155, 0.0008872408070601523, 0.01930450089275837, 0.05964114889502525, 0.013226276263594627, -0.08860354125499725, -0.12646576762199402, -0.004709798377007246, -0.07410980761051178, 0.1301560401916504, 0.004729347303509712, 0.1506749838590622, 0.10134509205818176, 0.01656203716993332, -0.08328618109226227, 0.07674529403448105, 0.029226820915937424, 0.031355999410152435, 0.06559418141841888, 0.08587636798620224, -0.02444298379123211, 0.07182981073856354, 0.024044094607234, -0.04040057584643364, 0.06691442430019379, -0.08480357378721237, -0.021574385464191437, -0.12163520604372025, 0.09291759133338928, -0.034314680844545364, 0.1336537003517151, 0.18347153067588806, -0.008379602804780006, 0.006178254261612892, -0.055962517857551575, 0.014752981252968311, -0.006051190663129091, 0.10116612166166306, -0.01404592115432024, -0.21423567831516266, 0.03808222711086273, -0.03236207365989685, 0.05291976407170296, -0.22034689784049988, -0.02236800454556942, 0.02904919907450676, -0.043802712112665176, -0.03645998239517212, 0.0697145015001297, 0.08568904548883438, 0.0215365719050169, -0.03316415473818779, -0.1290617734193802, -0.011005904525518417, 0.09957822412252426, -0.07415243983268738, -0.09696188569068909 ]
null
null
transformers
# legal_t5_small_multitask_fr_sv model

Model for translating legal text from French to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from JRC-Acquis, Europarl and DCEP, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved for the legal_t5_small_multitask_fr_sv model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from French to Swedish.

### How to use

Here is how to use this model to translate legal text from French to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Note: AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for T5-style models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_fr_sv"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_fr_sv",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 to run on CPU
)

fr_text = "**I Procédure de coopération (première lecture)"

pipeline([fr_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_fr_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task had access to the data of all language pairs.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining stage was used; see the model description above.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:----------:|
| legal_t5_small_multitask_fr_sv | 39.947 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
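The BLEU score in the evaluation table of the card above could be computed with sacrebleu along the lines of the sketch below; the hypothesis and reference sentences here are placeholders, not the authors' held-out test set.

```python
# Hedged sketch of scoring translations with corpus-level BLEU via sacrebleu.
# The sentences below are placeholders, not the authors' actual test data.
import sacrebleu

hypotheses = ["**I Samarbetsförfarandet (första behandlingen)"]    # model outputs
references = [["**I Samarbetsförfarandet (första behandlingen)"]]  # one reference stream

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {score.score:.3f}")  # the card reports 39.947 on the real test set
```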
{"language": "French Swedish", "tags": ["translation French Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "**I Proc\u00e9dure de coop\u00e9ration (premi\u00e8re lecture)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_fr_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation French Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation French Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_fr\_sv model
=========================================

Model for translating legal text from French to Swedish. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from JRC-Acquis, Europarl and DCEP, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_fr\_sv model; instead, the unsupervised task is added alongside all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from French to Swedish.

### How to use

Here is how to use this model to translate legal text from French to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_fr\_sv model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task had access to the data of all language pairs.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining stage was used; see the model description above.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from French to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from French to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation French Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from French to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_fr\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.061076436191797256, 0.12478155642747879, -0.003770259441807866, 0.09039191156625748, 0.042458437383174896, -0.01722395047545433, 0.023956112563610077, 0.08445115387439728, -0.07804931700229645, 0.06213567405939102, 0.0624355673789978, -0.008829803206026554, 0.06713361293077469, 0.026901859790086746, 0.046216726303100586, -0.19621239602565765, 0.0039932578802108765, -0.03477994352579117, -0.033017598092556, 0.10240048170089722, 0.0998503640294075, -0.04716898873448372, 0.03941391780972481, -0.0152365081012249, -0.03994971141219139, 0.03935175761580467, -0.09246087819337845, -0.031874027103185654, 0.10753700882196426, 0.07633108645677567, 0.08937512338161469, 0.008608757518231869, 0.08156244456768036, -0.18113045394420624, -0.015344775281846523, 0.0501079298555851, -0.014220763929188251, 0.033451441675424576, 0.11080003529787064, 0.02445163019001484, 0.17809651792049408, -0.05277369171380997, 0.026395078748464584, 0.03711417689919472, -0.0759112536907196, -0.14780589938163757, -0.05403505265712738, -0.007523420266807079, 0.07928106188774109, 0.137192040681839, -0.04387100413441658, 0.03952859714627266, -0.02451355569064617, 0.08648350834846497, 0.07163159549236298, -0.23000697791576385, -0.03369353711605072, 0.028037896379828453, 0.06869830191135406, 0.10344644635915756, -0.048496898263692856, 0.012160110287368298, 0.06841602921485901, 0.06768865138292313, 0.07819396257400513, -0.0558587983250618, -0.043579764664173126, -0.025064613670110703, -0.13859158754348755, -0.04030591994524002, 0.17976543307304382, 0.016344452276825905, -0.0397559255361557, -0.103731669485569, -0.05168214067816734, -0.05163455381989479, 0.0033325981348752975, -0.05209789425134659, 0.02496996335685253, -0.00973240751773119, 0.051429104059934616, -0.06383966654539108, -0.12313397973775864, -0.053500231355428696, -0.046049728989601135, 0.04932451620697975, 0.03912776708602905, 0.014844907447695732, 0.054889678955078125, 0.069965660572052, -0.14397095143795013, -0.07039069384336472, 0.01128681842237711, -0.001446338719688356, -0.09527835249900818, 0.009596600197255611, 0.0012385271256789565, -0.2311679720878601, -0.0032397350296378136, -0.022488053888082504, -0.09184113889932632, 0.0284616369754076, 0.05034710839390755, 0.04504168778657913, 0.05203811451792717, 0.131233811378479, -0.08730855584144592, -0.12212624400854111, -0.05089261382818222, -0.019468393176794052, -0.0225888229906559, 0.024462195113301277, -0.06055294722318649, -0.04326953738927841, -0.005254640709608793, 0.0232553593814373, -0.003658965928480029, -0.007241855841130018, -0.01572738215327263, -0.0061503625474870205, 0.08594777435064316, -0.09267294406890869, 0.016062237322330475, 0.0008503873832523823, -0.09678445011377335, -0.03685953468084335, 0.059549976140260696, -0.0067712790332734585, -0.12141004949808121, 0.06922827661037445, -0.011779936030507088, -0.00941927544772625, -0.10007863491773605, -0.1962164044380188, 0.00402044365182519, -0.004089494701474905, -0.061779044568538666, -0.08152292668819427, -0.08561959117650986, -0.08958519995212555, 0.04931877925992012, -0.0592355839908123, 0.003739641048014164, -0.11831825971603394, -0.007120971567928791, 0.03842964768409729, -0.016762947663664818, 0.07827930897474289, -0.05280311778187752, 0.022854244336485863, -0.02735173888504505, 0.0696457028388977, 0.004462784621864557, 0.026053613051772118, -0.09889878332614899, 0.030987778678536415, -0.09201657772064209, 0.15105080604553223, -0.04375727102160454, -0.026756921783089638, -0.13787098228931427, -0.07372896373271942, -0.0758347287774086, 
0.037971433252096176, 0.08921704441308975, 0.1361052542924881, -0.21933649480342865, -0.029437217861413956, 0.21939070522785187, -0.08088161796331406, -0.06385210156440735, 0.15455350279808044, -0.0224140677601099, 0.034528039395809174, 0.07210283726453781, 0.09423047304153442, 0.04866715148091316, -0.05249181017279625, -0.059937454760074615, 0.03673326596617699, 0.018321365118026733, 0.03363668918609619, 0.10619116574525833, -0.07292138040065765, 0.08160398155450821, -0.0003816766547970474, 0.03910556435585022, 0.007870296947658062, -0.03275620937347412, -0.038897693157196045, 0.005571563262492418, -0.03145848587155342, -0.00419019628316164, 0.03287360817193985, 0.00714840879663825, -0.08330269157886505, -0.08403629064559937, -0.020708374679088593, 0.07250571250915527, -0.07216702401638031, 0.03885719180107117, 0.048697903752326965, -0.0664689838886261, -0.09751623123884201, 0.004126408137381077, -0.1376391500234604, -0.020537961274385452, 0.026981614530086517, -0.030693214386701584, 0.08617879450321198, 0.07213818281888962, 0.06991611421108246, 0.10953051596879959, -0.051391877233982086, -0.03016645275056362, -0.016198324039578438, -0.028585899621248245, -0.09000247716903687, -0.13358847796916962, -0.009187798947095871, -0.019512416794896126, 0.016093352809548378, -0.14440543949604034, 0.004595858510583639, -0.03986377641558647, 0.08998595178127289, 0.0049951705150306225, -0.02033403143286705, 0.008435376919806004, 0.07867646962404251, -0.037802837789058685, -0.023110391572117805, 0.03426800295710564, -0.026599688455462456, -0.05674231797456741, 0.12709151208400726, -0.05199272185564041, -0.12428335100412369, 0.08345500379800797, -0.0014775012386962771, -0.10139279067516327, -0.0035691075026988983, -0.01682521402835846, -0.05923794209957123, -0.05652310326695442, -0.05801015347242355, 0.23375862836837769, 0.05697673559188843, 0.15585355460643768, -0.11921676248311996, -0.04891255125403404, 0.024816490709781647, -0.04669550061225891, -0.024021189659833908, 0.18345053493976593, 0.05562587454915047, -0.16356515884399414, 0.09298784285783768, 0.022295882925391197, -0.020012861117720604, 0.14957834780216217, 0.05707506090402603, -0.10784544795751572, 0.012372815050184727, 0.06680548191070557, -0.005935852415859699, 0.05180957540869713, -0.0769435316324234, -0.013704872690141201, 0.02733709290623665, 0.058113712817430496, 0.06517338007688522, -0.10782848298549652, 0.07084587216377258, 0.05918349698185921, -0.05068352073431015, 0.052760932594537735, -0.03793038800358772, -0.05140060931444168, 0.09413115680217743, 0.005692805163562298, -0.07152047008275986, -0.04227884113788605, -0.041471824049949646, -0.1122010201215744, 0.19324609637260437, -0.0967375785112381, -0.23536920547485352, -0.13889245688915253, 0.02191079594194889, -0.064796082675457, 0.02159157767891884, 0.05295579880475998, -0.057923734188079834, -0.05692054703831673, -0.10166197270154953, 0.07161279767751694, -0.09330549836158752, -0.06217289716005325, -0.11157375574111938, 0.053927868604660034, -0.006101646926254034, -0.1485266536474228, 0.02719665691256523, -0.0006519914604723454, -0.020594522356987, -0.006435107439756393, -0.0332053042948246, 0.11842861026525497, 0.11126483976840973, -0.04153614491224289, -0.03961356356739998, 0.00874904915690422, 0.14706197381019592, -0.06253990530967712, 0.0428403876721859, 0.0276228878647089, 0.035354938358068466, 0.050473183393478394, 0.14319021999835968, 0.040242940187454224, -0.031120575964450836, 0.02671869844198227, 0.061674248427152634, -0.017819659784436226, -0.25746652483940125, 
-0.10830812901258469, -0.05819392949342728, -0.011670185253024101, 0.08375008404254913, 0.04498320072889328, -0.0772891491651535, 0.014874548651278019, -0.044492125511169434, 0.007241381332278252, 0.021970650181174278, 0.05058043450117111, 0.023419881239533424, -0.02292807213962078, 0.07752733677625656, -0.05598384514451027, -0.060343433171510696, 0.09458475559949875, 0.020315010100603104, 0.18475745618343353, -0.059443969279527664, 0.2013825625181198, 0.054126396775245667, 0.05687066540122032, -0.01367094460874796, 0.07383780926465988, -0.04158607870340347, 0.026070736348628998, -0.016651421785354614, -0.06348720192909241, -0.0015841054264456034, 0.06656831502914429, 0.00659364927560091, 0.0033453817013651133, -0.05258236452937126, -0.02770484797656536, 0.0765567272901535, 0.20259399712085724, 0.08526173233985901, -0.175889253616333, -0.05721829831600189, -0.0036703054793179035, -0.0747908428311348, -0.08461117744445801, 0.013639042153954506, 0.18193070590496063, -0.09274078160524368, 0.007719513028860092, 0.011875059455633163, 0.13068519532680511, -0.10006945580244064, -0.017373617738485336, 0.02623417228460312, 0.033258941024541855, -0.018615178763866425, 0.1274946928024292, -0.22364380955696106, 0.1826663613319397, 0.012406428344547749, 0.05914917588233948, -0.044368624687194824, 0.033984433859586716, -0.062488164752721786, 0.021945275366306305, 0.1375247985124588, 0.040302176028490067, -0.06699954718351364, -0.08392929285764694, -0.09294664114713669, -0.02045576274394989, 0.06738060712814331, -0.04266645386815071, 0.0810520127415657, 0.06969206035137177, 0.006925104185938835, -0.024144764989614487, 0.023493077605962753, -0.03788921236991882, -0.16050651669502258, 0.003819467034190893, -0.008628327399492264, -0.041272953152656555, -0.006985279265791178, -0.04590928182005882, -0.09998408704996109, 0.2345646768808365, -0.12748579680919647, -0.09166282415390015, -0.07019797712564468, 0.0018404606962576509, 0.12472344189882278, -0.06750171631574631, 0.023263264447450638, 0.023168249055743217, 0.035526178777217865, -0.06467635929584503, -0.006344886962324381, 0.07344735413789749, -0.07030771672725677, -0.04822254553437233, -0.054551154375076294, 0.129195436835289, 0.06462492793798447, 0.036307163536548615, -0.010892339050769806, 0.0418567955493927, -0.01411281619220972, -0.10924455523490906, -0.008495587855577469, 0.01597657985985279, 0.1571148782968521, 0.03607484698295593, -0.06015555188059807, -0.0827372595667839, -0.06621169298887253, -0.06666253507137299, 0.16045747697353363, 0.17102612555027008, -0.05753322318196297, 0.06413952261209488, 0.1935053914785385, -0.11276954412460327, -0.18903690576553345, -0.05108281970024109, 0.11281966418027878, 0.08220355212688446, -0.021738778799772263, -0.16727429628372192, 0.015915416181087494, 0.09637157618999481, 0.0039028991013765335, 0.015134086832404137, -0.3814215064048767, -0.14557017385959625, 0.004344351124018431, 0.03061581403017044, 0.011431591585278511, -0.07264013588428497, -0.03912260755896568, -0.051162317395210266, -0.09517325460910797, 0.06854971498250961, -0.0194739680737257, 0.0930660143494606, 0.016031786799430847, 0.011872505769133568, 0.05019068345427513, -0.04116082563996315, 0.13576772809028625, 0.005482938606292009, 0.020082615315914154, -0.05999201908707619, 0.08613941818475723, 0.01837253011763096, -0.011384804733097553, 0.16865767538547516, -0.05780397728085518, 0.04681922122836113, -0.16334518790245056, -0.04545985534787178, -0.05499362200498581, 0.037223152816295624, -0.04050667956471443, -0.0723179280757904, 
-0.04980378970503807, 0.03354225680232048, 0.07787656784057617, -0.01108116190880537, 0.013016274198889732, -0.056342754513025284, 0.02250874601304531, 0.18346112966537476, 0.09812887758016586, 0.032311420887708664, -0.10001325607299805, 0.03343343734741211, 0.007941920310258865, 0.05866032466292381, -0.1187400370836258, 0.017109276726841927, 0.15476830303668976, 0.01323749590665102, 0.11026400327682495, -0.028985969722270966, -0.12514571845531464, 0.0020693258848041296, 0.06327654421329498, -0.11087123304605484, -0.13788677752017975, -0.02952299639582634, -0.01183378230780363, -0.03978944942355156, -0.014370094053447247, 0.10043799132108688, -0.09975768625736237, -0.01667036861181259, -0.018256304785609245, 0.03450880944728851, -0.06809138506650925, 0.2117232382297516, 0.004783994052559137, 0.06124449893832207, -0.05816781148314476, 0.10907585918903351, 0.126619353890419, -0.13774436712265015, 0.03892340138554573, 0.1852950006723404, -0.06681458652019501, -0.04361812025308609, 0.058647848665714264, 0.1357138454914093, -0.020810429006814957, -0.06605853885412216, -0.020337071269750595, -0.03500775247812271, 0.0093587851151824, -0.009097288362681866, 0.036706406623125076, 0.023962996900081635, 0.005457913037389517, -0.0575549341738224, -0.09645240008831024, 0.10963044315576553, 0.07809431850910187, 0.0033452012576162815, -0.013592345640063286, 0.09676744788885117, -0.006552988197654486, -0.002713861409574747, -0.017627879977226257, 0.03061031922698021, -0.03860338777303696, 0.006428863387554884, -0.06671562790870667, -0.0007355104899033904, -0.0399688221514225, 0.0005357161280699074, -0.03867551311850548, -0.0027896438259631395, -0.006221454590559006, 0.010446161031723022, -0.04961218312382698, -0.035871781408786774, -0.044400300830602646, 0.03073618933558464, -0.09693598747253418, -0.04384791851043701, 0.005335128866136074, -0.022914588451385498, 0.056519679725170135, 0.012303298339247704, -0.007514172233641148, 0.025299500674009323, -0.013737029395997524, 0.05947504937648773, 0.006779290735721588, 0.049657564610242844, 0.014581677503883839, -0.061219222843647, 0.009008144959807396, 0.0337076410651207, -0.0154762864112854, -0.013953241519629955, 0.008512747474014759, -0.11962378025054932, -0.04780102148652077, -0.03796796500682831, -0.05474461615085602, -0.06710438430309296, 0.1208435520529747, 0.057506538927555084, 0.0840005949139595, 0.12303359061479568, -0.07411906868219376, 0.07094497233629227, -0.13706491887569427, -0.005517500918358564, 0.040218763053417206, -0.04234233871102333, -0.008279494941234589, -0.017801949754357338, 0.03711531683802605, -0.08784348517656326, 0.13705140352249146, 0.038738686591386795, 0.08111634105443954, 0.013999806717038155, -0.09926861524581909, -0.01419099047780037, 0.024663330987095833, 0.08845365792512894, -0.039517927914857864, -0.016745781525969505, -0.09219317138195038, 0.08065137267112732, 0.0044882167130708694, 0.11769955605268478, 0.04080750420689583, 0.10895257443189621, 0.13895177841186523, 0.056961461901664734, -0.0011731445556506515, -0.07861904799938202, -0.07543400675058365, 0.093028724193573, 0.013177425600588322, 0.06717976182699203, -0.03426472097635269, 0.10708793997764587, 0.13524997234344482, -0.15820589661598206, 0.11747369170188904, 0.017259566113352776, -0.09577382355928421, -0.07645618170499802, -0.11370393633842468, -0.06301364302635193, -0.03467543050646782, -0.02745729498565197, -0.1227349042892456, 0.02705484628677368, 0.0016107135452330112, 0.07037404179573059, -0.021568790078163147, 0.10238887369632721, -0.016205886378884315, 
-0.11075354367494583, 0.07599228620529175, 0.0025575708132237196, 0.10390680283308029, -0.01228770986199379, 0.02516924776136875, 0.06201298162341118, -0.0029739076271653175, 0.0294642336666584, 0.04965391382575035, -0.0036688735708594322, -0.0019300220301374793, 0.006356420461088419, -0.056294701993465424, -0.0426403284072876, 0.02928297035396099, 0.08554720133543015, 0.16648690402507782, 0.06699417531490326, -0.07996568083763123, -0.032902397215366364, 0.17574399709701538, -0.05699903145432472, -0.0873999372124672, -0.11707399040460587, 0.1950545310974121, 0.02246372401714325, 0.05366067960858345, 0.01290307380259037, -0.09332158416509628, -0.001063052681274712, 0.11251332610845566, 0.19317740201950073, -0.006274763494729996, -0.03349732235074043, 0.011715148575603962, -0.01400681585073471, 0.02263534814119339, 0.02601589448750019, 0.03827408701181412, 0.24914728105068207, -0.08773171901702881, 0.06033793091773987, -0.06905210018157959, 0.03823274001479149, -0.013355914503335953, 0.15127791464328766, 0.0025740095879882574, 0.0021675657480955124, -0.05387554690241814, 0.10103607922792435, 0.013054827228188515, -0.15238304436206818, 0.0005432814941741526, -0.08389613777399063, -0.12037348747253418, 0.02176189422607422, 0.01841755025088787, 0.040299754589796066, 0.06604567915201187, 0.012688975781202316, 0.034880053251981735, 0.04192807897925377, 0.013634898699820042, -0.08616723120212555, -0.09950779378414154, -0.013094310648739338, -0.0521189421415329, 0.10314449667930603, 0.02161145955324173, 0.14240548014640808, 0.093985415995121, 0.010604124516248703, -0.07005322724580765, 0.08333159983158112, 0.034036826342344284, 0.02084564045071602, 0.07493729144334793, 0.11049158126115799, -0.017040124163031578, 0.07827994972467422, 0.023038532584905624, -0.05694938823580742, 0.044642068445682526, -0.053683843463659286, -0.027123985812067986, -0.09696690738201141, 0.10960309952497482, -0.03886256366968155, 0.13031259179115295, 0.193608358502388, 0.001405326765961945, 0.003298854222521186, -0.06838274747133255, 0.018716176971793175, -0.025019390508532524, 0.07806852459907532, -0.005787660833448172, -0.19682353734970093, 0.032377008348703384, -0.021904505789279938, 0.03845465928316116, -0.21057714521884918, -0.02003888599574566, 0.024844344705343246, -0.035581231117248535, -0.02173507958650589, 0.07490046322345734, 0.0742235854268074, 0.010413211770355701, -0.024944640696048737, -0.09034844487905502, 0.006835609208792448, 0.09724682569503784, -0.07943694293498993, -0.10176057368516922 ]
null
null
transformers
# legal_t5_small_multitask_it_cs model

Model for translating legal text from Italian to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl, and dcep), covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_it_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Italian to Czech.

### How to use

Here is how to use this model to translate legal text from Italian to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Build a translation pipeline around the pretrained checkpoint.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_cs"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_cs",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 for CPU
)

it_text = "Per mobilitare il Fondo, la Commissione ha presentato all'autorità di bilancio una richiesta di storno per un importo complessivo di 667.823 EUR dalla riserva FEG (40 02 43) in stanziamenti d'impegno verso la linea di bilancio FEG."

pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_cs model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_multitask_it_cs | 37.935|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
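The training procedure above states that the optimizer is AdaFactor with an inverse square root learning rate schedule. A minimal sketch of that setup with the Hugging Face `Adafactor` implementation follows; the training loop is omitted, and using `relative_step=True` to realize the inverse-sqrt schedule is an assumption, not something the card confirms.

```python
# Hedged sketch: AdaFactor configured so that it derives an inverse square
# root step size internally (lr=None + relative_step=True). Illustrative only.
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_cs")

optimizer = Adafactor(
    model.parameters(),
    lr=None,             # let Adafactor compute the step size itself
    relative_step=True,  # step size proportional to 1/sqrt(current step)
    warmup_init=True,    # small warmup before the inverse-sqrt decay
    scale_parameter=True,
)
# optimizer.step() would then be called inside an ordinary training loop.
```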
{"language": "Italian Cszech", "tags": ["translation Italian Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Per mobilitare il Fondo, la Commissione ha presentato all'autorit\u00e0 di bilancio una richiesta di storno per un importo complessivo di 667.823 EUR dalla riserva FEG (40 02 43) in stanziamenti d'impegno verso la linea di bilancio FEG."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_cs model
=========================================

Model for translating legal text from Italian to Czech. It was first released in this repository. The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl, and dcep), covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_it\_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Italian to Czech.

### How to use

Here is how to use this model to translate legal text from Italian to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_cs model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
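The preprocessing step above describes a unigram vocabulary model trained on the parallel corpus. A hedged sketch of that step with the `sentencepiece` library is shown below; the input path and vocabulary size are placeholders, since the card does not report them.

```python
# Hedged sketch: training a unigram SentencePiece model for the vocabulary.
# "parallel_corpus.txt" and vocab_size=32000 are assumptions, not reported values.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",    # the 88M lines of all language pairs (assumed path)
    model_prefix="legal_t5_vocab",  # writes legal_t5_vocab.model / legal_t5_vocab.vocab
    vocab_size=32000,               # typical T5-style size; not confirmed by the card
    model_type="unigram",           # the unigram segmentation described above
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Per mobilitare il Fondo", out_type=str))
```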
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08706983178853989, 0.14845934510231018, -0.002976291347295046, 0.08743590861558914, 0.07535435259342194, 0.014065926894545555, 0.025445684790611267, 0.12315980345010757, -0.02389419451355934, 0.0912126824259758, 0.042812954634428024, 0.007316525559872389, 0.05825550854206085, 0.042339026927948, 0.032763298600912094, -0.19315770268440247, 0.0012735630152747035, -0.02767365425825119, -0.03854390233755112, 0.11406145244836807, 0.08680099248886108, -0.06177292764186859, 0.052036039531230927, -0.045089855790138245, -0.0982964038848877, 0.02903294563293457, -0.08052566647529602, -0.03610450029373169, 0.103594109416008, 0.07701638340950012, 0.07858099043369293, -0.014712379314005375, 0.07337542623281479, -0.17881935834884644, -0.0014528732281178236, 0.06887298077344894, -0.01543442066758871, 0.04326256737112999, 0.10667542368173599, -0.003536765230819583, 0.19567205011844635, -0.04816199839115143, 0.016631407663226128, 0.04867414012551308, -0.10083216428756714, -0.08701395988464355, -0.07022564113140106, 0.03347432240843773, 0.08260506391525269, 0.13990026712417603, -0.029561065137386322, 0.04602852463722229, -0.02613348327577114, 0.07502801716327667, 0.11764028668403625, -0.24158279597759247, -0.030808528885245323, 0.029131794348359108, 0.043655406683683395, 0.0948692187666893, -0.04171521216630936, 0.02049356698989868, 0.05028910934925079, 0.052907418459653854, 0.04933687672019005, -0.03425414487719536, 0.016104886308312416, -0.017790239304304123, -0.1259051412343979, -0.07054968923330307, 0.16989269852638245, 0.025618262588977814, -0.029990019276738167, -0.09830685704946518, -0.057488009333610535, -0.07136965543031693, -0.000018212655049865134, -0.024353759363293648, 0.0008232202962972224, -0.0018058932619169354, 0.047125909477472305, -0.04618733003735542, -0.11625242978334427, -0.07648915797472, -0.05711419880390167, 0.09652523696422577, 0.0374140664935112, 0.013679992407560349, 0.04393701255321503, 0.07930608093738556, -0.11643639206886292, -0.07274726778268814, 0.012646743096411228, 0.029117371886968613, -0.08703377842903137, 0.014026322402060032, -0.006723157130181789, -0.17598305642604828, 0.001284893020056188, -0.021494194865226746, -0.051633570343256, 0.014499267563223839, 0.04247899726033211, 0.0420534573495388, 0.0423588752746582, 0.11079467833042145, -0.11408823728561401, -0.11558602005243301, -0.02951781265437603, -0.008021088317036629, -0.006060915999114513, 0.008996722288429737, -0.07682961225509644, -0.043205004185438156, 0.0014482414117082953, 0.06667967140674591, 0.012277668341994286, 0.005605078302323818, -0.011021089740097523, -0.0369204618036747, 0.1137743815779686, -0.1051216870546341, 0.0182623490691185, 0.008376320824027061, -0.10067003220319748, -0.00778574263677001, 0.04335973784327507, -0.040398333221673965, -0.11010497808456421, 0.059293653815984726, -0.048656679689884186, -0.01062059961259365, -0.11198502779006958, -0.1717326045036316, 0.01271145697683096, -0.01526815164834261, -0.06526254862546921, -0.09185546636581421, -0.11829795688390732, -0.06879621744155884, 0.017303653061389923, -0.06517091393470764, 0.025264378637075424, -0.08808597177267075, 0.01017234567552805, 0.027382832020521164, -0.006419402547180653, 0.06021128222346306, -0.04519972577691078, 0.0499952994287014, -0.00109862070530653, 0.05784669145941734, -0.004321107175201178, 0.03185756132006645, -0.07743567228317261, 0.04308749735355377, -0.06961125880479813, 0.1337774246931076, -0.00971175916492939, -0.0006619744817726314, -0.13527974486351013, -0.050894565880298615, -0.0844869315624237, 
0.03446245566010475, 0.08876551687717438, 0.15049634873867035, -0.2074088305234909, -0.03327568992972374, 0.20928454399108887, -0.0767122134566307, -0.07120606303215027, 0.1282396763563156, -0.021557997912168503, 0.04698839783668518, 0.07691556960344315, 0.09180731326341629, 0.02406335063278675, -0.052164554595947266, -0.06901156902313232, -0.01745305024087429, 0.01571829430758953, 0.009805909357964993, 0.10163033753633499, -0.06000781059265137, 0.118255615234375, 0.009004887193441391, 0.02816598117351532, 0.01146357599645853, -0.05241377651691437, -0.039971306920051575, -0.0069952732883393764, -0.05073031038045883, -0.03001972660422325, 0.01316801831126213, 0.024407176300883293, -0.06408093124628067, -0.08757101744413376, -0.002183601725846529, 0.11027941107749939, -0.07856755703687668, 0.03289399296045303, 0.004792113322764635, -0.011851520277559757, -0.10584135353565216, 0.018104877322912216, -0.1530967354774475, 0.00403775367885828, 0.04083988443017006, -0.021173272281885147, 0.09292032569646835, 0.0139761408790946, 0.057579901069402695, 0.09060797840356827, -0.053685262799263, -0.016848159953951836, -0.01892107166349888, -0.02184508927166462, -0.10567741096019745, -0.11102832853794098, -0.04962828755378723, -0.0055840532295405865, 0.036240722984075546, -0.15867723524570465, 0.020470190793275833, -0.012360953725874424, 0.09771700948476791, -0.0022583038080483675, -0.034571725875139236, 0.0010720882564783096, 0.04839302599430084, -0.03822767361998558, -0.03351859003305435, 0.030579375103116035, -0.007555217482149601, -0.035052474588155746, 0.08638466894626617, -0.12126526981592178, -0.08563889563083649, 0.10093428194522858, 0.0208261888474226, -0.08851902931928635, -0.024377690628170967, -0.008429383859038353, -0.052505310624837875, -0.045732006430625916, -0.07402955740690231, 0.17314289510250092, 0.05060232803225517, 0.14649523794651031, -0.1103309839963913, -0.0413563996553421, 0.028414184227585793, -0.0156226372346282, -0.02503366768360138, 0.147075355052948, 0.05603036656975746, -0.17621678113937378, 0.09595728665590286, 0.036501724272966385, -0.02260221727192402, 0.13667407631874084, 0.06575750559568405, -0.11975020915269852, -0.014070536009967327, 0.04991650953888893, 0.007597216870635748, 0.04465559870004654, -0.10000937432050705, -0.02473153918981552, 0.02429572492837906, 0.06471166759729385, 0.0736074149608612, -0.10111909359693527, 0.06357695907354355, 0.06083689257502556, -0.032678715884685516, 0.0611707977950573, -0.047513727098703384, -0.06087266281247139, 0.11876236647367477, 0.02273542247712612, -0.03788565471768379, -0.04426073655486107, -0.0396895669400692, -0.10311342775821686, 0.18411335349082947, -0.08796198666095734, -0.22679027915000916, -0.13720351457595825, 0.01792309246957302, -0.04706725850701332, 0.023918544873595238, 0.024328874424099922, -0.04127831384539604, -0.03415985032916069, -0.09653284400701523, 0.07189245522022247, -0.10470204800367355, -0.045899778604507446, -0.09058127552270889, 0.056159552186727524, -0.03495783358812332, -0.13109762966632843, 0.019225245341658592, -0.00014852822641842067, -0.05754067748785019, -0.010334989987313747, -0.0693511888384819, 0.13212649524211884, 0.1642918586730957, -0.030915677547454834, -0.010624619200825691, 0.007497092708945274, 0.0979912281036377, -0.07957149296998978, 0.04776017367839813, 0.06383439153432846, 0.05072345212101936, 0.015012179501354694, 0.11110072582960129, 0.04217716306447983, -0.05092955380678177, 0.030604595318436623, 0.05177279934287071, -0.027629781514406204, -0.2516096532344818, 
-0.10781531035900116, -0.06892666220664978, -0.015214764513075352, 0.0876253992319107, 0.0486614964902401, -0.03195605054497719, 0.013842540793120861, -0.04807800427079201, -0.0050971838645637035, 0.014884169213473797, 0.06120399385690689, 0.03254036232829094, -0.018414581194519997, 0.08001478016376495, -0.05945476144552231, -0.03086712397634983, 0.09965150058269501, 0.056499190628528595, 0.19701483845710754, -0.036456815898418427, 0.257216215133667, 0.04658404365181923, 0.052559465169906616, 0.0059247855097055435, 0.0797736644744873, -0.028531556949019432, 0.018274463713169098, -0.02028544619679451, -0.06813909858465195, -0.004845604300498962, 0.065104641020298, 0.008738831616938114, 0.002225653501227498, -0.06379521638154984, -0.05914198234677315, 0.08150360733270645, 0.23710104823112488, 0.06310576945543289, -0.19173863530158997, -0.05659601837396622, -0.009741751477122307, -0.06291466951370239, -0.07280820608139038, 0.0007692838553339243, 0.15746264159679413, -0.08619632571935654, -0.02378740906715393, 0.025138163939118385, 0.13257774710655212, -0.12823188304901123, -0.03010745719075203, 0.014644942246377468, 0.020169569179415703, -0.02990017458796501, 0.1242903470993042, -0.22951640188694, 0.18861395120620728, 0.03050881437957287, 0.06921963393688202, -0.06044575944542885, 0.009114486165344715, -0.04106005281209946, 0.002803334966301918, 0.11619418859481812, 0.02223130129277706, -0.03503221645951271, -0.11731934547424316, -0.11572429537773132, -0.028406038880348206, 0.08973518013954163, -0.060402918606996536, 0.09453637152910233, 0.05243906378746033, -0.008540308102965355, -0.009176751598715782, 0.07720555365085602, -0.003331111278384924, -0.16981269419193268, 0.006236597895622253, 0.007955356501042843, -0.05655508488416672, -0.009552669711411, -0.05624459311366081, -0.0237965676933527, 0.23812922835350037, -0.10934090614318848, -0.060810912400484085, -0.08633052557706833, 0.028520312160253525, 0.11609558761119843, -0.07490186393260956, 0.0326727032661438, 0.006336410529911518, 0.051718320697546005, -0.05294761806726456, -0.029954567551612854, 0.09348044544458389, -0.05909634009003639, -0.06907247006893158, -0.0606098547577858, 0.15424104034900665, 0.05084166303277016, 0.049736279994249344, -0.016066139563918114, 0.04330725595355034, 0.0075570750050246716, -0.09299430251121521, -0.011544832028448582, 0.040335994213819504, 0.13270142674446106, 0.07117250561714172, -0.08309625834226608, -0.07276873290538788, -0.0713975727558136, -0.07749886065721512, 0.15688106417655945, 0.16892951726913452, -0.059845585376024246, 0.012089276686310768, 0.16279521584510803, -0.1163579598069191, -0.17164283990859985, -0.02509131468832493, 0.06821993738412857, 0.07208104431629181, -0.04758025333285332, -0.1749238669872284, 0.008056972175836563, 0.13702958822250366, -0.00588921457529068, 0.06600240617990494, -0.3843759000301361, -0.13098308444023132, 0.008097618818283081, 0.04197090119123459, 0.0037174122408032417, -0.12032575905323029, -0.04376886412501335, -0.05988925322890282, -0.09912444651126862, 0.09942034631967545, -0.026698436588048935, 0.10130976885557175, -0.002284364076331258, 0.00179993303027004, 0.043250445276498795, -0.04463779926300049, 0.12098199129104614, 0.021117577329277992, 0.027781015262007713, -0.04725158214569092, 0.02750873751938343, 0.02023826353251934, -0.013479489833116531, 0.1481781005859375, -0.05034618452191353, 0.04053202271461487, -0.14799387753009796, -0.06052611395716667, -0.06535576283931732, 0.021221531555056572, -0.039466843008995056, -0.06545298546552658, 
-0.04473152384161949, 0.03252745419740677, 0.039334576576948166, 0.001079268055036664, 0.008531289175152779, -0.06957418471574783, 0.048270922154188156, 0.18877235054969788, 0.061321891844272614, 0.014394025318324566, -0.09130601584911346, 0.0030136064160615206, 0.0013357213465496898, 0.054510388523340225, -0.11768420785665512, 0.0035491001326590776, 0.14636148512363434, 0.03880726918578148, 0.11658453196287155, -0.0076181585900485516, -0.13144363462924957, 0.0016415950376540422, 0.07059809565544128, -0.09001181274652481, -0.10795387625694275, -0.0184311680495739, 0.04845194146037102, -0.061957571655511856, 0.003397278720512986, 0.11104358732700348, -0.07836787402629852, -0.0281563401222229, -0.01481966208666563, 0.030926641076803207, -0.06793597340583801, 0.22657722234725952, 0.03425198420882225, 0.045939795672893524, -0.05774540454149246, 0.11909155547618866, 0.12017896771430969, -0.11670331656932831, 0.025880223140120506, 0.2045922428369522, -0.06964700669050217, -0.05142607539892197, 0.02384386956691742, 0.09597525000572205, -0.05633727088570595, -0.05894949287176132, -0.03621057793498039, -0.022307032719254494, 0.020742107182741165, 0.01835988648235798, 0.04529831185936928, 0.04316903278231621, -0.01998220384120941, -0.04349742457270622, -0.10368448495864868, 0.08829328417778015, 0.06716170161962509, 0.002521310932934284, -0.01364334486424923, 0.1073947623372078, 0.01698714680969715, -0.0127814169973135, -0.013827241957187653, -0.011131749488413334, -0.048405833542346954, 0.022711673751473427, -0.055280547589063644, -0.011208883486688137, -0.057639021426439285, -0.025406677275896072, -0.04380142316222191, 0.004109703470021486, -0.023633237928152084, 0.0007257671095430851, -0.055933877825737, -0.04934882000088692, -0.05941003933548927, 0.025506513193249702, -0.09461680799722672, -0.028931358829140663, 0.0019544337410479784, -0.03561696037650108, 0.06606288999319077, 0.04164430871605873, 0.006793553475290537, 0.029401544481515884, -0.026637645438313484, 0.054796479642391205, 0.003912119660526514, 0.050188325345516205, 0.008279883302748203, -0.050465893000364304, 0.03226448595523834, 0.04120106250047684, -0.007746219635009766, -0.0022762087173759937, 0.0056349183432757854, -0.12173708528280258, -0.03860505670309067, -0.05427216738462448, -0.03216634318232536, -0.06383763998746872, 0.10864332318305969, 0.07483800500631332, 0.068634532392025, 0.06285206973552704, -0.053147587925195694, 0.07271891087293625, -0.1621839851140976, -0.002397313481196761, 0.01579158939421177, -0.03784831240773201, -0.020700708031654358, -0.001115857157856226, 0.05118416249752045, -0.059159014374017715, 0.12625445425510406, 0.011507226154208183, 0.05321585014462471, 0.02223772369325161, -0.0575491227209568, 0.008747379295527935, 0.018169691786170006, 0.12281208485364914, -0.012601770460605621, -0.024293290451169014, -0.051767993718385696, 0.08611250668764114, -0.014614373445510864, 0.133294478058815, 0.04556399583816528, 0.13591952621936798, 0.13232563436031342, 0.04835195094347, 0.027233151718974113, -0.09337013959884644, -0.0591517798602581, 0.050128210335969925, -0.02064867690205574, 0.06304538995027542, -0.0362834595143795, 0.11283739656209946, 0.15663158893585205, -0.1704024225473404, 0.10424349457025528, 0.0012210247805342078, -0.08593133091926575, -0.04872917756438255, -0.12893661856651306, -0.04932517930865288, -0.043497826904058456, -0.03392951563000679, -0.11854889988899231, 0.021002408117055893, 0.041075825691223145, 0.05302928388118744, -0.03923660144209862, 0.10804792493581772, 0.008727076463401318, 
-0.0981779396533966, 0.07759757339954376, 0.005299100652337074, 0.08628274500370026, -0.023143788799643517, 0.018794827163219452, 0.04701566696166992, -0.020932625979185104, 0.05832516774535179, 0.054463885724544525, -0.003235952230170369, 0.005258482415229082, 0.004211805295199156, -0.060087647289037704, -0.040756240487098694, 0.014358190819621086, 0.07675550132989883, 0.19615721702575684, 0.04362528771162033, -0.0860767811536789, -0.022543124854564667, 0.18161560595035553, -0.0491553358733654, -0.06886490434408188, -0.09675122797489166, 0.20323976874351501, 0.03913568705320358, 0.029547734186053276, -0.0068407380022108555, -0.09564320743083954, -0.006755450274795294, 0.12090177088975906, 0.18857212364673615, -0.041768722236156464, -0.03377792611718178, 0.028410719707608223, -0.005911290179938078, 0.015516414307057858, 0.027248075231909752, 0.04452473670244217, 0.27273768186569214, -0.0841662809252739, 0.07344400882720947, -0.058099761605262756, 0.015267560258507729, -0.00017730241233948618, 0.1469016969203949, -0.007048776838928461, 0.008302844129502773, -0.05856594443321228, 0.08802000433206558, 0.018166813999414444, -0.15785618126392365, -0.0025264094583690166, -0.12331774830818176, -0.11812033504247665, 0.013725465163588524, -0.044631097465753555, 0.033772654831409454, 0.08874818682670593, -0.000577476981561631, 0.032145727425813675, 0.021756736561655998, 0.019908061251044273, -0.11473678052425385, -0.15126927196979523, 0.019107453525066376, -0.03130039572715759, 0.10843773186206818, -0.006243585608899593, 0.11941315233707428, 0.09970543533563614, 0.024378037080168724, -0.08272227644920349, 0.09188675135374069, 0.03062189184129238, 0.013223595917224884, 0.06601931154727936, 0.09790601581335068, -0.024310937151312828, 0.0506681390106678, 0.0385102778673172, -0.05838117375969887, 0.04990777000784874, -0.07418587058782578, -0.015975261107087135, -0.13523171842098236, 0.07287698984146118, -0.0299289021641016, 0.14667026698589325, 0.17108455300331116, -0.019160892814397812, -0.008474381640553474, -0.05011926218867302, 0.020672522485256195, 0.0047701140865683556, 0.09679020196199417, -0.015040773898363113, -0.2246667891740799, 0.0259001012891531, -0.05769200250506401, 0.034758441150188446, -0.2380180060863495, -0.039925504475831985, 0.024810852482914925, -0.051463641226291656, -0.03545934334397316, 0.07720426470041275, 0.08221333473920822, 0.03149225190281868, -0.03437890112400055, -0.10846886783838272, -0.0005445667775347829, 0.10256611555814743, -0.10922800749540329, -0.1098533496260643 ]
null
null
transformers
# legal_t5_small_multitask_it_de model

Model for translating legal text from Italian to Deutsch. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl, and dcep), covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.


## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_it_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model can be used for translation of legal texts from Italian to Deutsch.

### How to use

Here is how to use this model to translate legal text from Italian to Deutsch in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Build a translation pipeline around the pretrained checkpoint.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_de"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_de",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 for CPU
)

it_text = "di Alyn Smith (Verts/ALE)"

pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_de model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_multitask_it_de | 35.365|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
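For readers who want to see what the `TranslationPipeline` above does under the hood, here is a rough sketch using the tokenizer and `generate()` directly. The greedy decoding settings are an assumption; the card does not document the pipeline's exact generation parameters.

```python
# Hedged sketch: manual translation with tokenizer + generate(), mirroring the
# pipeline call above. Decoding settings beyond max_length are assumptions.
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_it_de")
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_de")

it_text = "di Alyn Smith (Verts/ALE)"
inputs = tokenizer(it_text, return_tensors="pt")

# max_length=512 mirrors the pipeline call; greedy decoding otherwise.
output_ids = model.generate(**inputs, max_length=512)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```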
{"language": "Italian Deustch", "tags": ["translation Italian Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "di Alyn Smith (Verts/ALE)"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_de model
=========================================

Model for translating legal text from Italian to Deutsch. It was first released in this repository. The model is trained in parallel on the three parallel corpora (jrc-acquis, europarl, and dcep), covering 42 language pairs, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_it\_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Italian to Deutsch.

### How to use

Here is how to use this model to translate legal text from Italian to Deutsch in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_de model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
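A BLEU score like the one reported in the table can be computed along the following lines with the `sacrebleu` package; the hypothesis and reference strings below are placeholders, not the actual test set used by the authors.

```python
# Hedged sketch: corpus-level BLEU with sacrebleu. The strings are placeholder
# examples; the real evaluation used the held-out translation test dataset.
import sacrebleu

hypotheses = ["von Alyn Smith (Verts/ALE)"]    # model outputs (placeholder)
references = [["von Alyn Smith (Verts/ALE)"]]  # one reference stream (placeholder)

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```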
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0859031155705452, 0.14723095297813416, -0.0032964961137622595, 0.09582949429750443, 0.07835417240858078, 0.015385596081614494, 0.02765839174389839, 0.12034346163272858, -0.036221738904714584, 0.08541880548000336, 0.04383452236652374, 0.011826399713754654, 0.059423379600048065, 0.04057876393198967, 0.03329566866159439, -0.20114274322986603, 0.002156906994059682, -0.02555009536445141, -0.04560722038149834, 0.09896489232778549, 0.09478069096803665, -0.05719849094748497, 0.05270680785179138, -0.05006302148103714, -0.08652441203594208, 0.04601270332932472, -0.0863969624042511, -0.034981586039066315, 0.11036182940006256, 0.0819263607263565, 0.07849270105361938, -0.015555298887193203, 0.0759238749742508, -0.17622408270835876, -0.00273670325987041, 0.07736050337553024, -0.01416388526558876, 0.032525304704904556, 0.10749491304159164, 0.0005237318109720945, 0.18949632346630096, -0.05108834430575371, 0.023169277235865593, 0.04588569328188896, -0.10538181662559509, -0.10756998509168625, -0.06975317001342773, 0.018802575767040253, 0.08020514249801636, 0.14390721917152405, -0.032334715127944946, 0.030915074050426483, -0.01852821372449398, 0.07011169195175171, 0.100095234811306, -0.22233621776103973, -0.027658669278025627, 0.02349093183875084, 0.04691897705197334, 0.0918009877204895, -0.032855868339538574, 0.014912271872162819, 0.057568710297346115, 0.06487353891134262, 0.04356249421834946, -0.03146856650710106, 0.03552722558379173, -0.020639678463339806, -0.12097315490245819, -0.06184636801481247, 0.15972112119197845, 0.022453904151916504, -0.03543807193636894, -0.09653812646865845, -0.056425515562295914, -0.0682489201426506, 0.0053069922141730785, -0.04715558513998985, 0.007169781252741814, 0.003593763802200556, 0.05834774300456047, -0.04425927251577377, -0.10781040042638779, -0.0762452632188797, -0.06680887937545776, 0.08756667375564575, 0.0452868826687336, 0.0070319706574082375, 0.03139138221740723, 0.08070206642150879, -0.1257023811340332, -0.07685534656047821, 0.009776825085282326, 0.02654409594833851, -0.07267685979604721, 0.01520981453359127, -0.012693467549979687, -0.18183887004852295, 0.007704400923103094, -0.00403106352314353, -0.06486093252897263, 0.011374670080840588, 0.033139556646347046, 0.03292353078722954, 0.04878280311822891, 0.10841649770736694, -0.11839308589696884, -0.13039974868297577, -0.022649690508842468, -0.006760686635971069, 0.001248028944246471, 0.016569500789046288, -0.07680735737085342, -0.04585158824920654, 0.005366767756640911, 0.06658782064914703, 0.016611162573099136, -0.0015559135936200619, -0.014655004255473614, -0.03845430538058281, 0.12400847673416138, -0.10621213167905807, 0.015320422127842903, 0.0025243284180760384, -0.10457403212785721, -0.0019510649144649506, 0.0384586937725544, -0.03742927312850952, -0.1082349345088005, 0.05886517092585564, -0.041967567056417465, -0.024955982342362404, -0.11475976556539536, -0.18200597167015076, 0.011012026108801365, -0.015839511528611183, -0.05192744359374046, -0.10055778920650482, -0.13496550917625427, -0.07119813561439514, 0.015324746258556843, -0.07284311205148697, 0.02635934017598629, -0.09124889969825745, 0.0157284177839756, 0.0242606308311224, -0.007218036334961653, 0.05039600655436516, -0.04531003162264824, 0.04828996583819389, 0.01155533641576767, 0.06254632025957108, -0.012386023998260498, 0.04284598305821419, -0.07325542718172073, 0.04196546599268913, -0.06944303214550018, 0.14629022777080536, -0.0038466479163616896, 0.001695177867077291, -0.13745670020580292, -0.050509966909885406, -0.08988809585571289, 
0.042864054441452026, 0.10093453526496887, 0.15037955343723297, -0.2213626205921173, -0.032306235283613205, 0.2013777196407318, -0.07943889498710632, -0.06430963426828384, 0.13203972578048706, -0.025872064754366875, 0.049964603036642075, 0.08806377649307251, 0.0935235396027565, 0.02698068879544735, -0.04897526279091835, -0.06429122388362885, -0.024008333683013916, 0.009288351982831955, 0.025539729744195938, 0.09547080844640732, -0.06045639514923096, 0.1251383274793625, 0.007705868687480688, 0.036515481770038605, 0.01345021091401577, -0.04532795026898384, -0.03409639745950699, 0.0016346212942153215, -0.052738357335329056, -0.03293571621179581, 0.01339041069149971, 0.020276352763175964, -0.06634417921304703, -0.08983950316905975, 0.016290917992591858, 0.10894353687763214, -0.07736460119485855, 0.027445849031209946, 0.0102501530200243, -0.025101805105805397, -0.10130225867033005, 0.02177671529352665, -0.1451559215784073, 0.001420371001586318, 0.036424774676561356, -0.04124230891466141, 0.0906483456492424, 0.022384528070688248, 0.05790688097476959, 0.08981858938932419, -0.057108763605356216, -0.017128443345427513, -0.03144370764493942, -0.017153488472104073, -0.09666966646909714, -0.10979130119085312, -0.035989634692668915, -0.011399945244193077, 0.015674730762839317, -0.15214231610298157, 0.021765083074569702, -0.020879382267594337, 0.0978630930185318, -0.0010174870258197188, -0.027435926720499992, -0.0063103376887738705, 0.051568157970905304, -0.043356042355298996, -0.04102626442909241, 0.022309811785817146, -0.014118794351816177, -0.027404682710766792, 0.07942019402980804, -0.11818251013755798, -0.07881414890289307, 0.10429832339286804, 0.022750040516257286, -0.09529103338718414, -0.00522635318338871, -0.0038974464405328035, -0.05836142599582672, -0.04344885051250458, -0.08706103265285492, 0.1829530894756317, 0.0464145764708519, 0.1443953961133957, -0.10583095997571945, -0.03755618631839752, 0.02493656799197197, -0.0034961539786309004, -0.03390388563275337, 0.14257706701755524, 0.05548740178346634, -0.17356367409229279, 0.10051600635051727, 0.03947160765528679, -0.022035889327526093, 0.12719520926475525, 0.07030001282691956, -0.12028137594461441, -0.009672092273831367, 0.04504269361495972, 0.0065864198841154575, 0.03894214332103729, -0.10135728120803833, -0.01982160098850727, 0.016323769465088844, 0.07341185212135315, 0.0789261981844902, -0.09365776926279068, 0.06310305744409561, 0.06468132883310318, -0.03357069194316864, 0.05539202317595482, -0.045403555035591125, -0.06387023627758026, 0.11746124923229218, 0.030943384394049644, -0.04430398717522621, -0.03768429160118103, -0.0345270037651062, -0.09485296159982681, 0.18274453282356262, -0.09350980818271637, -0.23587647080421448, -0.12929530441761017, 0.01920335926115513, -0.05258017033338547, 0.029645804315805435, 0.034284807741642, -0.048379089683294296, -0.038702476769685745, -0.08815356343984604, 0.09531573206186295, -0.11305228620767593, -0.046946313232183456, -0.08914900571107864, 0.057249344885349274, -0.02998771145939827, -0.14103728532791138, 0.0183094572275877, 0.004151364788413048, -0.04685667157173157, -0.013102550990879536, -0.06505440920591354, 0.12900647521018982, 0.1518976092338562, -0.028066793456673622, -0.017577853053808212, 0.008477539755403996, 0.10630590468645096, -0.08214255422353745, 0.05068834125995636, 0.07238078862428665, 0.05519052967429161, 0.014718076214194298, 0.1228506788611412, 0.04001013934612274, -0.06434381008148193, 0.03710583224892616, 0.06277049332857132, -0.022862344980239868, -0.2573408782482147, 
-0.09766947478055954, -0.06393928825855255, -0.01752672716975212, 0.08580634742975235, 0.049842625856399536, -0.024882705882191658, 0.005591109860688448, -0.045163027942180634, 0.01437290571630001, 0.019730839878320694, 0.05846565589308739, 0.0596240870654583, -0.02268778160214424, 0.0820692926645279, -0.05913311615586281, -0.044291459023952484, 0.10475071519613266, 0.06628779321908951, 0.19273249804973602, -0.032577402889728546, 0.23659376800060272, 0.04450077563524246, 0.025323649868369102, -0.004866060335189104, 0.07688213139772415, -0.026389477774500847, 0.019748354330658913, -0.028273755684494972, -0.06390058249235153, 0.006086735986173153, 0.07027295976877213, 0.000643333769403398, -0.0061576212756335735, -0.05910074710845947, -0.05365105718374252, 0.07694614678621292, 0.22354023158550262, 0.0699610486626625, -0.20274727046489716, -0.05524621531367302, -0.006962448824197054, -0.061177343130111694, -0.0810602530837059, 0.00619851890951395, 0.13656722009181976, -0.07227807492017746, -0.0130086624994874, 0.03199579194188118, 0.131949320435524, -0.13539588451385498, -0.02322629652917385, 0.02140073850750923, 0.03275693207979202, -0.02758636698126793, 0.11723047494888306, -0.2502760589122772, 0.17134861648082733, 0.028071854263544083, 0.07493364065885544, -0.052147313952445984, 0.01601278968155384, -0.0404910184442997, 0.003200524253770709, 0.11178795248270035, 0.018391864374279976, -0.028380725532770157, -0.10880590230226517, -0.1069372221827507, -0.028187749907374382, 0.0824112668633461, -0.05356950685381889, 0.08530133962631226, 0.05680263787508011, -0.001323805539868772, -0.004430230241268873, 0.07783359289169312, -0.02009204402565956, -0.16945475339889526, 0.005360992159694433, -0.001990697579458356, -0.04296620190143585, -0.006707743741571903, -0.05548648536205292, -0.028443267568945885, 0.23268070816993713, -0.10529360920190811, -0.07043366134166718, -0.08348830789327621, 0.0400647334754467, 0.11189192533493042, -0.07253940403461456, 0.0339774414896965, 0.011510302312672138, 0.04129848629236221, -0.054227717220783234, -0.03521736338734627, 0.09807491302490234, -0.0563676580786705, -0.05824900045990944, -0.06641445308923721, 0.13751007616519928, 0.05799064040184021, 0.04039964824914932, -0.014454443007707596, 0.03884775936603546, 0.008061139844357967, -0.08859694749116898, -0.006218819413334131, 0.04482447728514671, 0.12790998816490173, 0.06001667678356171, -0.07767006009817123, -0.07192875444889069, -0.07449972629547119, -0.08035168796777725, 0.1436893492937088, 0.16447211802005768, -0.05567474290728569, 0.004052492789924145, 0.16243357956409454, -0.11534161865711212, -0.17424818873405457, -0.02070736140012741, 0.06916532665491104, 0.07396163046360016, -0.043938614428043365, -0.18654321134090424, 0.0034679980017244816, 0.14454594254493713, -0.002194286324083805, 0.08103301376104355, -0.3827279508113861, -0.13338273763656616, 0.017059791833162308, 0.03683860972523689, 0.0099747059866786, -0.12119046598672867, -0.039249081164598465, -0.07181096076965332, -0.09444833546876907, 0.08392056077718735, -0.021050000563263893, 0.09226194024085999, -0.0030699078924953938, 0.007415855769068003, 0.041356753557920456, -0.03792165219783783, 0.12679877877235413, 0.022810880094766617, 0.03490167856216431, -0.05044063925743103, 0.03688818961381912, 0.013731828890740871, -0.012912739999592304, 0.15305110812187195, -0.04161658138036728, 0.040217746049165726, -0.1366916447877884, -0.07016241550445557, -0.05628636106848717, 0.023665031418204308, -0.03809771686792374, -0.0753650888800621, 
-0.042505282908678055, 0.031911641359329224, 0.033696528524160385, -0.004977966658771038, -0.0027440597768872976, -0.06382175534963608, 0.03288896009325981, 0.17295420169830322, 0.05510512366890907, 0.026048418134450912, -0.08800175040960312, 0.009421239607036114, 0.00008034746133489534, 0.05871989205479622, -0.11901580542325974, 0.008538458496332169, 0.1495441198348999, 0.02609163708984852, 0.12016073614358902, -0.0074448916129767895, -0.1363554447889328, 0.007027346175163984, 0.07015200704336166, -0.07757481187582016, -0.12028322368860245, -0.018765024840831757, 0.028955643996596336, -0.05117146298289299, 0.013306033797562122, 0.1160273626446724, -0.08041510730981827, -0.02839314006268978, -0.014956213533878326, 0.03614730015397072, -0.07231538742780685, 0.2303861528635025, 0.029352642595767975, 0.03654734045267105, -0.05426091328263283, 0.12711969017982483, 0.12079250067472458, -0.1305464655160904, 0.03603469580411911, 0.1992901712656021, -0.06292175501585007, -0.053728483617305756, 0.007998108863830566, 0.10352300107479095, -0.06078696623444557, -0.06472273916006088, -0.039890825748443604, -0.026555348187685013, 0.026451652869582176, -0.004479933530092239, 0.03918331861495972, 0.04678584635257721, -0.021903416141867638, -0.03851962834596634, -0.10446326434612274, 0.08723948895931244, 0.07013633847236633, 0.0077935499139130116, -0.019923141226172447, 0.08774960041046143, 0.0167499091476202, -0.018431290984153748, -0.012496012263000011, -0.004498620051890612, -0.04723323509097099, 0.018313899636268616, -0.08433587104082108, -0.016136808320879936, -0.052988193929195404, -0.017774686217308044, -0.043133631348609924, 0.005180698353797197, -0.02099410630762577, 0.004093929659575224, -0.04809384047985077, -0.05777518078684807, -0.05488094687461853, 0.023851368576288223, -0.09695594012737274, -0.03657945990562439, -0.006643292959779501, -0.0344960018992424, 0.06909219920635223, 0.03550567477941513, 0.010873649269342422, 0.019663792103528976, -0.007375219836831093, 0.061345528811216354, 0.008226264268159866, 0.051681529730558395, 0.008663127198815346, -0.05977320298552513, 0.029835743829607964, 0.04167478531599045, -0.017069371417164803, -0.009334789589047432, 0.0026858255732804537, -0.12444479018449783, -0.032533030956983566, -0.05481398478150368, -0.03325102478265762, -0.06081337109208107, 0.10240674763917923, 0.07698920369148254, 0.07351984083652496, 0.06403566896915436, -0.0570482462644577, 0.07260845601558685, -0.15678834915161133, -0.004017375409603119, 0.015678660944104195, -0.032725654542446136, -0.008292865939438343, 0.000770274898968637, 0.053548671305179596, -0.05909173563122749, 0.13373412191867828, 0.015834596008062363, 0.05796017125248909, 0.023293988779187202, -0.07590358704328537, 0.0014357335167005658, 0.020490892231464386, 0.12246271222829819, -0.018422476947307587, -0.022987278178334236, -0.07789994031190872, 0.08800871670246124, -0.005726627539843321, 0.14562951028347015, 0.036100391298532486, 0.15305212140083313, 0.12697017192840576, 0.04490441828966141, 0.029381992295384407, -0.09684300422668457, -0.07450989633798599, 0.0492628775537014, -0.005889873951673508, 0.0566805936396122, -0.027794282883405685, 0.09564851224422455, 0.15606294572353363, -0.16376768052577972, 0.10260455310344696, -0.002966804662719369, -0.08311045169830322, -0.04905261471867561, -0.11586113274097443, -0.04795756936073303, -0.04718754068017006, -0.036322370171546936, -0.11777103692293167, 0.018437974154949188, 0.023508938029408455, 0.05531638115644455, -0.0477483756840229, 0.11051391065120697, 
0.012158301658928394, -0.1027897447347641, 0.0732291042804718, 0.007931643165647984, 0.082611583173275, -0.00648707477375865, 0.020375696942210197, 0.04332001879811287, -0.010072882287204266, 0.0505310483276844, 0.0508437305688858, -0.010618132539093494, 0.005150817800313234, 0.0035974427592009306, -0.04946187138557434, -0.043070416897535324, 0.0229958388954401, 0.058528169989585876, 0.20814178884029388, 0.0480065643787384, -0.09072105586528778, -0.01496926974505186, 0.16495366394519806, -0.045037005096673965, -0.07594668865203857, -0.09535501897335052, 0.2165522575378418, 0.05305325984954834, 0.043990690261125565, -0.00979109201580286, -0.10449613630771637, -0.011652222834527493, 0.10550635308027267, 0.19951823353767395, -0.03980236127972603, -0.04011885076761246, 0.029531540349125862, -0.0046935901045799255, 0.014828970655798912, 0.02330583706498146, 0.05550587177276611, 0.27250123023986816, -0.07837200164794922, 0.07333377003669739, -0.05787058547139168, 0.025005299597978592, 0.0011571116046980023, 0.14943848550319672, -0.0011098496615886688, 0.011436334811151028, -0.05533412843942642, 0.09400643408298492, 0.025985484942793846, -0.16830937564373016, -0.0007854219293221831, -0.12035208195447922, -0.11787359416484833, 0.013777214102447033, -0.05396070331335068, 0.028127819299697876, 0.0877288356423378, 0.0032760228496044874, 0.036341387778520584, 0.034860942512750626, 0.015528429299592972, -0.11453402787446976, -0.14151673018932343, 0.012908698990941048, -0.006554435007274151, 0.09545871615409851, -0.007792883086949587, 0.11616943776607513, 0.10326618701219559, 0.03343245014548302, -0.08579623699188232, 0.09852877259254456, 0.02636764757335186, 0.030354220420122147, 0.06515569239854813, 0.08214263617992401, -0.026249125599861145, 0.037271928042173386, 0.040997330099344254, -0.05699474737048149, 0.042343974113464355, -0.06549148261547089, -0.028218036517500877, -0.13934233784675598, 0.08302117884159088, -0.03224257379770279, 0.1413886547088623, 0.16834446787834167, -0.01466869842261076, -0.003238089382648468, -0.049901675432920456, 0.024181099608540535, 0.012067296542227268, 0.10244680196046829, -0.01568419113755226, -0.2181059867143631, 0.027990546077489853, -0.05132865533232689, 0.036187052726745605, -0.2409532219171524, -0.040125902742147446, 0.028381869196891785, -0.04830809310078621, -0.03249809518456459, 0.07477883249521255, 0.07622335851192474, 0.032243672758340836, -0.03951922431588173, -0.10986881703138351, -0.011129088699817657, 0.10413292795419693, -0.10472894459962845, -0.11266729980707169 ]
null
null
transformers
# legal_t5_small_multitask_it_en model

Model for translating legal text from Italian to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora, covering 42 language pairs, from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved for the legal_t5_small_multitask_it_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Italian to English.

### How to use

Here is how to use this model to translate legal text from Italian to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Load the model and its SentencePiece tokenizer into a translation pipeline.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_en"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_en", do_lower_case=False, skip_special_tokens=True),
    device=0,  # GPU device index; use device=-1 to run on CPU
)

it_text = "Con l’adesione all'area dell'euro questo procedimento non è stato più possibile."

# max_length caps the length of the generated translation.
pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_en model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs were available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining was performed.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_it_en | 36.687 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
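Since the card reports a corpus BLEU score on a held-out test set, a score of this kind can be recomputed along the following lines. This is a minimal sketch, not the authors' evaluation script: it assumes `sacrebleu` is installed, and the file names `test.it` / `test.en` are hypothetical placeholders for the (not distributed) test split.

```python
import sacrebleu
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Hypothetical file names; one Italian source and one English reference per line.
with open("test.it") as f_src, open("test.en") as f_ref:
    sources = [line.strip() for line in f_src]
    references = [line.strip() for line in f_ref]

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_en"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_it_en"),
    device=0,
)

# Translate every source sentence and collect the hypothesis strings.
hypotheses = [out["translation_text"] for out in pipeline(sources, max_length=512)]

# corpus_bleu expects a list of reference streams, hence the extra list nesting.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU: {bleu.score:.3f}")
```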
{"language": "Italian English", "tags": ["translation Italian English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Con l\u2019adesione all'area dell'euro questo procedimento non \u00e8 stato pi\u00f9 possibile."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_en model
=========================================

Model for translating legal text from Italian to English. It was first released in this repository. The model is trained in parallel on three parallel corpora, covering 42 language pairs, from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_it\_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Italian to English.

### How to use

Here is how to use this model to translate legal text from Italian to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_en model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs were available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining was performed.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07610855251550674, 0.14855924248695374, -0.003920515533536673, 0.08129172772169113, 0.06299981474876404, 0.017811760306358337, 0.038037028163671494, 0.11213576793670654, -0.06711895763874054, 0.07285373657941818, 0.037315092980861664, 0.008158531039953232, 0.07416731119155884, 0.03458933159708977, 0.031096436083316803, -0.1852429211139679, -0.00006438335549319163, -0.023525042459368706, -0.025809699669480324, 0.11011289060115814, 0.09234169125556946, -0.06944665312767029, 0.04087838903069496, -0.029330158606171608, -0.07847592234611511, 0.034779783338308334, -0.09476146101951599, -0.03751331567764282, 0.10494275391101837, 0.07742707431316376, 0.0964071974158287, 0.0009637693292461336, 0.080499067902565, -0.18460193276405334, -0.009616265073418617, 0.08027436584234238, -0.0033001985866576433, 0.039667919278144836, 0.1172693520784378, 0.011955510824918747, 0.16577687859535217, -0.03305625170469284, 0.02431250736117363, 0.04318447783589363, -0.10437579452991486, -0.12878544628620148, -0.04910871759057045, 0.003333644475787878, 0.09376411885023117, 0.1381317377090454, -0.041140563786029816, 0.06244459003210068, -0.014683917164802551, 0.07675675302743912, 0.06923293322324753, -0.223297119140625, -0.03517426177859306, -0.005144017282873392, 0.045426271855831146, 0.10334658622741699, -0.036233656108379364, -0.007560281548649073, 0.06609737873077393, 0.054791830480098724, 0.041520506143569946, -0.03565644100308418, -0.01772499270737171, -0.026156466454267502, -0.13484443724155426, -0.055954255163669586, 0.17624308168888092, 0.014452154748141766, -0.040435194969177246, -0.09517442435026169, -0.06259868294000626, -0.05438034608960152, 0.009610642679035664, -0.04089590534567833, 0.0031681538093835115, 0.008153654634952545, 0.056650321930646896, -0.037030816078186035, -0.12229114770889282, -0.06791878491640091, -0.052073512226343155, 0.10035071521997452, 0.06610579043626785, 0.013804343529045582, 0.027045482769608498, 0.08668126165866852, -0.1058645099401474, -0.07684512436389923, 0.0257546566426754, 0.011915740557014942, -0.08412671089172363, 0.007829563692212105, 0.0007811809191480279, -0.20898938179016113, -0.011260565370321274, -0.035595592111349106, -0.06579043716192245, 0.028203358873724937, 0.050250083208084106, 0.04732456058263779, 0.05690152198076248, 0.11202831566333771, -0.09589772671461105, -0.10172829777002335, -0.04084973782300949, 0.0004292600497137755, -0.021244363859295845, 0.021436166018247604, -0.06602849066257477, -0.03258197382092476, -0.02127496898174286, 0.04412273317575455, 0.014303057454526424, 0.00002453401430102531, -0.03481641784310341, -0.023021150380373, 0.09127842634916306, -0.09868760406970978, 0.023662032559514046, 0.013992019928991795, -0.10537943243980408, -0.02388228103518486, 0.04558767005801201, -0.022313473746180534, -0.12061882764101028, 0.056074973195791245, -0.0308483075350523, -0.018685301765799522, -0.12095071375370026, -0.18216575682163239, 0.002978142350912094, -0.013690846040844917, -0.06032690033316612, -0.0888531506061554, -0.10475680977106094, -0.07570411264896393, 0.027400122955441475, -0.06184816360473633, 0.017088785767555237, -0.09772190451622009, 0.01780700869858265, 0.018885690718889236, -0.009674493223428726, 0.07151725143194199, -0.04401501268148422, 0.05473289266228676, 0.019511165097355843, 0.06550279259681702, 0.01042884960770607, 0.03446109592914581, -0.1028621643781662, 0.029523923993110657, -0.06021938845515251, 0.1335417926311493, -0.004656773526221514, -0.006615318823605776, -0.13830670714378357, -0.05837199091911316, 
-0.08727054297924042, 0.03219250589609146, 0.09339454025030136, 0.14374229311943054, -0.21771250665187836, -0.020769597962498665, 0.21298666298389435, -0.07514234632253647, -0.07561996579170227, 0.1438826024532318, -0.028576776385307312, 0.06269270181655884, 0.06850114464759827, 0.0695788562297821, 0.042185306549072266, -0.047970741987228394, -0.06240646168589592, -0.008650170639157295, 0.021850137040019035, 0.04238862544298172, 0.09245985001325607, -0.05831453949213028, 0.1001686230301857, -0.002383363898843527, 0.02691713534295559, 0.022769296541810036, -0.05102359130978584, -0.04303886368870735, -0.000650610716547817, -0.05303269624710083, -0.007106903940439224, 0.03018026612699032, 0.017697568982839584, -0.07253319770097733, -0.09757689386606216, -0.04399506002664566, 0.0988859087228775, -0.07577662169933319, 0.03236092999577522, 0.004503590986132622, -0.03788159415125847, -0.09764699637889862, 0.009710289537906647, -0.14753316342830658, 0.0007194393547251821, 0.04928586259484291, -0.02941828966140747, 0.08796944469213486, 0.03517024591565132, 0.0581262931227684, 0.10218451172113419, -0.046475231647491455, -0.03916076943278313, -0.010601069778203964, -0.023196328431367874, -0.1000726968050003, -0.1297142654657364, -0.03064325824379921, -0.010909663513302803, 0.01879696547985077, -0.14856211841106415, 0.022296104580163956, -0.038701243698596954, 0.10490041971206665, 0.0010371829848736525, -0.028881952166557312, -0.0017206799238920212, 0.07163186371326447, -0.05080563202500343, -0.03177737444639206, 0.03038777969777584, -0.015593226067721844, -0.04605071619153023, 0.09620156139135361, -0.09521608799695969, -0.11543691903352737, 0.08791971951723099, -0.01224686298519373, -0.09173823893070221, 0.0007455943268723786, -0.014070305041968822, -0.06651364266872406, -0.05777912586927414, -0.07003767043352127, 0.23624253273010254, 0.04338271543383598, 0.14795520901679993, -0.11488659679889679, -0.03987389802932739, 0.023816337808966637, -0.02788127399981022, -0.026021864265203476, 0.1540648192167282, 0.07170349359512329, -0.16658689081668854, 0.09265074878931046, 0.05321619659662247, -0.022912070155143738, 0.10947026312351227, 0.058423593640327454, -0.11824765801429749, 0.007885380648076534, 0.06915809214115143, -0.0033139558508992195, 0.03235597908496857, -0.10809788107872009, -0.011701846495270729, 0.021873222663998604, 0.06234744191169739, 0.0731063112616539, -0.1176183670759201, 0.07095123827457428, 0.06967829912900925, -0.030384855344891548, 0.03479888290166855, -0.06096139922738075, -0.04848216101527214, 0.10764617472887039, 0.004677760414779186, -0.05346624553203583, -0.03360002487897873, -0.0372900553047657, -0.1023721843957901, 0.1763116866350174, -0.09017549455165863, -0.22441884875297546, -0.13041047751903534, 0.006171424873173237, -0.029648933559656143, 0.020996272563934326, 0.03403591364622116, -0.05100204423069954, -0.047153763473033905, -0.08239956945180893, 0.06942636519670486, -0.11674187332391739, -0.059478357434272766, -0.09715047478675842, 0.06208598613739014, -0.015507232397794724, -0.13839523494243622, 0.03464268147945404, 0.015079906210303307, -0.0375385545194149, -0.005034166853874922, -0.0503687709569931, 0.1288001984357834, 0.13950258493423462, -0.040007248520851135, -0.02770749293267727, 0.0028659694362431765, 0.1244061291217804, -0.08209770917892456, 0.041092921048402786, 0.058893267065286636, 0.028992991894483566, 0.03926113247871399, 0.12946704030036926, 0.03710731863975525, -0.05085828900337219, 0.03190217912197113, 0.04606157913804054, -0.0048885224387049675, 
-0.26435914635658264, -0.09372295439243317, -0.06367828696966171, -0.02420075424015522, 0.08245494216680527, 0.041884347796440125, -0.06318547576665878, 0.012348335236310959, -0.0507548563182354, 0.014251846820116043, 0.012957865372300148, 0.05451342836022377, 0.02128608338534832, -0.020161714404821396, 0.07480791956186295, -0.05602778121829033, -0.06216821447014809, 0.09570997953414917, 0.0450354628264904, 0.21223177015781403, -0.04913969337940216, 0.23271195590496063, 0.050155870616436005, 0.07107160240411758, -0.0015682231169193983, 0.07176509499549866, -0.03708062693476677, 0.030544059351086617, -0.024490157142281532, -0.05934113264083862, 0.0027784365229308605, 0.07134830951690674, 0.01373837236315012, 0.006116210483014584, -0.05241695046424866, -0.03994254767894745, 0.07322881370782852, 0.21097998321056366, 0.07949476689100266, -0.20614710450172424, -0.03674544021487236, -0.0021211490966379642, -0.06542149186134338, -0.08521000295877457, 0.009507983922958374, 0.16522523760795593, -0.06828158348798752, -0.017847124487161636, 0.02452673390507698, 0.1317281275987625, -0.13451562821865082, -0.027137260884046555, 0.01946903020143509, 0.029575543478131294, -0.021002009510993958, 0.1346539705991745, -0.23771747946739197, 0.1827487051486969, 0.018715569749474525, 0.07894451916217804, -0.0524323433637619, 0.030271437019109726, -0.05847078561782837, 0.012458084151148796, 0.11278067529201508, 0.016684159636497498, -0.029123280197381973, -0.1172981783747673, -0.10977695137262344, -0.021603059023618698, 0.09079936891794205, -0.03443325683474541, 0.08884783834218979, 0.061924099922180176, 0.01051324512809515, -0.0030147309880703688, 0.04706438630819321, -0.0358259379863739, -0.16593395173549652, 0.014270275831222534, 0.013121150434017181, -0.04212679713964462, -0.009293348528444767, -0.049466270953416824, -0.05877242982387543, 0.2200421541929245, -0.125981405377388, -0.06855526566505432, -0.06972046941518784, 0.01642611436545849, 0.12675680220127106, -0.0664408802986145, 0.011664480902254581, 0.019046658650040627, 0.030345100909471512, -0.03959795832633972, -0.011452112346887589, 0.08627321571111679, -0.06212914362549782, -0.06617629528045654, -0.08441342413425446, 0.1155199259519577, 0.06107046455144882, 0.041839808225631714, -0.01738293468952179, 0.02654334157705307, -0.009071500040590763, -0.08290218561887741, 0.0012856421526521444, 0.023149048909544945, 0.15360882878303528, 0.05064178630709648, -0.07636755704879761, -0.07588880509138107, -0.08576034754514694, -0.07828052341938019, 0.13761359453201294, 0.1718231439590454, -0.04957307130098343, 0.0138440802693367, 0.18309682607650757, -0.12510041892528534, -0.15649105608463287, -0.03614308685064316, 0.0841621533036232, 0.08494158834218979, -0.03665376082062721, -0.18215589225292206, 0.003835704643279314, 0.1177116334438324, 0.000008235294444602914, 0.04788518324494362, -0.4098983407020569, -0.13228468596935272, 0.012677495367825031, 0.0465070977807045, 0.0015435076784342527, -0.10838862508535385, -0.038021791726350784, -0.044014327228069305, -0.10426487773656845, 0.05944245308637619, -0.005434829741716385, 0.09165642410516739, 0.008815139532089233, 0.0029974239878356457, 0.04669344052672386, -0.03966347128152847, 0.12391332536935806, 0.0047326101921498775, 0.027810918167233467, -0.04934588074684143, 0.048953235149383545, 0.018140695989131927, -0.0020062951371073723, 0.14151912927627563, -0.04543429985642433, 0.04602457955479622, -0.16054685413837433, -0.04973899945616722, -0.04834428057074547, 0.01238292083144188, -0.03806065022945404, 
-0.06912904977798462, -0.03711243346333504, 0.029783103615045547, 0.05096091330051422, 0.001977618783712387, 0.0011271650437265635, -0.047293201088905334, 0.02951766550540924, 0.17233282327651978, 0.0893537625670433, 0.030108727514743805, -0.0911332443356514, 0.007287757005542517, 0.008837777189910412, 0.058573342859745026, -0.11454130709171295, 0.01172971073538065, 0.15258339047431946, 0.018869170919060707, 0.118022121489048, -0.006201642565429211, -0.13638225197792053, 0.014580695889890194, 0.07420895248651505, -0.09227848798036575, -0.14203114807605743, -0.0216000284999609, 0.03385601565241814, -0.057874709367752075, 0.005597516428679228, 0.091203972697258, -0.08542221784591675, -0.027555590495467186, -0.01877111755311489, 0.036080434918403625, -0.06125001981854439, 0.2261398583650589, 0.007507525850087404, 0.04276794195175171, -0.05128324404358864, 0.12228871136903763, 0.1500963419675827, -0.14232705533504486, 0.02418856881558895, 0.1859121322631836, -0.06172845885157585, -0.04439090937376022, 0.044210340827703476, 0.10901740193367004, -0.003748751012608409, -0.0698283314704895, -0.03298027440905571, -0.023013915866613388, 0.011125057004392147, -0.016025494784116745, 0.029233772307634354, 0.040832456201314926, -0.017067978158593178, -0.040455322712659836, -0.11777269840240479, 0.09496299922466278, 0.0886063501238823, 0.01430721115320921, -0.025796841830015182, 0.11921363323926926, 0.016640083864331245, -0.026687076315283775, -0.009215827099978924, 0.006251753307878971, -0.04026629030704498, 0.027834858745336533, -0.06048949435353279, -0.012150063179433346, -0.04246639460325241, -0.019478129222989082, -0.035842109471559525, -0.006275391671806574, -0.016188420355319977, 0.007350982166826725, -0.05154624581336975, -0.04111860692501068, -0.036708202213048935, 0.03318661451339722, -0.09036368876695633, -0.036604247987270355, 0.010688889771699905, -0.036263033747673035, 0.06638307124376297, 0.02236713282763958, -0.008687997236847878, 0.007671181112527847, -0.014899124391376972, 0.06299158185720444, 0.009167691692709923, 0.05317294970154762, 0.008102189749479294, -0.07312360405921936, 0.03116241656243801, 0.03936620429158211, -0.024120816960930824, -0.014680810272693634, 0.01846666820347309, -0.12254837155342102, -0.026080390438437462, -0.03532611206173897, -0.04866406321525574, -0.06397965550422668, 0.10445357859134674, 0.06833307445049286, 0.07187116146087646, 0.09442303329706192, -0.05373838543891907, 0.07410331815481186, -0.14645139873027802, -0.00047319981968030334, 0.03602011874318123, -0.038314271718263626, -0.008358311839401722, -0.0129378791898489, 0.04875938221812248, -0.07647686451673508, 0.12139369547367096, 0.010073237121105194, 0.06273898482322693, 0.009978757239878178, -0.08416344970464706, -0.016032801941037178, 0.01936391182243824, 0.1053403913974762, -0.029181255027651787, -0.0239937212318182, -0.07897674292325974, 0.07964307814836502, -0.0050201416015625, 0.12652699649333954, 0.02591613121330738, 0.13425466418266296, 0.144203782081604, 0.049307938665151596, 0.013432158157229424, -0.09913769364356995, -0.07564939558506012, 0.07982749491930008, -0.011488898657262325, 0.06111302226781845, -0.03585335612297058, 0.13360778987407684, 0.1395527571439743, -0.1458980292081833, 0.0959123969078064, 0.00030039457487873733, -0.09388820081949234, -0.056317854672670364, -0.10699202120304108, -0.04474743455648422, -0.04435611516237259, -0.029531138017773628, -0.11935672163963318, 0.022382671013474464, 0.0026847939006984234, 0.04796598106622696, -0.04673442617058754, 0.1064295768737793, 
0.025152767077088356, -0.10402233898639679, 0.07442568242549896, 0.001150402589701116, 0.11037549376487732, -0.021315928548574448, 0.009700017049908638, 0.05012710019946098, -0.0165216326713562, 0.054729919880628586, 0.05094042420387268, -0.013363875448703766, 0.009604508057236671, 0.005452802870422602, -0.055136796087026596, -0.045901477336883545, 0.018888460472226143, 0.07002566009759903, 0.18780101835727692, 0.05234925076365471, -0.058128513395786285, -0.029617322608828545, 0.1698281168937683, -0.05333081632852554, -0.06676208972930908, -0.10333423316478729, 0.19654230773448944, 0.03981267288327217, 0.02916826121509075, 0.021815568208694458, -0.10557276755571365, -0.008742918260395527, 0.131902277469635, 0.17693637311458588, -0.035055212676525116, -0.042576007544994354, 0.03486311808228493, -0.007368700113147497, 0.009395723231136799, 0.034164909273386, 0.05388811230659485, 0.253551721572876, -0.0879325270652771, 0.06884611397981644, -0.06108342483639717, 0.04226771742105484, -0.007746429182589054, 0.15091420710086823, -0.003226430853828788, 0.006534339394420385, -0.06282939016819, 0.09136549383401871, 0.02863847278058529, -0.14330297708511353, -0.0074234940111637115, -0.10666801035404205, -0.11441917717456818, 0.023857316002249718, 0.005795791745185852, 0.025088757276535034, 0.08310916274785995, -0.0000344584695994854, 0.027578633278608322, 0.057657431811094284, 0.012236432172358036, -0.09311695396900177, -0.11928832530975342, 0.003041208256036043, -0.0799567922949791, 0.11258462071418762, 0.006081886123865843, 0.14757680892944336, 0.09970961511135101, 0.024793757125735283, -0.08789382874965668, 0.09380532801151276, 0.024688292294740677, 0.031332530081272125, 0.06829512119293213, 0.09336724132299423, -0.018849210813641548, 0.07358286529779434, 0.04056769609451294, -0.04762778803706169, 0.060465361922979355, -0.07621046900749207, -0.016564426943659782, -0.12219977378845215, 0.08450587093830109, -0.029877787455916405, 0.13813984394073486, 0.17849311232566833, -0.013105192221701145, -0.0062959627248346806, -0.047768063843250275, 0.013729196041822433, -0.004504285287111998, 0.10822685807943344, -0.016884513199329376, -0.21797387301921844, 0.022481143474578857, -0.030444636940956116, 0.05136298015713692, -0.22164270281791687, -0.02758292853832245, 0.029612228274345398, -0.05439775809645653, -0.041606493294239044, 0.07323011010885239, 0.09032103419303894, 0.02756834216415882, -0.03230700269341469, -0.12229271233081818, -0.006487022154033184, 0.10310064256191254, -0.09292473644018173, -0.10192009061574936 ]
null
null
transformers
# legal_t5_small_multitask_it_es model

Model for translating legal text from Italian to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora, covering 42 language pairs, from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved for the legal_t5_small_multitask_it_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Italian to Spanish.

### How to use

Here is how to use this model to translate legal text from Italian to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Load the model and its SentencePiece tokenizer into a translation pipeline.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_es"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_es", do_lower_case=False, skip_special_tokens=True),
    device=0,  # GPU device index; use device=-1 to run on CPU
)

it_text = "Interrogazione con richiesta di risposta scritta E-005808/2011"

# max_length caps the length of the generated translation.
pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_es model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs were available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining was performed.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_it_es | 36.980 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
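For cases where the pipeline wrapper is not wanted, the same translation can be run by calling the tokenizer and model directly; since this is a T5-style encoder-decoder, the standard `generate` API applies. A minimal sketch follows; the beam size and length settings are illustrative assumptions, not the values used for the reported score.

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_multitask_it_es")
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_es")

it_text = "Interrogazione con richiesta di risposta scritta E-005808/2011"

# Encode the Italian source, generate with beam search, and decode the Spanish output.
inputs = tokenizer(it_text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=512, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```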
{"language": "Italian Spanish", "tags": ["translation Italian Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Interrogazione con richiesta di risposta scritta E-005808/2011"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_es model
=========================================

Model for translating legal text from Italian to Spanish. It was first released in this repository. The model is trained in parallel on three parallel corpora, covering 42 language pairs, from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_it\_es model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Italian to Spanish.

### How to use

Here is how to use this model to translate legal text from Italian to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_es model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs were available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to obtain the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

No separate pretraining was performed.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08201731741428375, 0.14985337853431702, -0.004203430842608213, 0.0834767147898674, 0.06519290804862976, 0.014364570379257202, 0.028659574687480927, 0.10987593233585358, -0.07128090411424637, 0.0761997178196907, 0.033914584666490555, 0.02044278383255005, 0.07062136381864548, 0.02561907283961773, 0.02388429082930088, -0.18412330746650696, 0.000950874004047364, -0.0249197818338871, -0.028472254052758217, 0.10574927181005478, 0.08621695637702942, -0.06426822394132614, 0.03745783865451813, -0.03334780037403107, -0.07596410065889359, 0.039749402552843094, -0.08855526894330978, -0.050593387335538864, 0.10614102333784103, 0.0775897428393364, 0.10057603567838669, 0.006111423950642347, 0.07632096111774445, -0.17328353226184845, -0.011858080513775349, 0.07613667100667953, -0.006882465444505215, 0.04244057834148407, 0.12402426451444626, 0.005336571019142866, 0.168328195810318, -0.03984470292925835, 0.025902034714818, 0.04491274803876877, -0.11700563877820969, -0.13501083850860596, -0.0534813366830349, -0.01102630328387022, 0.09404481947422028, 0.13339997828006744, -0.037619076669216156, 0.05852670967578888, -0.01392701268196106, 0.06661169230937958, 0.06834830343723297, -0.21915458142757416, -0.037316739559173584, -0.00447605038061738, 0.04731285199522972, 0.10563334077596664, -0.026841389015316963, -0.0028620560187846422, 0.07246644049882889, 0.05068976432085037, 0.04097903147339821, -0.04480357840657234, -0.02878982201218605, -0.02533339336514473, -0.13145191967487335, -0.06309923529624939, 0.16715484857559204, 0.008370584808290005, -0.03705178573727608, -0.09450682997703552, -0.06848397105932236, -0.049552641808986664, 0.012233860790729523, -0.04082312807440758, 0.008409513160586357, 0.0022110154386609793, 0.059480082243680954, -0.031217895448207855, -0.11886265873908997, -0.0674113780260086, -0.05382697656750679, 0.09524410218000412, 0.06043587997555733, 0.00844494253396988, 0.0338723286986351, 0.08914268016815186, -0.09491777420043945, -0.08164037019014359, 0.029703345149755478, 0.01666177064180374, -0.08316032588481903, 0.012089451774954796, 0.00037853376124985516, -0.2122422754764557, -0.006809506099671125, -0.04621901735663414, -0.08418357372283936, 0.026848826557397842, 0.0500580295920372, 0.044904790818691254, 0.05677627772092819, 0.11118639260530472, -0.08908401429653168, -0.09745174646377563, -0.044959355145692825, -0.007323252968490124, -0.02077348157763481, 0.01966315321624279, -0.07477466017007828, -0.039374496787786484, -0.025077100843191147, 0.05196027085185051, 0.020901698619127274, -0.005249883513897657, -0.038685139268636703, -0.028932489454746246, 0.09148748964071274, -0.09478596597909927, 0.027187181636691093, 0.013928895816206932, -0.10343301296234131, -0.022075699642300606, 0.03868229687213898, -0.026843378320336342, -0.11955330520868301, 0.04797970503568649, -0.029440784826874733, -0.025308428332209587, -0.1266130656003952, -0.18222321569919586, 0.008790230378508568, -0.017262151464819908, -0.059022773057222366, -0.09173575788736343, -0.09808570891618729, -0.08694455772638321, 0.027376646175980568, -0.07231611758470535, 0.020567189902067184, -0.1044263243675232, 0.02358129248023033, 0.0208461731672287, -0.013285336084663868, 0.0723809152841568, -0.04520956799387932, 0.05806450918316841, 0.027400031685829163, 0.06897305697202682, 0.010686899535357952, 0.03342176601290703, -0.100028857588768, 0.03531297668814659, -0.07116083800792694, 0.1449700891971588, -0.005939301569014788, -0.006623402703553438, -0.1427876204252243, -0.05775962397456169, -0.08836254477500916, 
0.03768376260995865, 0.09207771718502045, 0.1490475833415985, -0.21449699997901917, -0.02447138912975788, 0.21999144554138184, -0.06712688505649567, -0.0703788697719574, 0.13493014872074127, -0.026392942294478416, 0.06809194386005402, 0.06905747205018997, 0.07663633674383163, 0.03928478807210922, -0.048067159950733185, -0.05141559615731239, -0.008809375576674938, 0.02588789537549019, 0.03423022851347923, 0.09798676520586014, -0.06773965805768967, 0.09605810046195984, -0.001677318592555821, 0.017566455528140068, 0.019778871908783913, -0.04666948318481445, -0.03965987637639046, 0.006610168609768152, -0.04230429604649544, -0.008498826995491982, 0.029033292084932327, 0.019944459199905396, -0.06935170292854309, -0.09178580343723297, -0.03676854446530342, 0.0890984907746315, -0.07027559727430344, 0.02790665626525879, 0.007546913344413042, -0.033151961863040924, -0.10801678150892258, 0.008540097624063492, -0.14969715476036072, 0.001769290422089398, 0.045667264610528946, -0.02204323373734951, 0.08500359207391739, 0.03381545469164848, 0.05544053018093109, 0.09848359227180481, -0.04411134123802185, -0.038738492876291275, -0.013116675429046154, -0.025455554947257042, -0.0927802100777626, -0.12985648214817047, -0.027075769379734993, -0.010217355564236641, 0.014390421099960804, -0.14833709597587585, 0.018577490001916885, -0.03699151799082756, 0.09370966255664825, -0.002703087404370308, -0.022891992703080177, -0.0025226420257240534, 0.08128649741411209, -0.049375083297491074, -0.03776630014181137, 0.03644954040646553, -0.012432764284312725, -0.04021139442920685, 0.09552714973688126, -0.09957309067249298, -0.11472520232200623, 0.08583255857229233, -0.012275202199816704, -0.0929923951625824, -0.00815589353442192, -0.009883700869977474, -0.06111510470509529, -0.06057395040988922, -0.06733285635709763, 0.2417982518672943, 0.04225602373480797, 0.15381349623203278, -0.12734340131282806, -0.03923651576042175, 0.024998590350151062, -0.031004013493657112, -0.029791926965117455, 0.15954835712909698, 0.07301734387874603, -0.16467110812664032, 0.09804175049066544, 0.04404407739639282, -0.01640796661376953, 0.11285468935966492, 0.06010662391781807, -0.11804401874542236, 0.006581564899533987, 0.07457764446735382, 0.005358997266739607, 0.034590285271406174, -0.10654193162918091, -0.010282360948622227, 0.021364260464906693, 0.06314682215452194, 0.07962994277477264, -0.11998441815376282, 0.0681675374507904, 0.0701286718249321, -0.03098090924322605, 0.031985532492399216, -0.057757213711738586, -0.047109443694353104, 0.10763996094465256, 0.012970056384801865, -0.05547444522380829, -0.03666425868868828, -0.03590144217014313, -0.10456963628530502, 0.17824110388755798, -0.09062466025352478, -0.2350035309791565, -0.13846775889396667, 0.008392438292503357, -0.025929806753993034, 0.03351929411292076, 0.03836105391383171, -0.05454413220286369, -0.039098143577575684, -0.06774540990591049, 0.07694142311811447, -0.10805077105760574, -0.0663755014538765, -0.09768469631671906, 0.06735478341579437, -0.021172011271119118, -0.13868914544582367, 0.03465432673692703, 0.017830882221460342, -0.03719798102974892, -0.008615401573479176, -0.0576343797147274, 0.13709580898284912, 0.1449429839849472, -0.03445348143577576, -0.03196289762854576, 0.00031565254903398454, 0.1191948801279068, -0.08467982709407806, 0.032811615616083145, 0.06471852213144302, 0.03680442273616791, 0.035396553575992584, 0.12704086303710938, 0.03696572780609131, -0.05339526757597923, 0.023695899173617363, 0.04376526549458504, -0.009810907766222954, -0.26745086908340454, 
-0.09860335290431976, -0.06354013085365295, -0.03247332572937012, 0.08778323233127594, 0.038332823663949966, -0.057825952768325806, 0.02167729288339615, -0.0481957346200943, 0.01758953556418419, 0.007932234555482864, 0.05662331357598305, 0.027934949845075607, -0.020922569558024406, 0.06697545945644379, -0.058015793561935425, -0.0704793855547905, 0.09928406774997711, 0.05355805158615112, 0.20788396894931793, -0.04804292321205139, 0.23705829679965973, 0.05209830403327942, 0.06930307298898697, -0.00773207750171423, 0.07039899379014969, -0.0363156832754612, 0.031997017562389374, -0.03167543187737465, -0.06132669746875763, -0.002797330031171441, 0.06166256591677666, 0.006466988939791918, 0.005627136677503586, -0.06519950926303864, -0.05068342387676239, 0.07220930606126785, 0.20448334515094757, 0.07198073714971542, -0.21037964522838593, -0.03578091412782669, -0.002048123860731721, -0.05627762898802757, -0.08858572691679001, 0.005351648200303316, 0.16877512633800507, -0.07297157496213913, -0.013168444857001305, 0.02112363465130329, 0.135071262717247, -0.133139967918396, -0.023915739730000496, 0.014920867048203945, 0.02995697222650051, -0.01605982519686222, 0.138177290558815, -0.23390530049800873, 0.1900242269039154, 0.019945019856095314, 0.0738503709435463, -0.05161747336387634, 0.030732477083802223, -0.06803824007511139, 0.006113740615546703, 0.11236116290092468, 0.016436725854873657, -0.028945336118340492, -0.11412256956100464, -0.10509594529867172, -0.021953003481030464, 0.08115600049495697, -0.039253607392311096, 0.09051261097192764, 0.06517573446035385, 0.00876524392515421, -0.006535773165524006, 0.038653649389743805, -0.02465265989303589, -0.17413581907749176, 0.010544415563344955, 0.0027156786527484655, -0.038397058844566345, -0.0088399238884449, -0.044700633734464645, -0.058672018349170685, 0.21582870185375214, -0.1253902018070221, -0.06882241368293762, -0.070982925593853, 0.015258722007274628, 0.129399836063385, -0.06475981324911118, 0.014071917161345482, 0.02030334062874317, 0.026063237339258194, -0.03768114373087883, -0.01035040058195591, 0.09642134606838226, -0.06751871109008789, -0.06287094205617905, -0.08644746243953705, 0.11567123234272003, 0.05666767805814743, 0.04000544548034668, -0.010592645034193993, 0.024595048278570175, -0.004820240195840597, -0.08504047989845276, -0.006495356094092131, 0.018327582627534866, 0.1575888842344284, 0.04776058718562126, -0.08012190461158752, -0.08183285593986511, -0.08173372596502304, -0.07857633382081985, 0.13559532165527344, 0.16497701406478882, -0.049214620143175125, 0.02277296781539917, 0.18305490911006927, -0.1289929896593094, -0.14865323901176453, -0.03691074997186661, 0.09408366680145264, 0.08408558368682861, -0.03617219254374504, -0.1880502700805664, -0.012479704804718494, 0.1243271455168724, 0.0024138924200087786, 0.04183334484696388, -0.4294261634349823, -0.1259787529706955, 0.0025784175377339125, 0.04524249956011772, 0.0045036314986646175, -0.10733837634325027, -0.049184028059244156, -0.05130622535943985, -0.09656641632318497, 0.06524277478456497, -0.008096215315163136, 0.09096993505954742, 0.011630707420408726, 0.0005824837135151029, 0.05261441320180893, -0.03649761527776718, 0.13547101616859436, -0.004338142462074757, 0.024983972311019897, -0.04570472985506058, 0.04853245988488197, 0.020898383110761642, -0.0009745871648192406, 0.13251158595085144, -0.034854207187891006, 0.042997732758522034, -0.16391631960868835, -0.051792629063129425, -0.04672325402498245, 0.012208949774503708, -0.04042579606175423, -0.0628964900970459, 
-0.03216344490647316, 0.022241313010454178, 0.05632181465625763, -0.00043929583625867963, 0.0015744550619274378, -0.04687904193997383, 0.03584209829568863, 0.17359907925128937, 0.09058807790279388, 0.03379998728632927, -0.09522298723459244, 0.006409663241356611, 0.01556065957993269, 0.06046748906373978, -0.10872538387775421, 0.009197075851261616, 0.15157175064086914, 0.016086183488368988, 0.10738047957420349, -0.006659788079559803, -0.1359570473432541, 0.01935442164540291, 0.0838189497590065, -0.0807657539844513, -0.13778690993785858, -0.023707641288638115, 0.036424700170755386, -0.0492461696267128, 0.0000964345017564483, 0.0912645012140274, -0.07403428852558136, -0.035336486995220184, -0.019466107711195946, 0.02939075417816639, -0.05760389193892479, 0.22936446964740753, 0.0054189302027225494, 0.044238895177841187, -0.05331723764538765, 0.11652984470129013, 0.1517205834388733, -0.14925923943519592, 0.023994112387299538, 0.1849082112312317, -0.058869678527116776, -0.04105368256568909, 0.05283519625663757, 0.11533025652170181, -0.028382742777466774, -0.07399504631757736, -0.04302996024489403, -0.023674210533499718, 0.009699839167296886, -0.007724068593233824, 0.027005380019545555, 0.03886363282799721, -0.011591355316340923, -0.04000048711895943, -0.10712121427059174, 0.08250723034143448, 0.08643127232789993, 0.013559131883084774, -0.027853159233927727, 0.12046365439891815, 0.01748647354543209, -0.030551381409168243, -0.01055410597473383, 0.008481119759380817, -0.05339755490422249, 0.03020472824573517, -0.060141421854496, -0.003000834723934531, -0.04270381107926369, -0.019878389313817024, -0.037963636219501495, 0.000669579254463315, -0.011899694800376892, 0.0028286846354603767, -0.04970646649599075, -0.037261657416820526, -0.039820700883865356, 0.03730916231870651, -0.0846821665763855, -0.030820121988654137, 0.0035478570498526096, -0.034878604114055634, 0.0577356293797493, 0.014016794040799141, -0.007238005753606558, 0.0109492726624012, -0.030504433438181877, 0.06679243594408035, 0.016210656613111496, 0.055303268134593964, 0.013770085759460926, -0.07198289781808853, 0.03767780587077141, 0.043187353760004044, -0.023330142721533775, -0.013582480140030384, 0.017466941848397255, -0.125401109457016, -0.026226449757814407, -0.03138859570026398, -0.057622406631708145, -0.06053461879491806, 0.11457382142543793, 0.07095418125391006, 0.06518791615962982, 0.0914740040898323, -0.06059794872999191, 0.07185153663158417, -0.14504608511924744, -0.006130881607532501, 0.032439906150102615, -0.03170550614595413, -0.008046871051192284, -0.00961287971585989, 0.050724346190690994, -0.06766028702259064, 0.12145164608955383, 0.022402891889214516, 0.08200671523809433, 0.0047686281614005566, -0.08182390034198761, -0.009455030784010887, 0.01784648932516575, 0.1071631982922554, -0.023750150576233864, -0.017021209001541138, -0.08411386609077454, 0.09311357885599136, -0.0023529091849923134, 0.12634636461734772, 0.01673629693686962, 0.13528120517730713, 0.1473364681005478, 0.049932610243558884, 0.017653247341513634, -0.09835917502641678, -0.07005168497562408, 0.08314184099435806, -0.004268455784767866, 0.05674497038125992, -0.032850977033376694, 0.11563804745674133, 0.1349560022354126, -0.1470310091972351, 0.103171706199646, 0.004871206358075142, -0.08783786743879318, -0.05647362768650055, -0.10873109102249146, -0.04137149825692177, -0.05123346298933029, -0.03390764445066452, -0.11732736974954605, 0.023133786395192146, -0.003032705979421735, 0.04208220914006233, -0.04736800491809845, 0.10375635325908661, 
0.024187790229916573, -0.11520902812480927, 0.0728197768330574, 0.009434197098016739, 0.1268809288740158, -0.02395438589155674, 0.01795753464102745, 0.048826128244400024, -0.002944977954030037, 0.052413567900657654, 0.0569087490439415, -0.007282330188900232, 0.00765939150005579, 0.007112044375389814, -0.049939896911382675, -0.04284995049238205, 0.02064896747469902, 0.07338976860046387, 0.1951867938041687, 0.04960836097598076, -0.06160007789731026, -0.028955046087503433, 0.181789368391037, -0.05299384146928787, -0.06420738995075226, -0.10359539091587067, 0.19986169040203094, 0.03749888390302658, 0.03652595356106758, 0.018867136910557747, -0.10820110887289047, -0.009718288667500019, 0.1305423229932785, 0.17447641491889954, -0.029781147837638855, -0.04596026986837387, 0.03159697353839874, -0.004508251789957285, 0.01770723983645439, 0.040297336876392365, 0.047498978674411774, 0.27355021238327026, -0.08822818845510483, 0.06107079237699509, -0.059967998415231705, 0.05276351794600487, -0.011817719787359238, 0.15627439320087433, -0.009097078815102577, 0.00356842833571136, -0.05395488813519478, 0.09977202862501144, 0.026796361431479454, -0.15227536857128143, 0.0019354444229975343, -0.10899640619754791, -0.11794477701187134, 0.016763299703598022, 0.01219443790614605, 0.025377515703439713, 0.08733642846345901, 0.004683299921452999, 0.023274006322026253, 0.0533394031226635, 0.016060182824730873, -0.09369326382875443, -0.12306778132915497, 0.0012293205363675952, -0.07617461681365967, 0.11594526469707489, 0.0028811143711209297, 0.14240624010562897, 0.1015033945441246, 0.02372679114341736, -0.08899267017841339, 0.09314439445734024, 0.024207521229982376, 0.029652675613760948, 0.08511941134929657, 0.07379300147294998, -0.017689567059278488, 0.06803082674741745, 0.037957094609737396, -0.05462132766842842, 0.06141773611307144, -0.0634366050362587, -0.007778926286846399, -0.12305215001106262, 0.08739786595106125, -0.03220539912581444, 0.13004787266254425, 0.17882004380226135, -0.01318737119436264, 0.0001976250932784751, -0.0488772988319397, 0.022532790899276733, -0.003008771687746048, 0.1178232952952385, -0.017568286508321762, -0.22327503561973572, 0.017841244116425514, -0.02978437766432762, 0.0467149056494236, -0.22423884272575378, -0.019039517268538475, 0.03254483640193939, -0.056856077164411545, -0.04139845445752144, 0.06711023300886154, 0.0790891945362091, 0.031776078045368195, -0.028446976095438004, -0.12071802467107773, -0.009885422885417938, 0.10397104173898697, -0.08730783313512802, -0.09763791412115097 ]
null
null
transformers
# legal_t5_small_multitask_it_fr model

Model for translating legal text from Italian to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model follows a masked-language-model prediction objective.

## Model description

No pretraining is involved for the legal_t5_small_multitask_it_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Italian to French.

### How to use

Here is how to use this model to translate legal text from Italian to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_fr"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

it_text = "Gli Stati membri adottano le leggi, i regolamenti e le disposizioni amministrative necessari per ottemperare alla presente direttiva entro il 31 dicembre 2002 e ne informano immediatamente la Commissione."

pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_it_fr | 41.956|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
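The pipeline example above relies on the deprecated `AutoModelWithLMHead` wrapper. As a hedged alternative (a sketch, not part of the original card), the same translation can be run by loading the checkpoint with `AutoModelForSeq2SeqLM` and calling `generate` directly; the `max_length` value simply mirrors the pipeline example.

```python
# Minimal sketch (an alternative usage, not the authors' documented method):
# direct generation without the TranslationPipeline wrapper.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/legal_t5_small_multitask_it_fr"
tokenizer = AutoTokenizer.from_pretrained(model_name, do_lower_case=False)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

it_text = (
    "Gli Stati membri adottano le leggi, i regolamenti e le disposizioni "
    "amministrative necessari per ottemperare alla presente direttiva entro "
    "il 31 dicembre 2002 e ne informano immediatamente la Commissione."
)

# Tokenize, generate, and decode; max_length matches the pipeline example above.
inputs = tokenizer(it_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```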
{"language": "Italian French", "tags": ["translation Italian French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Gli Stati membri adottano le leggi, i regolamenti e le disposizioni amministrative necessari per ottemperare alla presente direttiva entro il 31 dicembre 2002 e ne informano immediatamente la Commissione."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_fr model
=========================================

Model for translating legal text from Italian to French. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model follows a masked-language-model prediction objective.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_it\_fr model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Italian to French.

### How to use

Here is how to use this model to translate legal text from Italian to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07521497458219528, 0.15343192219734192, -0.004051946569234133, 0.08608192205429077, 0.0537879653275013, 0.005617744754999876, 0.03973504528403282, 0.10149813443422318, -0.0677928775548935, 0.07081921398639679, 0.043201204389333725, -0.004668812733143568, 0.06481827050447464, 0.03699817135930061, 0.03801250830292702, -0.17627866566181183, 0.0028799495194107294, -0.025059673935174942, -0.03489110618829727, 0.10959165543317795, 0.09612714499235153, -0.05831208825111389, 0.04598848149180412, -0.018293019384145737, -0.0753929391503334, 0.046048760414123535, -0.09360317885875702, -0.03751209005713463, 0.1087658479809761, 0.08067432045936584, 0.09465537220239639, 0.004027986899018288, 0.08088811486959457, -0.1819523721933365, -0.010958545841276646, 0.07965680956840515, -0.013253228738904, 0.03973906487226486, 0.12303893268108368, 0.002860159147530794, 0.16301687061786652, -0.03960375115275383, 0.022815197706222534, 0.0446116179227829, -0.09856497496366501, -0.12803304195404053, -0.055216941982507706, -0.002165547339245677, 0.07787435501813889, 0.13987816870212555, -0.03524332121014595, 0.05543706193566322, -0.026447776705026627, 0.07099509239196777, 0.07232170552015305, -0.22112131118774414, -0.03625328093767166, -0.011349737644195557, 0.048965949565172195, 0.09628074616193771, -0.040954139083623886, -0.0004155912029091269, 0.062082499265670776, 0.05074513331055641, 0.047636594623327255, -0.043370265513658524, -0.01544618234038353, -0.028834015130996704, -0.1328638345003128, -0.056629352271556854, 0.1745499223470688, 0.01584205962717533, -0.045091353356838226, -0.09478939324617386, -0.062444742769002914, -0.05489781126379967, -0.004815584048628807, -0.04811340570449829, 0.016498053446412086, 0.0050671775825321674, 0.05925486236810684, -0.03473847359418869, -0.11453191190958023, -0.07016044110059738, -0.05214095860719681, 0.08867195248603821, 0.060509853065013885, 0.011342542245984077, 0.03739060461521149, 0.08858224004507065, -0.12431221455335617, -0.06629743427038193, 0.02550060860812664, 0.009982950054109097, -0.0875420942902565, 0.00971714872866869, 0.004787502810359001, -0.19098232686519623, -0.004225763026624918, -0.02098868042230606, -0.08497549593448639, 0.028307026252150536, 0.044742267578840256, 0.045585017651319504, 0.050568025559186935, 0.11411437392234802, -0.09489990770816803, -0.11100886017084122, -0.04630286246538162, 0.001388370175845921, -0.03388557955622673, 0.031645871698856354, -0.06169227510690689, -0.041605524718761444, -0.027295393869280815, 0.040727000683546066, 0.014367341063916683, -0.012048318050801754, -0.030761878937482834, -0.019843611866235733, 0.10023162513971329, -0.0966637060046196, 0.029851902276277542, 0.012976217083632946, -0.10887455195188522, -0.018605457618832588, 0.04210801422595978, -0.02029716596007347, -0.12322381138801575, 0.05251545459032059, -0.030455775558948517, -0.019739655777812004, -0.12368068099021912, -0.18183717131614685, -0.0022723290603607893, 0.002396176103502512, -0.06884343177080154, -0.08522545546293259, -0.09233730286359787, -0.07732449471950531, 0.04006350040435791, -0.06859638541936874, 0.02045133337378502, -0.1040724366903305, 0.010746559128165245, 0.026206210255622864, 0.0007144320989027619, 0.06756427139043808, -0.04940067604184151, 0.04053466394543648, 0.006608404219150543, 0.06390159577131271, 0.005979008041322231, 0.03300609067082405, -0.0916980430483818, 0.036200638860464096, -0.07059068232774734, 0.1421019434928894, -0.009780912660062313, -0.02725907973945141, -0.14545872807502747, -0.0674891248345375, 
-0.09490535408258438, 0.02796652540564537, 0.10002880543470383, 0.1505735218524933, -0.2156047523021698, -0.027949528768658638, 0.22685545682907104, -0.07395832985639572, -0.07182898372411728, 0.163117453455925, -0.03048771247267723, 0.048113420605659485, 0.06137431412935257, 0.07086841017007828, 0.048719365149736404, -0.0536423921585083, -0.05972680822014809, 0.007102443370968103, 0.025561444461345673, 0.042238738387823105, 0.10445038974285126, -0.06019851937890053, 0.08815501630306244, -0.012056645005941391, 0.02525465562939644, 0.01900952309370041, -0.0532020665705204, -0.04171005263924599, -0.00009536290599498898, -0.043863434344530106, 0.007941129617393017, 0.0315224826335907, 0.016170768067240715, -0.07058043032884598, -0.10111372917890549, -0.05636964365839958, 0.09791223704814911, -0.07592368870973587, 0.02785177156329155, 0.014287421479821205, -0.03686874359846115, -0.0840512067079544, 0.006443477235734463, -0.14243187010288239, -0.0047343093901872635, 0.04977022483944893, -0.04006209596991539, 0.08506942540407181, 0.03580867126584053, 0.06238945573568344, 0.11009102314710617, -0.04996086284518242, -0.03707299008965492, -0.022973962128162384, -0.028828628361225128, -0.08730291575193405, -0.13409467041492462, -0.02085382305085659, -0.010974974371492863, 0.02988577075302601, -0.15232716500759125, 0.020373186096549034, -0.0366874523460865, 0.1013416200876236, -0.002133479341864586, -0.020148668438196182, -0.019400155171751976, 0.07058826088905334, -0.04862599819898605, -0.028061052784323692, 0.030479783192276955, -0.02105441689491272, -0.026598842814564705, 0.09825094789266586, -0.07835158705711365, -0.1058456227183342, 0.08878408372402191, -0.009475105442106724, -0.09981478750705719, -0.006511486601084471, -0.019293518736958504, -0.06112515926361084, -0.051798827946186066, -0.06262897700071335, 0.23524132370948792, 0.04730347543954849, 0.15894095599651337, -0.11779353022575378, -0.04331948608160019, 0.030139869078993797, -0.023123526945710182, -0.030238326638936996, 0.15939484536647797, 0.07859411835670471, -0.1537369042634964, 0.09549262374639511, 0.05650261417031288, -0.02031814679503441, 0.10818398743867874, 0.05663561075925827, -0.1173476129770279, 0.005699909757822752, 0.08094687014818192, 0.0002549712371546775, 0.03462020680308342, -0.10375397652387619, -0.01536408718675375, 0.019433509558439255, 0.05796601623296738, 0.07211976498365402, -0.11944635957479477, 0.07636182755231857, 0.06352882832288742, -0.036193929612636566, 0.03886783868074417, -0.053257446736097336, -0.05081868916749954, 0.11390730738639832, 0.014172481372952461, -0.07609519362449646, -0.0401640459895134, -0.039196744561195374, -0.10110535472631454, 0.1846369355916977, -0.08974878489971161, -0.22477179765701294, -0.12399303168058395, 0.010098935104906559, -0.047546178102493286, 0.022102082148194313, 0.0335596427321434, -0.050591785460710526, -0.043383870273828506, -0.08586690574884415, 0.053657349199056625, -0.11823640018701553, -0.05571476370096207, -0.10026345402002335, 0.0633850023150444, -0.020509278401732445, -0.1445070505142212, 0.029898378998041153, 0.012868105433881283, -0.04044496268033981, -0.016123304143548012, -0.05081169679760933, 0.13069617748260498, 0.13648401200771332, -0.05059270188212395, -0.03016541711986065, 0.006823094096034765, 0.13206066191196442, -0.08201508224010468, 0.035902321338653564, 0.05317346751689911, 0.04763612523674965, 0.04533359408378601, 0.12886419892311096, 0.0400758795440197, -0.04359905421733856, 0.029695015400648117, 0.05354437232017517, -0.004551731050014496, 
-0.25283756852149963, -0.10715387761592865, -0.06499449908733368, -0.024760238826274872, 0.08527764678001404, 0.04136620834469795, -0.052602123469114304, 0.009205390699207783, -0.052198249846696854, 0.017303431406617165, 0.020057054236531258, 0.05433003604412079, 0.02892683632671833, -0.0169761311262846, 0.07302416861057281, -0.059602513909339905, -0.07517765462398529, 0.10094532370567322, 0.0416736900806427, 0.20494695007801056, -0.05192948877811432, 0.23002560436725616, 0.05241154134273529, 0.07000188529491425, -0.009494959376752377, 0.07171382755041122, -0.032515399158000946, 0.02962418459355831, -0.02774151973426342, -0.06012628600001335, 0.012501897290349007, 0.0653284564614296, 0.008065273053944111, -0.0037664540577679873, -0.06560849398374557, -0.04148903116583824, 0.08026392012834549, 0.20933891832828522, 0.07391629368066788, -0.20738700032234192, -0.0335419699549675, -0.014340084977447987, -0.06353624165058136, -0.08885365724563599, 0.014806454069912434, 0.1720351278781891, -0.07867209613323212, -0.021504433825612068, 0.025297416374087334, 0.13039031624794006, -0.12040683627128601, -0.023240581154823303, 0.03255533427000046, 0.03631580248475075, -0.020855991169810295, 0.13120372593402863, -0.2283806949853897, 0.1870681494474411, 0.016153257340192795, 0.06591015309095383, -0.049308180809020996, 0.03327761963009834, -0.055340107530355453, 0.02718791551887989, 0.1262398213148117, 0.022099528461694717, -0.026549506932497025, -0.10034819692373276, -0.1003604456782341, -0.026536863297224045, 0.09345824271440506, -0.038562431931495667, 0.08024177700281143, 0.059829480946063995, -0.002974926959723234, -0.0070817070081830025, 0.04178166761994362, -0.0418768934905529, -0.17442944645881653, 0.017040856182575226, 0.009312854148447514, -0.04725024476647377, -0.00819399580359459, -0.047389715909957886, -0.06364671885967255, 0.2295796424150467, -0.11568240821361542, -0.05526052042841911, -0.06715177744626999, 0.0032585773151367903, 0.12924230098724365, -0.06620814651250839, 0.0284266360104084, 0.013055390678346157, 0.035785775631666183, -0.046150702983140945, -0.009983504191040993, 0.08691035956144333, -0.07584808766841888, -0.04961685836315155, -0.08465877920389175, 0.11809542030096054, 0.06010030582547188, 0.037927258759737015, -0.00976131483912468, 0.024852093309164047, -0.01309492252767086, -0.09583306312561035, -0.008852741681039333, 0.00177665613591671, 0.15257899463176727, 0.050653886049985886, -0.0814780592918396, -0.08629511296749115, -0.07988264411687851, -0.06388543546199799, 0.14329983294010162, 0.17168442904949188, -0.05739608034491539, 0.024590540677309036, 0.18379303812980652, -0.11920318007469177, -0.16616572439670563, -0.030830958858132362, 0.09666617214679718, 0.07198593020439148, -0.04646533727645874, -0.1844986230134964, -0.003079725196585059, 0.1112164780497551, 0.001177634228952229, 0.045087724924087524, -0.4177781045436859, -0.1295885592699051, 0.0017069978639483452, 0.037957169115543365, 0.009585024788975716, -0.09719554334878922, -0.031105710193514824, -0.04897152632474899, -0.0994124785065651, 0.06779828667640686, -0.010862520895898342, 0.08349131792783737, 0.012406258843839169, -0.01535781566053629, 0.044454965740442276, -0.03798167407512665, 0.12986838817596436, 0.013485944829881191, 0.030045870691537857, -0.0385872945189476, 0.05466465279459953, 0.023013394325971603, 0.0012147463858127594, 0.13825099170207977, -0.03616794943809509, 0.04627438262104988, -0.15985696017742157, -0.04474952071905136, -0.049623943865299225, 0.019584495574235916, -0.04158259183168411, 
-0.061874933540821075, -0.038456711918115616, 0.03163209557533264, 0.06372150778770447, 0.004318614024668932, -0.02540757693350315, -0.03598521649837494, 0.023588621988892555, 0.18275713920593262, 0.0815996527671814, 0.03669508174061775, -0.09913985431194305, 0.02378126233816147, 0.01489308476448059, 0.05449850112199783, -0.10887562483549118, 0.019731687381863594, 0.15115223824977875, 0.014328410848975182, 0.11712995916604996, -0.005457911640405655, -0.12994299829006195, 0.0006785859586670995, 0.07460009306669235, -0.09528276324272156, -0.14457686245441437, -0.028526978567242622, 0.029850194230675697, -0.0477294884622097, -0.0009044521721079946, 0.08631058037281036, -0.0858357772231102, -0.028616318479180336, -0.021480493247509003, 0.030102401971817017, -0.06726394593715668, 0.21693304181098938, -0.0016735418466851115, 0.04675963521003723, -0.05215189978480339, 0.10760010033845901, 0.1455041468143463, -0.15160584449768066, 0.018866131082177162, 0.18939320743083954, -0.060020096600055695, -0.04442746192216873, 0.05073871836066246, 0.114855095744133, -0.013893574476242065, -0.0718006044626236, -0.039433903992176056, -0.021755190566182137, 0.015139163471758366, -0.009968334808945656, 0.030766133219003677, 0.03340762108564377, -0.006239158567041159, -0.050870269536972046, -0.11750150471925735, 0.09739372879266739, 0.09416934102773666, 0.008977019228041172, -0.008845395408570766, 0.10440092533826828, 0.012213467620313168, -0.01940613053739071, -0.009340105578303337, 0.008574933744966984, -0.04353967681527138, 0.019141603261232376, -0.05578000843524933, -0.009254851378500462, -0.03510337322950363, -0.011809715069830418, -0.03809279575943947, -0.0031175222247838974, -0.013495977967977524, 0.007132499944418669, -0.05525417998433113, -0.04142826795578003, -0.0315435566008091, 0.038441676646471024, -0.09559593349695206, -0.028612956404685974, 0.015477670356631279, -0.04047343134880066, 0.06520385295152664, 0.021919364109635353, -0.0023791103158146143, 0.0020882347598671913, -0.026085957884788513, 0.050716329365968704, -0.00796651840209961, 0.05852079018950462, 0.006788122467696667, -0.08029652386903763, 0.042102087289094925, 0.038983896374702454, -0.01929846778512001, -0.01842200569808483, 0.013881427235901356, -0.11620314419269562, -0.02721434459090233, -0.04166192188858986, -0.05908353999257088, -0.0646447017788887, 0.1194654256105423, 0.06300873309373856, 0.07750169187784195, 0.08831673115491867, -0.05576528236269951, 0.0675155520439148, -0.135844424366951, -0.005640267860144377, 0.03546665608882904, -0.03778812661767006, -0.004303197842091322, -0.01779359020292759, 0.04405337572097778, -0.06847446411848068, 0.1295677125453949, 0.029166916385293007, 0.07990250736474991, 0.010412376374006271, -0.09803486615419388, -0.023325728252530098, 0.024633552879095078, 0.08761879801750183, -0.02997000887989998, -0.01990799978375435, -0.08421351760625839, 0.08212655782699585, 0.00159664626698941, 0.13686588406562805, 0.024479510262608528, 0.13549697399139404, 0.13965104520320892, 0.048225171864032745, 0.003845443017780781, -0.10302908718585968, -0.07014577090740204, 0.08735398203134537, -0.006024111993610859, 0.0598798468708992, -0.051233530044555664, 0.12709534168243408, 0.12696480751037598, -0.14962518215179443, 0.10781100392341614, 0.009492456912994385, -0.09500031918287277, -0.06099754571914673, -0.0929688960313797, -0.04317670688033104, -0.04932372644543648, -0.03203297033905983, -0.11383897811174393, 0.035261452198028564, -0.004287123680114746, 0.05879407748579979, -0.037963736802339554, 
0.10443934053182602, -0.00293349870480597, -0.1052512377500534, 0.08099916577339172, 0.0031072869896888733, 0.11588294804096222, -0.02297384850680828, 0.015075352974236012, 0.04654766991734505, -0.020300639793276787, 0.05190411955118179, 0.05435185879468918, -0.006985223852097988, 0.010524001903831959, 0.008795862086117268, -0.04817831516265869, -0.04420911520719528, 0.01482274942100048, 0.07301962375640869, 0.19430126249790192, 0.052143994718790054, -0.07118695974349976, -0.024781756103038788, 0.17043085396289825, -0.054441262036561966, -0.06812021136283875, -0.11082251369953156, 0.1971026360988617, 0.04020688310265541, 0.03655946999788284, 0.015520579181611538, -0.09701068699359894, -0.010362550616264343, 0.1250743567943573, 0.18337681889533997, -0.029668176546692848, -0.04134258255362511, 0.03933487832546234, -0.007414643187075853, 0.026603734120726585, 0.024524319916963577, 0.05651885271072388, 0.2628651559352875, -0.09663956612348557, 0.0655871257185936, -0.061686787754297256, 0.04753458499908447, -0.007341134361922741, 0.1590869277715683, -0.0008505385485477746, 0.005216456484049559, -0.06043691188097, 0.0923033133149147, 0.036819618195295334, -0.15019500255584717, 0.005100606940686703, -0.1078721433877945, -0.10937449336051941, 0.02194097638130188, -0.00956924818456173, 0.028895659372210503, 0.08439785242080688, 0.0012083500623703003, 0.018666911870241165, 0.06083384528756142, 0.013033019378781319, -0.08866188675165176, -0.12473524361848831, -0.0046859742142260075, -0.07301885634660721, 0.12928728759288788, 0.005299545358866453, 0.15133431553840637, 0.10092103481292725, 0.015735818073153496, -0.08314388990402222, 0.0763942152261734, 0.029534341767430305, 0.031706031411886215, 0.06457312405109406, 0.08642975240945816, -0.02349250204861164, 0.07076573371887207, 0.022821448743343353, -0.04092752933502197, 0.06687214225530624, -0.08430281281471252, -0.022219665348529816, -0.12127503752708435, 0.09337078034877777, -0.03454509750008583, 0.1329840123653412, 0.18385380506515503, -0.007884671911597252, 0.007202153094112873, -0.05569542199373245, 0.015607243403792381, -0.006567324511706829, 0.10039066523313522, -0.014645470306277275, -0.21256238222122192, 0.03881130367517471, -0.03264347463846207, 0.052493851631879807, -0.22136270999908447, -0.022565200924873352, 0.028439519926905632, -0.04327165707945824, -0.035564202815294266, 0.07004553079605103, 0.08518780022859573, 0.022382542490959167, -0.033202365040779114, -0.12769366800785065, -0.011197516694664955, 0.09928568452596664, -0.07372866570949554, -0.09634213149547577 ]
null
null
transformers
# legal_t5_small_multitask_it_sv model

Model for translating legal text from Italian to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model follows a masked-language-model prediction objective.

## Model description

No pretraining is involved for the legal_t5_small_multitask_it_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Italian to Swedish.

### How to use

Here is how to use this model to translate legal text from Italian to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_it_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_it_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

it_text = "Può il Commissario responsabile comunicare al Parlamento in che modo la DG Ricerca garantirà che l’Europa possa svolgere un ruolo di primo piano in questo sforzo globale di ricerca sul diabete?"

pipeline([it_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_it_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_it_sv | 41.523|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
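The training procedure above names AdaFactor with an inverse square root learning rate schedule but shows no code. As a rough sketch (not the authors' actual TPU training script), the same optimizer setup can be expressed with the `Adafactor` implementation shipped in `transformers`; every detail other than the schedule choice is an illustrative assumption.

```python
# Sketch only: an AdaFactor setup in the spirit of the training procedure
# described above, using the implementation shipped with transformers.
# This is not the authors' TPU training code; data loading, batching, and
# all other hyperparameters are omitted or assumed.
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_multitask_it_sv")

# With relative_step=True and warmup_init=True, Adafactor uses its built-in
# inverse-square-root style schedule, so lr must be left as None.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_schedule = AdafactorSchedule(optimizer)  # exposes the step-dependent learning rate
```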
{"language": "Italian Swedish", "tags": ["translation Italian Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Pu\u00f2 il Commissario responsabile comunicare al Parlamento in che modo la DG Ricerca garantir\u00e0 che l\u2019Europa possa svolgere un ruolo di primo piano in questo sforzo globale di ricerca sul diabete?"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_it_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Italian Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Italian Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_it\_sv model
=========================================

Model for translating legal text from Italian to Swedish. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model follows a masked-language-model prediction objective.

Model description
-----------------

No pretraining is involved for the legal\_t5\_small\_multitask\_it\_sv model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Italian to Swedish.

### How to use

Here is how to use this model to translate legal text from Italian to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_it\_sv model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Italian Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Italian to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_it\\_sv model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06904560327529907, 0.12773467600345612, -0.0034962883219122887, 0.08956456184387207, 0.057371966540813446, -0.0015539898304268718, 0.03069210797548294, 0.10350959002971649, -0.06651198863983154, 0.07549665868282318, 0.04805494099855423, 0.002561938250437379, 0.08140666782855988, 0.030957287177443504, 0.027811888605356216, -0.20461834967136383, 0.00790345948189497, -0.03445865958929062, -0.024691054597496986, 0.10571502149105072, 0.09764501452445984, -0.060591813176870346, 0.03392109274864197, -0.03410428762435913, -0.057065241038799286, 0.03341129049658775, -0.09207332879304886, -0.02954004891216755, 0.10315801203250885, 0.07727871090173721, 0.09242531657218933, 0.007399842608720064, 0.08758589625358582, -0.171421080827713, -0.014414810575544834, 0.057270195335149765, -0.006077041383832693, 0.02801254764199257, 0.11568008363246918, 0.03238477185368538, 0.1826705038547516, -0.03664274141192436, 0.01871153898537159, 0.033382486552000046, -0.07821623235940933, -0.1381845325231552, -0.04882008209824562, -0.005025853868573904, 0.08686356246471405, 0.1381910890340805, -0.04825789853930473, 0.056018125265836716, -0.021561820060014725, 0.0884322077035904, 0.06694439798593521, -0.21962082386016846, -0.039554450660943985, 0.024450261145830154, 0.05718540400266647, 0.11524339765310287, -0.036966100335121155, 0.01123314630240202, 0.07025744765996933, 0.06493043899536133, 0.05745439603924751, -0.041320644319057465, -0.03723597154021263, -0.029968474060297012, -0.1398715078830719, -0.03896578401327133, 0.18416769802570343, 0.009847155772149563, -0.03741644695401192, -0.10041606426239014, -0.051092613488435745, -0.04513923078775406, 0.01645575650036335, -0.046873051673173904, 0.007111973129212856, -0.0030109290964901447, 0.06169829145073891, -0.05182066559791565, -0.13049206137657166, -0.057839199900627136, -0.03655705228447914, 0.08144930750131607, 0.05008947104215622, 0.013092090375721455, 0.0511406809091568, 0.06975990533828735, -0.1259077489376068, -0.07881798595190048, 0.015085029415786266, 0.006597001571208239, -0.08899232745170593, 0.005484923254698515, -0.003275239374488592, -0.2501910924911499, -0.004106887150555849, -0.034671053290367126, -0.06035667657852173, 0.02160903997719288, 0.05406598746776581, 0.050541773438453674, 0.052205681800842285, 0.1216747835278511, -0.09671679139137268, -0.11649671941995621, -0.04615417867898941, -0.02131338231265545, -0.009670321829617023, 0.011605584062635899, -0.0720752403140068, -0.04422026872634888, -0.009919227100908756, 0.03307054191827774, 0.005349797196686268, 0.002806552918627858, -0.024386681616306305, -0.016141213476657867, 0.06882088631391525, -0.10179971158504486, 0.014591897837817669, 0.004187771584838629, -0.10031113773584366, -0.029719097539782524, 0.04796787351369858, -0.01467844843864441, -0.12478437274694443, 0.0774901956319809, -0.01416527759283781, -0.010957778431475163, -0.10797180235385895, -0.19512613117694855, 0.013956447131931782, -0.027623342350125313, -0.05542536452412605, -0.08526294678449631, -0.08978229761123657, -0.08710090816020966, 0.037353549152612686, -0.06495725363492966, 0.01097758486866951, -0.10780107229948044, 0.006529899314045906, 0.026125382632017136, -0.024361805990338326, 0.07855667173862457, -0.049071770161390305, 0.03966360539197922, -0.016725506633520126, 0.07205317914485931, 0.0029575410299003124, 0.02688637375831604, -0.10984361916780472, 0.02439739741384983, -0.08012503385543823, 0.1364770233631134, -0.03866204619407654, -0.01038812194019556, -0.12948459386825562, -0.06377842277288437, -0.0739840492606163, 
0.03770864009857178, 0.09196967631578445, 0.14486445486545563, -0.22579854726791382, -0.021395623683929443, 0.20900091528892517, -0.09100587666034698, -0.06782890111207962, 0.14573819935321808, -0.015913506969809532, 0.05681081861257553, 0.07662158459424973, 0.0979028195142746, 0.04165203869342804, -0.05235174670815468, -0.06371621787548065, 0.015981310978531837, 0.012158220633864403, 0.02793225832283497, 0.09831243008375168, -0.06224210932850838, 0.11119456589221954, 0.00877932645380497, 0.025217149406671524, 0.008451409637928009, -0.035476040095090866, -0.04096376150846481, 0.0020510824397206306, -0.04683542251586914, -0.02217361517250538, 0.035198867321014404, 0.013198762200772762, -0.08339907228946686, -0.08467518538236618, -0.03499094769358635, 0.08068820089101791, -0.08042457699775696, 0.04165647551417351, 0.03276780992746353, -0.04044465720653534, -0.10121346265077591, 0.007870414294302464, -0.1365664303302765, -0.01814342476427555, 0.036337509751319885, -0.01979430578649044, 0.08207599818706512, 0.06451820582151413, 0.06843063235282898, 0.10356508940458298, -0.0492522194981575, -0.03015882335603237, -0.004304302390664816, -0.026684513315558434, -0.0973489060997963, -0.1365334689617157, -0.02479851432144642, -0.01505119726061821, 0.013943884521722794, -0.14406777918338776, 0.010431898757815361, -0.029536020010709763, 0.10215558856725693, 0.0002381553640589118, -0.025315597653388977, 0.008158611133694649, 0.07394566386938095, -0.04529056325554848, -0.03190813958644867, 0.03379327803850174, -0.020218823105096817, -0.06733549386262894, 0.11724970489740372, -0.06558334082365036, -0.11945945024490356, 0.07940269261598587, -0.007941859774291515, -0.09352666139602661, -0.0066195279359817505, -0.011396327055990696, -0.0653332993388176, -0.06045415252447128, -0.06325051188468933, 0.21713778376579285, 0.05163373053073883, 0.13762515783309937, -0.11846458911895752, -0.03209213539958, 0.02794775739312172, -0.04787889122962952, -0.030519703403115273, 0.17132799327373505, 0.06455374509096146, -0.17776387929916382, 0.0968412384390831, 0.020455118268728256, -0.029029767960309982, 0.155939981341362, 0.06321095675230026, -0.11799627542495728, 0.0188602264970541, 0.06985142081975937, -0.003051663050428033, 0.04618266969919205, -0.08909422904253006, -0.002232307568192482, 0.027237243950366974, 0.05905574560165405, 0.0721256360411644, -0.10664732754230499, 0.060759447515010834, 0.05911334604024887, -0.04155467450618744, 0.04630934074521065, -0.05029791593551636, -0.06024223566055298, 0.09648796170949936, 0.003285628277808428, -0.06532371789216995, -0.035418227314949036, -0.03914759308099747, -0.10731061547994614, 0.1818227767944336, -0.09004870057106018, -0.2360752522945404, -0.1439036726951599, 0.021627014502882957, -0.05066410079598427, 0.026177851483225822, 0.04505577310919762, -0.05546142905950546, -0.05713900178670883, -0.09004968404769897, 0.07772275805473328, -0.09101950377225876, -0.0647510439157486, -0.11137350648641586, 0.061400581151247025, -0.012692575342953205, -0.1336309164762497, 0.030777692794799805, 0.0003319394018035382, -0.02628244459629059, -0.001983802067115903, -0.04798683896660805, 0.13193051517009735, 0.12090039253234863, -0.03435473516583443, -0.029143061488866806, 0.009786956943571568, 0.12363451719284058, -0.07639642804861069, 0.050214897841215134, 0.04603516682982445, 0.016322536394000053, 0.041340410709381104, 0.14518801867961884, 0.036590639501810074, -0.04136892408132553, 0.02233635075390339, 0.05454518646001816, -0.018253330141305923, -0.26298895478248596, 
-0.0997084304690361, -0.060828935354948044, -0.01004331186413765, 0.07637328654527664, 0.04252224043011665, -0.08581865578889847, 0.024391623213887215, -0.0482180155813694, -0.012947197072207928, 0.02146001160144806, 0.04940318316221237, 0.0052706534042954445, -0.02624223753809929, 0.07992704957723618, -0.05274898186326027, -0.052370693534612656, 0.09703632444143295, 0.03785533085465431, 0.2028190940618515, -0.05869264155626297, 0.2110082060098648, 0.04721301794052124, 0.06701638549566269, -0.00807187706232071, 0.07375238835811615, -0.03553115576505661, 0.01970532163977623, -0.010748885571956635, -0.0653591901063919, -0.008032837882637978, 0.0722794383764267, 0.008333857171237469, -0.002988707972690463, -0.04138103872537613, -0.02382425032556057, 0.07461901754140854, 0.2158169150352478, 0.08024746924638748, -0.18334704637527466, -0.05613723024725914, 0.0007421285263262689, -0.06226835772395134, -0.08177915215492249, 0.0008640835294499993, 0.1820778101682663, -0.07979041337966919, 0.0051512448117136955, 0.011532432399690151, 0.13092154264450073, -0.1205018013715744, -0.023705273866653442, 0.018405010923743248, 0.03267299383878708, -0.02063927799463272, 0.1385144144296646, -0.22589237987995148, 0.18395093083381653, 0.015596437267959118, 0.08033794909715652, -0.06060224398970604, 0.030689487233757973, -0.06598012894392014, 0.023554693907499313, 0.13232028484344482, 0.032773252576589584, -0.06541554629802704, -0.10848739743232727, -0.10597709566354752, -0.020100660622119904, 0.08198232203722, -0.03565393015742302, 0.08521458506584167, 0.06588457524776459, 0.018747711554169655, -0.018547657877206802, 0.03476055711507797, -0.020521698519587517, -0.15522414445877075, 0.014273961074650288, 0.001617451780475676, -0.049034688621759415, -0.005466049537062645, -0.05399409681558609, -0.08516483753919601, 0.23517224192619324, -0.12787461280822754, -0.09163524955511093, -0.07695929706096649, 0.021960878744721413, 0.11868296563625336, -0.06779838353395462, 0.00797832291573286, 0.02603844366967678, 0.035946160554885864, -0.0588931068778038, -0.0028401720337569714, 0.08196176588535309, -0.06093323975801468, -0.060289718210697174, -0.057762403041124344, 0.1165962666273117, 0.06739767640829086, 0.03445431962609291, -0.009497106075286865, 0.03751087561249733, -0.004256967920809984, -0.09418375790119171, 0.000974896945990622, 0.027093378826975822, 0.15583375096321106, 0.042302269488573074, -0.06740295886993408, -0.07609222829341888, -0.07104988396167755, -0.08151537925004959, 0.14989615976810455, 0.17031903564929962, -0.05373881757259369, 0.040436215698719025, 0.1864854395389557, -0.11891019344329834, -0.18753676116466522, -0.037672631442546844, 0.10119012743234634, 0.09286075830459595, -0.01921507716178894, -0.1661677360534668, 0.011427436955273151, 0.11635472625494003, -0.0011477824300527573, 0.027200771495699883, -0.3834382891654968, -0.13739241659641266, 0.00032055863994173706, 0.04330230504274368, 0.009778800420463085, -0.07845158874988556, -0.038939572870731354, -0.0463801808655262, -0.09372500330209732, 0.046470485627651215, -0.010361605323851109, 0.09999264031648636, 0.016793597489595413, 0.008803277276456356, 0.05616142973303795, -0.04570490121841431, 0.13177745044231415, -0.0023367037065327168, 0.016533024609088898, -0.06969648599624634, 0.08038821816444397, 0.016221346333622932, -0.004655350465327501, 0.16460596024990082, -0.06021954491734505, 0.034390322864055634, -0.15021531283855438, -0.05526004731655121, -0.05178481340408325, 0.03667214885354042, -0.03723018988966942, -0.07868535816669464, 
-0.042097605764865875, 0.03788977861404419, 0.06693776696920395, -0.005310660228133202, 0.02616400085389614, -0.0730663314461708, 0.02535281330347061, 0.18026167154312134, 0.09835044294595718, 0.026861537247896194, -0.0854061171412468, 0.015119028277695179, 0.007698739878833294, 0.05699979141354561, -0.10181805491447449, 0.012832855805754662, 0.15712577104568481, 0.011889633722603321, 0.10618442296981812, -0.026404421776533127, -0.13210716843605042, 0.010884004645049572, 0.07272219657897949, -0.10448567569255829, -0.1358104795217514, -0.025240816175937653, -0.008470254950225353, -0.045517511665821075, 0.0035314534325152636, 0.1033952608704567, -0.10082514584064484, -0.014537964016199112, -0.021087661385536194, 0.043832190334796906, -0.06456046551465988, 0.22778822481632233, 0.009426413103938103, 0.049524515867233276, -0.057948239147663116, 0.1316651850938797, 0.12680818140506744, -0.13419532775878906, 0.03914244845509529, 0.18266882002353668, -0.06812159717082977, -0.04400539770722389, 0.05851336196064949, 0.12191809713840485, -0.01797098107635975, -0.07351459562778473, -0.03315500169992447, -0.033201441168785095, 0.0034603720996528864, -0.025048429146409035, 0.03807732090353966, 0.029200555756688118, -0.010299614630639553, -0.05010483041405678, -0.10759194195270538, 0.10355860739946365, 0.0841977521777153, 0.006495482753962278, -0.02968686819076538, 0.11447583138942719, 0.0013657858362421393, -0.02662583813071251, -0.017668113112449646, 0.025420360267162323, -0.034760452806949615, 0.01757250726222992, -0.058230794966220856, -0.010932974517345428, -0.04145021736621857, -0.007601255550980568, -0.04226209968328476, -0.005577246192842722, -0.015074406750500202, 0.011404813267290592, -0.055058930069208145, -0.033955372869968414, -0.05277256295084953, 0.020396733656525612, -0.09021251648664474, -0.044441886246204376, -0.001889095758087933, -0.02187858708202839, 0.060328565537929535, 0.016417037695646286, -0.011832221411168575, 0.025494007393717766, -0.016484202817082405, 0.07615271210670471, 0.014610971324145794, 0.047332100570201874, 0.014530768617987633, -0.04878577962517738, 0.010081257671117783, 0.044009801000356674, -0.02624647133052349, -0.011300069279968739, 0.013584298081696033, -0.11838339269161224, -0.034467000514268875, -0.03142981976270676, -0.05127047374844551, -0.06260773539543152, 0.12367933243513107, 0.06529903411865234, 0.08044041693210602, 0.11237362027168274, -0.06686846166849136, 0.07801984995603561, -0.13824383914470673, 0.0004976173513568938, 0.04538266733288765, -0.05100139603018761, -0.008439585566520691, -0.01038982905447483, 0.04469497129321098, -0.08822009712457657, 0.12069610506296158, 0.030767129734158516, 0.0700884684920311, 0.010270769707858562, -0.08262436836957932, -0.0009394044172950089, 0.021014975383877754, 0.10500825941562653, -0.040910858660936356, -0.014559952542185783, -0.08474846929311752, 0.08007732033729553, -0.007305923383682966, 0.1192324087023735, 0.035536400973796844, 0.11789478361606598, 0.13948556780815125, 0.059009965509176254, 0.019293352961540222, -0.08257357031106949, -0.08572770655155182, 0.08995397388935089, -0.0018474927637726068, 0.07012457400560379, -0.025947725400328636, 0.1126244068145752, 0.14449994266033173, -0.1570042371749878, 0.0977557823061943, 0.004138821270316839, -0.1001627966761589, -0.06949649751186371, -0.14159972965717316, -0.06080447509884834, -0.03466399759054184, -0.022492069751024246, -0.1291935294866562, 0.024220097810029984, 0.0030219685286283493, 0.0577196404337883, -0.040915828198194504, 0.11032821983098984, 
0.006597303319722414, -0.10831715911626816, 0.07689474523067474, 0.0028797565028071404, 0.10528697818517685, -0.01431382168084383, 0.0040395972318947315, 0.05611556023359299, -0.00222606654278934, 0.03491443023085594, 0.04364714026451111, -0.00002014863093791064, -0.008225973695516586, 0.005030088126659393, -0.057055145502090454, -0.04428112879395485, 0.024973871186375618, 0.08263499289751053, 0.16523154079914093, 0.05442646145820618, -0.06739171594381332, -0.037837691605091095, 0.175162211060524, -0.05203917995095253, -0.07341545820236206, -0.11148975789546967, 0.18686474859714508, 0.022613776847720146, 0.050895221531391144, 0.013886048458516598, -0.10049709677696228, 0.001021828968077898, 0.12119285762310028, 0.18189358711242676, -0.016687124967575073, -0.03744778782129288, 0.012560230679810047, -0.011846042238175869, 0.009768521413207054, 0.03975778445601463, 0.029817307367920876, 0.23672762513160706, -0.08312702924013138, 0.07470695674419403, -0.0659455806016922, 0.04148837924003601, -0.015240742824971676, 0.15002335608005524, 0.0028703685384243727, 0.006757620722055435, -0.0671657919883728, 0.10528673976659775, 0.02027440071105957, -0.14174118638038635, -0.0041471682488918304, -0.09220334142446518, -0.13121818006038666, 0.02109340950846672, 0.02566353604197502, 0.028960663825273514, 0.07982390373945236, 0.00480601005256176, 0.041234102100133896, 0.035240210592746735, 0.017438312992453575, -0.08776979893445969, -0.11383641511201859, -0.009600955061614513, -0.06280604749917984, 0.1003093346953392, 0.01572561077773571, 0.13790717720985413, 0.09553813934326172, 0.022621862590312958, -0.07601478695869446, 0.09552714973688126, 0.029806945472955704, 0.013298344798386097, 0.07460962980985641, 0.10840772837400436, -0.028353052213788033, 0.08317221701145172, 0.03684994578361511, -0.06845825910568237, 0.04675630107522011, -0.059994734823703766, -0.020896567031741142, -0.09694117307662964, 0.0990523025393486, -0.03929764777421951, 0.1371975988149643, 0.19321563839912415, -0.0036131846718490124, -0.011260563507676125, -0.06426919996738434, 0.027432993054389954, -0.013907612301409245, 0.09505059570074081, -0.006437643431127071, -0.21884238719940186, 0.013359684497117996, -0.022101793438196182, 0.040560558438301086, -0.19374729692935944, -0.02268883027136326, 0.021957268938422203, -0.04602803289890289, -0.032562416046857834, 0.07116624712944031, 0.08172948658466339, 0.01438051275908947, -0.023117346689105034, -0.10773619264364243, 0.01236085407435894, 0.09779491275548935, -0.08864329755306244, -0.10442322492599487 ]
null
null
transformers
# legal_t5_small_multitask_sv_cs model

Model for translating legal text from Swedish to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model follows a masked-language-model prediction objective.

## Model description

No pretraining is involved for the legal_t5_small_multitask_sv_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to Czech.

### How to use

Here is how to use this model to translate legal text from Swedish to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_cs"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_sv_cs", do_lower_case=False, skip_special_tokens=True),
    device=0
)

sv_text = "Standarderna för integrerat växtskydd bör tillämpas snabbare än vad kommissionen föreskrivit."

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_cs model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_cs | 45.058|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
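The preprocessing paragraph above describes a unigram vocabulary model trained on 88M lines of the parallel corpus. A training run of that kind is commonly done with SentencePiece; the sketch below shows what it might look like. The input file name, vocabulary size, and other options are hypothetical placeholders, not values published with this model.

```python
# Hedged sketch: training a unigram SentencePiece vocabulary like the one
# described in the Preprocessing section. "parallel_corpus.txt" and
# vocab_size=32000 are assumptions; the original 88M-line corpus and its
# exact settings are not published in this card.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",   # one sentence per line, all language pairs
    model_prefix="legal_t5_spm",   # writes legal_t5_spm.model / legal_t5_spm.vocab
    model_type="unigram",          # the card describes a unigram model
    vocab_size=32000,
    character_coverage=1.0,        # full coverage suits multilingual legal text
)

# Load the trained model and segment a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_spm.model")
print(sp.encode("Standarderna för integrerat växtskydd", out_type=str))
```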
{"language": "Swedish Cszech", "tags": ["translation Swedish Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Standarderna f\u00f6r integrerat v\u00e4xtskydd b\u00f6r till\u00e4mpas snabbare \u00e4n vad kommissionen f\u00f6reskrivit."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_cs model
=========================================

Model for translating legal text from Swedish to Czech. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_cs model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to Czech.

### How to use

Here is how to use this model to translate legal text from Swedish to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_cs model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which data from all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_cs model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06895806640386581, 0.11461571604013443, -0.0027866503223776817, 0.08949822932481766, 0.06491698324680328, -0.009151563979685307, 0.007606681436300278, 0.10169773548841476, -0.037965621799230576, 0.08011715114116669, 0.0650428757071495, 0.0067303297109901905, 0.05966747924685478, 0.0324440561234951, 0.041846804320812225, -0.21360553801059723, 0.0022662184201180935, -0.03639156371355057, -0.04001867398619652, 0.10669784247875214, 0.08887811005115509, -0.050533756613731384, 0.0440029613673687, -0.039439745247364044, -0.06049318239092827, 0.019105788320302963, -0.08053106814622879, -0.031586986035108566, 0.09952464699745178, 0.07049771398305893, 0.07338395714759827, -0.008695351891219616, 0.07173454016447067, -0.1788816750049591, -0.004933226853609085, 0.03631497547030449, -0.0157549399882555, 0.038289692252874374, 0.09059455245733261, 0.017884094268083572, 0.21184693276882172, -0.0646868571639061, 0.02335713990032673, 0.04139220714569092, -0.07978364825248718, -0.11138369888067245, -0.06974240392446518, 0.03243773430585861, 0.0869654193520546, 0.13603398203849792, -0.03876562789082527, 0.03087688237428665, -0.020802835002541542, 0.09093835204839706, 0.11356723308563232, -0.2539586126804352, -0.026927610859274864, 0.0736803337931633, 0.06409163773059845, 0.09925337135791779, -0.050790250301361084, 0.029718969017267227, 0.05664515495300293, 0.07266166806221008, 0.08558179438114166, -0.05054664984345436, -0.014095715247094631, -0.012343375943601131, -0.13134653866291046, -0.053851865231990814, 0.17407983541488647, 0.027329498901963234, -0.02275053970515728, -0.10839489102363586, -0.04693291708827019, -0.06942196935415268, 0.010064175352454185, -0.026800284162163734, 0.008751864545047283, -0.017055248841643333, 0.03495843708515167, -0.07842584699392319, -0.12610481679439545, -0.057989656925201416, -0.0513625405728817, 0.055116310715675354, 0.01362795103341341, 0.018015142530202866, 0.05886337161064148, 0.05996479094028473, -0.12944211065769196, -0.07941202819347382, -0.0033223892096430063, 0.013614572584629059, -0.09610383957624435, 0.014592406339943409, -0.008588217198848724, -0.21658313274383545, 0.0007029232219792902, -0.024627598002552986, -0.058684248477220535, 0.018259743228554726, 0.04925835132598877, 0.03900624066591263, 0.047523487359285355, 0.12815141677856445, -0.10319507867097855, -0.12349897623062134, -0.032990336418151855, -0.02872570790350437, 0.0068317586556077, -0.0000334593205479905, -0.0738351121544838, -0.04089203104376793, 0.02526707388460636, 0.04866895452141762, -0.009158357977867126, 0.014215314760804176, 0.00624597305431962, -0.021470816805958748, 0.100910484790802, -0.098814956843853, 0.0014171453658491373, -0.003029228188097477, -0.08505184948444366, -0.03028772957623005, 0.06419680267572403, -0.027291646227240562, -0.10436457395553589, 0.07689129561185837, -0.03073122911155224, -0.00015385221922770143, -0.08624501526355743, -0.18572911620140076, 0.016759155318140984, -0.021291719749569893, -0.05556371062994003, -0.08877532929182053, -0.11396268010139465, -0.0813991129398346, 0.026964182034134865, -0.05297394469380379, 0.005176632199436426, -0.10066459327936172, -0.01183619536459446, 0.039912011474370956, -0.0261492058634758, 0.07378467917442322, -0.04892072081565857, 0.03248682990670204, -0.03361478075385094, 0.06224346533417702, -0.003288757987320423, 0.023904843255877495, -0.08490048348903656, 0.03827223926782608, -0.09122113138437271, 0.14108829200267792, -0.04523523896932602, 0.0032826499082148075, -0.12622255086898804, -0.05732332170009613, 
-0.063669353723526, 0.04504812881350517, 0.07342874258756638, 0.13120858371257782, -0.20806553959846497, -0.03370409831404686, 0.20094485580921173, -0.08210224658250809, -0.06294712424278259, 0.11157608777284622, -0.015133418142795563, 0.03284763544797897, 0.08709091693162918, 0.11439643055200577, 0.024856416508555412, -0.04786590114235878, -0.06742124259471893, 0.012215404771268368, 0.007594813127070665, 0.0028012534603476524, 0.10185940563678741, -0.07275867462158203, 0.10559697449207306, 0.020582126453518867, 0.04450312629342079, 0.002216908149421215, -0.029056604951620102, -0.035395823419094086, 0.0013193241320550442, -0.03650045394897461, -0.04430040717124939, 0.014011211693286896, 0.016977908089756966, -0.07558532059192657, -0.06651166081428528, 0.04273020848631859, 0.08378350734710693, -0.07292971760034561, 0.04238663986325264, 0.03918773680925369, -0.046506911516189575, -0.12298418581485748, 0.015942389145493507, -0.14969176054000854, -0.009463582187891006, 0.015424135141074657, -0.013710304163396358, 0.09738649427890778, 0.05210414156317711, 0.06484153866767883, 0.08925574272871017, -0.05461281165480614, -0.010577818378806114, -0.011072204448282719, -0.019468827173113823, -0.10814274102449417, -0.10890401154756546, -0.0366697758436203, -0.015313670039176941, 0.02190457284450531, -0.14926926791667938, 0.00437067449092865, -0.018039243295788765, 0.08413354307413101, 0.006868251133710146, -0.03407028689980507, 0.03752458468079567, 0.05650041252374649, -0.02480020374059677, -0.027641702443361282, 0.03503463789820671, -0.010490890592336655, -0.06962030380964279, 0.11840598285198212, -0.09961272776126862, -0.10791177302598953, 0.09735162556171417, 0.03206265717744827, -0.09074234217405319, -0.017913874238729477, -0.005782554391771555, -0.04985741153359413, -0.050251394510269165, -0.06826920062303543, 0.17457768321037292, 0.060591064393520355, 0.144656240940094, -0.11035235971212387, -0.05065026506781578, 0.02172757126390934, -0.04028156027197838, -0.01659569889307022, 0.17271845042705536, 0.029929552227258682, -0.19028900563716888, 0.09396252036094666, 0.0005174616235308349, -0.01972460187971592, 0.17982450127601624, 0.06319228559732437, -0.10760575532913208, -0.010732119902968407, 0.03141118213534355, -0.0007023669313639402, 0.0647454485297203, -0.06983356922864914, -0.022985966876149178, 0.034767039120197296, 0.06548479944467545, 0.06552573293447495, -0.08904194831848145, 0.05803246796131134, 0.05991370230913162, -0.04525537043809891, 0.07317287474870682, -0.029766280204057693, -0.058117374777793884, 0.09469939023256302, 0.014020517468452454, -0.02599937841296196, -0.04652387276291847, -0.04130525141954422, -0.1152222752571106, 0.19158712029457092, -0.0976535975933075, -0.23677729070186615, -0.15336285531520844, 0.031552255153656006, -0.06300882995128632, 0.021181393414735794, 0.046130429953336716, -0.04998580738902092, -0.050472162663936615, -0.1144934818148613, 0.09369344264268875, -0.08113529533147812, -0.05231860280036926, -0.09969949722290039, 0.045082613825798035, -0.0198513213545084, -0.13653668761253357, 0.01700996607542038, -0.012782849371433258, -0.03241543471813202, 0.00319680362008512, -0.045628879219293594, 0.11709075421094894, 0.13621817529201508, -0.01880311407148838, -0.022089829668402672, 0.00877441093325615, 0.1186753362417221, -0.0568280965089798, 0.053663596510887146, 0.03608906641602516, 0.03704207390546799, 0.019667433574795723, 0.12538005411624908, 0.04264933988451958, -0.03852624446153641, 0.0294815506786108, 0.05909784883260727, -0.04263841360807419, 
-0.2568032741546631, -0.10524950921535492, -0.06204180046916008, -0.002111532259732485, 0.08580151945352554, 0.0525122694671154, -0.056725677102804184, 0.02042892388999462, -0.039283741265535355, -0.011188716627657413, 0.014367245137691498, 0.05895788595080376, 0.029240641742944717, -0.023417377844452858, 0.08397427201271057, -0.05755571275949478, -0.014209209010004997, 0.09066138416528702, 0.03154059872031212, 0.1747216433286667, -0.04273895174264908, 0.22360044717788696, 0.050398897379636765, 0.03733689710497856, 0.0035384746734052896, 0.08313784003257751, -0.040918923914432526, 0.01759904809296131, -0.01020881999284029, -0.06994440406560898, -0.02152569405734539, 0.06614188849925995, 0.007290390785783529, 0.012099925428628922, -0.04813404381275177, -0.047747522592544556, 0.07530895620584488, 0.22628703713417053, 0.07330241799354553, -0.15618896484375, -0.08354154974222183, 0.004071736708283424, -0.07678145915269852, -0.06671592593193054, 0.0005375194596126676, 0.16604037582874298, -0.10235272347927094, 0.008762815035879612, 0.01163078285753727, 0.1323506236076355, -0.10434969514608383, -0.024988124147057533, 0.00632185535505414, 0.013666643761098385, -0.026496803387999535, 0.11934436857700348, -0.2246433049440384, 0.18360844254493713, 0.026557711884379387, 0.059426743537187576, -0.05269506201148033, 0.008361670188605785, -0.04775841534137726, -0.008813983760774136, 0.12431030720472336, 0.04021456465125084, -0.07818232476711273, -0.10254336893558502, -0.10561591386795044, -0.020684747025370598, 0.05989914387464523, -0.06626035273075104, 0.09907282143831253, 0.06463094055652618, 0.002316531492397189, -0.027620814740657806, 0.05518491566181183, -0.0006139904144220054, -0.15442754328250885, -0.009470555931329727, -0.014911754056811333, -0.049790434539318085, -0.00839341152459383, -0.05259903892874718, -0.06216667965054512, 0.2358694225549698, -0.12323621660470963, -0.10176333785057068, -0.08834929019212723, 0.024819636717438698, 0.11353979259729385, -0.07770438492298126, 0.02743772231042385, 0.017042038962244987, 0.04948071390390396, -0.06922859698534012, -0.027628149837255478, 0.07707951962947845, -0.050321582704782486, -0.07222133129835129, -0.028250733390450478, 0.16696226596832275, 0.05491682142019272, 0.050506312400102615, -0.019620394334197044, 0.06227196753025055, 0.006552301347255707, -0.10926450788974762, -0.010889168828725815, 0.054823897778987885, 0.13850091397762299, 0.05390721559524536, -0.05691191926598549, -0.06596579402685165, -0.05805247649550438, -0.08104105293750763, 0.17426973581314087, 0.17265743017196655, -0.05779006704688072, 0.05757581815123558, 0.17270177602767944, -0.10862091928720474, -0.19094844162464142, -0.04996498301625252, 0.08176654577255249, 0.08325587213039398, -0.0206461139023304, -0.15750032663345337, 0.0273636132478714, 0.11898874491453171, -0.0031338189728558064, 0.03285795450210571, -0.34416958689689636, -0.14925764501094818, 0.014584952965378761, 0.035758838057518005, 0.0012479020515456796, -0.09948360919952393, -0.05514674633741379, -0.06137670949101448, -0.09754735231399536, 0.10721661895513535, -0.04016527533531189, 0.11158724129199982, 0.0009111910476349294, 0.0329853817820549, 0.04823075234889984, -0.0480242557823658, 0.1286766678094864, 0.010399180464446545, 0.018999429419636726, -0.0700092762708664, 0.05957994982600212, 0.016633618623018265, -0.0273336973041296, 0.1782924085855484, -0.07379793375730515, 0.043440088629722595, -0.15485619008541107, -0.06078305467963219, -0.07252220064401627, 0.035684734582901, -0.03808373957872391, 
-0.07741249352693558, -0.059018224477767944, 0.03355484455823898, 0.054917868226766586, -0.019604787230491638, 0.05235523730516434, -0.0885154977440834, 0.04831792786717415, 0.18770819902420044, 0.07776419818401337, 0.011572865769267082, -0.09269840270280838, 0.011189642362296581, -0.00765728484839201, 0.05900869891047478, -0.13445615768432617, -0.0015046194894239306, 0.14734846353530884, 0.03883161023259163, 0.11145743727684021, -0.03213891759514809, -0.12614737451076508, 0.005176493898034096, 0.05726082623004913, -0.10671447217464447, -0.10055893659591675, -0.016190527006983757, 0.0020617519039660692, -0.05330044403672218, -0.015099409967660904, 0.125735342502594, -0.0925808846950531, -0.01802927255630493, -0.008788106963038445, 0.03357860818505287, -0.06713349372148514, 0.21877866983413696, 0.04284536466002464, 0.06143403425812721, -0.06502168625593185, 0.11688356846570969, 0.10280924290418625, -0.10062026977539062, 0.04685753211379051, 0.19906549155712128, -0.07792403548955917, -0.04921995848417282, 0.030488548800349236, 0.1181381493806839, -0.05998590961098671, -0.0508989617228508, -0.014151066541671753, -0.03797614574432373, 0.015964239835739136, 0.024977903813123703, 0.05090298131108284, 0.03443460911512375, -0.0074449763633310795, -0.04690348356962204, -0.07916787266731262, 0.09851828217506409, 0.04772480949759483, -0.0027556633576750755, -0.0221511572599411, 0.10130837559700012, -0.0007704620948061347, 0.007424469571560621, -0.022539213299751282, 0.012331012636423111, -0.04633846879005432, 0.009541944600641727, -0.06649406999349594, 0.000160491734277457, -0.06745977699756622, -0.01320634875446558, -0.042177535593509674, 0.005187734961509705, -0.015628615394234657, 0.0041707176715135574, -0.04820806905627251, -0.0430888831615448, -0.07271824032068253, 0.019160261377692223, -0.09519698470830917, -0.046370428055524826, -0.0073550995439291, -0.018180372193455696, 0.05560806021094322, 0.03330325335264206, 0.00010070356802316383, 0.054871730506420135, -0.011812000535428524, 0.06520933657884598, 0.023171301931142807, 0.040880024433135986, 0.01591249741613865, -0.032354097813367844, -0.004564621485769749, 0.03207731992006302, -0.0028545339591801167, 0.002078508958220482, 0.0033907368779182434, -0.12647280097007751, -0.06319459527730942, -0.05137442424893379, -0.026048049330711365, -0.06618279218673706, 0.10546602308750153, 0.06912925094366074, 0.07564030587673187, 0.10379835963249207, -0.07252025604248047, 0.0765439122915268, -0.16482636332511902, -0.0026720422320067883, 0.019177263602614403, -0.04148181155323982, -0.026172326877713203, -0.0008091347990557551, 0.0440332293510437, -0.08025868982076645, 0.1364990770816803, 0.016391342505812645, 0.0515008307993412, 0.026134664192795753, -0.05726304277777672, 0.01897675171494484, 0.018626125529408455, 0.12531839311122894, -0.02143055759370327, -0.022804349660873413, -0.05787532404065132, 0.08475155383348465, -0.011180734261870384, 0.10587672144174576, 0.06422863155603409, 0.10608049482107162, 0.1334242820739746, 0.05625668913125992, 0.020608972758054733, -0.06695636361837387, -0.06084173172712326, 0.054260510951280594, -0.0010019004112109542, 0.06837989389896393, -0.01726268231868744, 0.09385613352060318, 0.1655462384223938, -0.178263857960701, 0.11623960733413696, 0.008252516388893127, -0.08505356311798096, -0.06251418590545654, -0.14771485328674316, -0.07009807229042053, -0.026616688817739487, -0.02719835378229618, -0.12669619917869568, 0.00847594067454338, 0.047805480659008026, 0.06345579028129578, -0.019155332818627357, 
0.10544998198747635, -0.004281691741198301, -0.10450497269630432, 0.06667675822973251, 0.005332032218575478, 0.06941910088062286, -0.011057928204536438, 0.033801883459091187, 0.06521529704332352, -0.0018460622522979975, 0.03529401496052742, 0.050682805478572845, -0.002017147606238723, -0.006108447443693876, 0.00004037008329760283, -0.06961086392402649, -0.03815614804625511, 0.03212673217058182, 0.08982378989458084, 0.1675519496202469, 0.0607869029045105, -0.09439068287611008, -0.031604427844285965, 0.18905550241470337, -0.051156025379896164, -0.09063473343849182, -0.102967269718647, 0.19734756648540497, 0.021350696682929993, 0.044130679219961166, -0.005883411038666964, -0.09349287301301956, 0.002188815036788583, 0.10952702909708023, 0.20017436146736145, -0.01645245961844921, -0.023506928235292435, -0.0008913064375519753, -0.012284956872463226, 0.009475065395236015, 0.02796008437871933, 0.02701669931411743, 0.2591331899166107, -0.07453148066997528, 0.06523478031158447, -0.06728260964155197, 0.002657202770933509, -0.0052475715056061745, 0.13694703578948975, -0.003121810033917427, 0.0031030126847326756, -0.04858837649226189, 0.09572633355855942, -0.012885193340480328, -0.1624833345413208, -0.007742171175777912, -0.09623127430677414, -0.12692464888095856, 0.012595392763614655, -0.01731019653379917, 0.04629050940275192, 0.06587512791156769, 0.014203818514943123, 0.050545234233140945, 0.0027507958002388477, 0.020598154515028, -0.112888865172863, -0.11882230639457703, 0.013286779634654522, -0.007694234140217304, 0.07796923816204071, 0.011218233034014702, 0.10900332778692245, 0.0922359973192215, 0.01931316964328289, -0.06997229903936386, 0.1017962098121643, 0.03669852018356323, 0.003388236742466688, 0.07630959153175354, 0.12489425390958786, -0.013834209181368351, 0.05723477154970169, 0.037965256720781326, -0.07490224391222, 0.025583403185009956, -0.04014940187335014, -0.01903381012380123, -0.11146646738052368, 0.08845129609107971, -0.03459591791033745, 0.14229288697242737, 0.18124374747276306, -0.010537019930779934, -0.012258067727088928, -0.061686765402555466, 0.02337179332971573, -0.015253862366080284, 0.07146379351615906, -0.006652329582720995, -0.20519229769706726, 0.018721826374530792, -0.04843771830201149, 0.020209237933158875, -0.23103156685829163, -0.037844929844141006, 0.020771000534296036, -0.04443025588989258, -0.01985923945903778, 0.08201827108860016, 0.06734967976808548, 0.02224620245397091, -0.027389736846089363, -0.06570526957511902, 0.01727827824652195, 0.10102350264787674, -0.11927232146263123, -0.11504382640123367 ]
null
null
transformers
# legal_t5_small_multitask_sv_de model

Model for translating legal text from Swedish to German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_sv_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to German.

### How to use

Here is how to use this model to translate legal text from Swedish to German in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_de"),
tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_multitask_sv_de", do_lower_case=False, 
                                            skip_special_tokens=True),
    device=0
)

sv_text = "Kan kommissionen bekräfta att i Olaf‑handlingar som samlats in inom ramen för denna granskning, daterade mellan 2000 och 2004, kan följande information hittas: —"

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_de model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which data from all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_de | 44.684|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
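The training-procedure paragraph above names AdaFactor with an inverse square root learning rate schedule. The sketch below shows one way to wire that combination up in PyTorch with the `transformers` implementation of Adafactor; the warmup length and the externally driven base rate are illustrative assumptions, since the card does not state the authors' values.

```python
# A minimal sketch under stated assumptions, not the authors' training code.
import math
import torch
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor

model = T5ForConditionalGeneration.from_pretrained("SEBIS/legal_t5_small_multitask_sv_de")

# Drive the learning rate externally, so Adafactor's relative_step is disabled.
optimizer = Adafactor(
    model.parameters(),
    lr=1.0,                  # base rate 1.0; the schedule below supplies the decay
    relative_step=False,
    scale_parameter=False,
    warmup_init=False,
)

warmup_steps = 10_000        # assumption; not stated in the card
# Inverse square root schedule: flat at 1/sqrt(warmup_steps) during warmup,
# then decaying as 1/sqrt(step).
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lambda step: 1.0 / math.sqrt(max(step, warmup_steps)),
)
```

With a base rate of 1.0 this reproduces the classic T5-style recipe lr(step) = 1/sqrt(max(step, warmup_steps)).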
{"language": "Swedish Deustch", "tags": ["translation Swedish Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Kan kommissionen bekr\u00e4fta att i Olaf\u2011handlingar som samlats in inom ramen f\u00f6r denna granskning, daterade mellan 2000 och 2004, kan f\u00f6ljande information hittas: \u2014"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_de model
=========================================

Model for translating legal text from Swedish to German. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_de model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to German.

### How to use

Here is how to use this model to translate legal text from Swedish to German in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_de model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which data from all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 200, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_de model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06698498129844666, 0.11174913495779037, -0.003118400229141116, 0.09867140650749207, 0.06737207621335983, -0.008542461320757866, 0.00874262023717165, 0.09783812612295151, -0.05191660299897194, 0.07342810928821564, 0.06682855635881424, 0.011029142886400223, 0.061142995953559875, 0.029246866703033447, 0.042785126715898514, -0.22278209030628204, 0.0030374391935765743, -0.03429390490055084, -0.04738067835569382, 0.09077981114387512, 0.09728600829839706, -0.04489941522479057, 0.04416661709547043, -0.04456013813614845, -0.04681555554270744, 0.0365968681871891, -0.08692830055952072, -0.030118804425001144, 0.10687053203582764, 0.07531579583883286, 0.07349731028079987, -0.009300051257014275, 0.07440455257892609, -0.17595809698104858, -0.006510134786367416, 0.0438997708261013, -0.014552616514265537, 0.026576418429613113, 0.09076260030269623, 0.023226259276270866, 0.20643286406993866, -0.06748583167791367, 0.030583128333091736, 0.038270413875579834, -0.0835600197315216, -0.13367877900600433, -0.06880821287631989, 0.01689155213534832, 0.08409292250871658, 0.1405584216117859, -0.04185164347290993, 0.0144277885556221, -0.012709246948361397, 0.08615369349718094, 0.09490154683589935, -0.23433341085910797, -0.023596785962581635, 0.07025876641273499, 0.0692099928855896, 0.09665528684854507, -0.041893355548381805, 0.023698601871728897, 0.06465120613574982, 0.08623502403497696, 0.08095622807741165, -0.048317596316337585, 0.005042580422013998, -0.01576804369688034, -0.12605762481689453, -0.04379405081272125, 0.16365909576416016, 0.02398303523659706, -0.028303515166044235, -0.10649281740188599, -0.04544822499155998, -0.06693894416093826, 0.016026638448238373, -0.05110671743750572, 0.015890005975961685, -0.012073708698153496, 0.046476829797029495, -0.0774686262011528, -0.11721159517765045, -0.05671150982379913, -0.06184624880552292, 0.043935682624578476, 0.020993296056985855, 0.010382178239524364, 0.04619789496064186, 0.06067647412419319, -0.13909190893173218, -0.08403816819190979, -0.007486648857593536, 0.01067402120679617, -0.08046381920576096, 0.015072119422256947, -0.015247886069118977, -0.22406324744224548, 0.007148048374801874, -0.006497993133962154, -0.07322856783866882, 0.015497141517698765, 0.039437711238861084, 0.029591932892799377, 0.05433866009116173, 0.1256464421749115, -0.10741187632083893, -0.13917869329452515, -0.02561005763709545, -0.02806692197918892, 0.014627453871071339, 0.007651151157915592, -0.07356545329093933, -0.04373783990740776, 0.030535047873854637, 0.04791497066617012, -0.005870780907571316, 0.006659666076302528, 0.0025241554249078035, -0.02217908762395382, 0.11133960634469986, -0.09978737682104111, -0.002005238551646471, -0.009600025601685047, -0.08855907618999481, -0.025391023606061935, 0.060299135744571686, -0.02363624796271324, -0.10200440883636475, 0.07671695947647095, -0.02305048517882824, -0.014840057119727135, -0.08779453486204147, -0.1972855031490326, 0.015307745896279812, -0.02114490047097206, -0.04116317257285118, -0.09800269454717636, -0.130392923951149, -0.08422280102968216, 0.025655966252088547, -0.06042689085006714, 0.005486094392836094, -0.10438580811023712, -0.006927632726728916, 0.03735804557800293, -0.02728145383298397, 0.0634772777557373, -0.04902348667383194, 0.029926203191280365, -0.021044716238975525, 0.06792432069778442, -0.012067082338035107, 0.03533290699124336, -0.08044413477182388, 0.03718860447406769, -0.0919450893998146, 0.15394753217697144, -0.03982233256101608, 0.005576076917350292, -0.1282557100057602, -0.0576266385614872, -0.06853459030389786, 
0.054109837859869, 0.0857062116265297, 0.1292247176170349, -0.22245927155017853, -0.03254921734333038, 0.19205528497695923, -0.08442982286214828, -0.055643483996391296, 0.11443696916103363, -0.019273092970252037, 0.0355227030813694, 0.09841220080852509, 0.11590684950351715, 0.02829638309776783, -0.04473208263516426, -0.062438689172267914, 0.006426404230296612, 0.00051029899623245, 0.018929222598671913, 0.09509807080030441, -0.07344090938568115, 0.11163920909166336, 0.019025426357984543, 0.05476733669638634, 0.00403343141078949, -0.020856954157352448, -0.02929985709488392, 0.010827514342963696, -0.037411727011203766, -0.04848311096429825, 0.014003632590174675, 0.012104450725018978, -0.0783902183175087, -0.06781162321567535, 0.06336800754070282, 0.08134139329195023, -0.07180196046829224, 0.03697889298200607, 0.04682498425245285, -0.0617964044213295, -0.11948433518409729, 0.019619658589363098, -0.14133363962173462, -0.012608482502400875, 0.009816158562898636, -0.034681886434555054, 0.09519720077514648, 0.062199998646974564, 0.06543221324682236, 0.08864806592464447, -0.058305371552705765, -0.010369065217673779, -0.024148277938365936, -0.014771712012588978, -0.09894533455371857, -0.10756408423185349, -0.021577909588813782, -0.021699899807572365, -0.00013081317592877895, -0.14235156774520874, 0.004604500252753496, -0.027309447526931763, 0.08328043669462204, 0.009137590415775776, -0.026356637477874756, 0.03118445724248886, 0.06028330698609352, -0.03002203069627285, -0.03504178673028946, 0.026121988892555237, -0.017530059441924095, -0.06262928247451782, 0.11243932694196701, -0.09569054841995239, -0.10115097463130951, 0.10085766017436981, 0.03392971307039261, -0.0982021689414978, 0.002812800696119666, -0.000597648904658854, -0.05637023225426674, -0.04786897078156471, -0.08151566982269287, 0.1858617514371872, 0.05669785663485527, 0.14282193779945374, -0.10577208548784256, -0.046945054084062576, 0.017766814678907394, -0.02832445502281189, -0.0259007029235363, 0.16856516897678375, 0.029277019202709198, -0.18756340444087982, 0.09785911440849304, 0.002738296752795577, -0.019147906452417374, 0.17148876190185547, 0.0682985782623291, -0.10760533809661865, -0.0058427839539945126, 0.02535192109644413, -0.0021360390819609165, 0.05916747823357582, -0.07000089436769485, -0.01748601347208023, 0.02643960528075695, 0.07454752922058105, 0.07083442062139511, -0.08063793182373047, 0.05750024691224098, 0.06434693932533264, -0.04683994501829147, 0.06707842648029327, -0.026660224422812462, -0.06140570715069771, 0.09217703342437744, 0.022203488275408745, -0.03317006677389145, -0.03935299068689346, -0.03580326586961746, -0.10688062012195587, 0.1906345784664154, -0.1036449447274208, -0.24667611718177795, -0.1451215296983719, 0.03294702619314194, -0.0686986967921257, 0.026686839759349823, 0.05717679485678673, -0.05693037807941437, -0.05583576112985611, -0.10659408569335938, 0.1188090443611145, -0.08949954807758331, -0.05426449701189995, -0.09859007596969604, 0.04576786234974861, -0.013476897031068802, -0.14735130965709686, 0.016329806298017502, -0.008103698492050171, -0.020074140280485153, 0.0011192192323505878, -0.04018620401620865, 0.11286462098360062, 0.12231742590665817, -0.015508160926401615, -0.03038693219423294, 0.010163416154682636, 0.12822721898555756, -0.05880161374807358, 0.05656292662024498, 0.04284776374697685, 0.04096744954586029, 0.019547291100025177, 0.13905072212219238, 0.04088198393583298, -0.05224023759365082, 0.03632623329758644, 0.07074394822120667, -0.03772393986582756, -0.26245641708374023, 
-0.09455350041389465, -0.056641750037670135, -0.004427000414580107, 0.08391016721725464, 0.053921375423669815, -0.04995252937078476, 0.011502652429044247, -0.03522466495633125, 0.009379812516272068, 0.01903998851776123, 0.05624338611960411, 0.057627201080322266, -0.02800307609140873, 0.08571161329746246, -0.056929249316453934, -0.02772657759487629, 0.0956072136759758, 0.04097505286335945, 0.1693846732378006, -0.03850346431136131, 0.20070242881774902, 0.047924961894750595, 0.007840489037334919, -0.0076372879557311535, 0.07997054606676102, -0.03937575966119766, 0.019206134602427483, -0.018125666305422783, -0.06514378637075424, -0.009870203211903572, 0.07101856172084808, -0.0008552715880796313, 0.003949990961700678, -0.042520131915807724, -0.04230867326259613, 0.07010824233293533, 0.2108699381351471, 0.08133957535028458, -0.16680359840393066, -0.08289780467748642, 0.006914176978170872, -0.07589046657085419, -0.07549300044775009, 0.0063653262332081795, 0.1440960317850113, -0.08778972178697586, 0.02061009593307972, 0.01772141084074974, 0.13138911128044128, -0.11136341840028763, -0.017833007499575615, 0.013342236168682575, 0.027105489745736122, -0.024188166484236717, 0.11139418929815292, -0.24607202410697937, 0.16490758955478668, 0.02376781404018402, 0.06487919390201569, -0.04298904910683632, 0.015664946287870407, -0.04743901267647743, -0.008636699058115482, 0.11931030452251434, 0.03694247826933861, -0.07205713540315628, -0.09328847378492355, -0.09605120122432709, -0.020098645240068436, 0.051097843796014786, -0.05897410586476326, 0.08975781500339508, 0.06958363950252533, 0.010167052038013935, -0.02278370037674904, 0.05407978594303131, -0.01826782152056694, -0.15319907665252686, -0.010507345199584961, -0.025920767337083817, -0.0347098708152771, -0.005276163574308157, -0.05194705352187157, -0.06819171458482742, 0.23020395636558533, -0.12018810212612152, -0.11328045278787613, -0.08504953235387802, 0.0361713171005249, 0.10910727828741074, -0.07497608661651611, 0.028759470209479332, 0.022441763430833817, 0.03779023885726929, -0.07109805196523666, -0.03282838314771652, 0.08170071989297867, -0.04697854816913605, -0.06061948090791702, -0.03319595381617546, 0.14990414679050446, 0.06235291436314583, 0.040885839611291885, -0.0183061845600605, 0.05842363461852074, 0.006764785386621952, -0.1051095724105835, -0.0058632101863622665, 0.059389468282461166, 0.13363346457481384, 0.04143191874027252, -0.049662500619888306, -0.06467582285404205, -0.06050723418593407, -0.08417858183383942, 0.16110527515411377, 0.16826142370700836, -0.05340718850493431, 0.0500352568924427, 0.17313790321350098, -0.10681308805942535, -0.19418488442897797, -0.0463079996407032, 0.08328065276145935, 0.08561019599437714, -0.016627976670861244, -0.16944867372512817, 0.023529451340436935, 0.12614057958126068, 0.0010920820059254766, 0.04900582879781723, -0.3414076566696167, -0.1520317792892456, 0.024422110989689827, 0.030296852812170982, 0.008056296966969967, -0.09925844520330429, -0.05032084137201309, -0.07380635291337967, -0.09230873733758926, 0.0911797434091568, -0.034823477268218994, 0.10281696170568466, -0.0006126069347374141, 0.040475692600011826, 0.04632524028420448, -0.04075092822313309, 0.13463634252548218, 0.011663972400128841, 0.026472173631191254, -0.0742843747138977, 0.07054252922534943, 0.009323074482381344, -0.02722136862576008, 0.1848408728837967, -0.06602462381124496, 0.04300984367728233, -0.1427331119775772, -0.07104294747114182, -0.0628521665930748, 0.038463033735752106, -0.03635761886835098, -0.0879514142870903, 
-0.05740788206458092, 0.032767705619335175, 0.049371544271707535, -0.026680434122681618, 0.04141068086028099, -0.08286711573600769, 0.03140471130609512, 0.17104975879192352, 0.07215802371501923, 0.023552579805254936, -0.08847758919000626, 0.018784543499350548, -0.009234814904630184, 0.0644444078207016, -0.1366693675518036, 0.0032500082161277533, 0.1512233316898346, 0.025642335414886475, 0.11506161093711853, -0.032570719718933105, -0.13113534450531006, 0.0107793640345335, 0.05684779956936836, -0.09447213262319565, -0.11321356892585754, -0.016463447362184525, -0.020859183743596077, -0.04202433302998543, -0.006168660707771778, 0.13126152753829956, -0.0957823097705841, -0.0175164844840765, -0.008042545057833195, 0.03929530456662178, -0.0714493915438652, 0.2219521701335907, 0.0376693494617939, 0.05213693156838417, -0.06121592968702316, 0.12521912157535553, 0.10285338014364243, -0.11477507650852203, 0.05849674716591835, 0.19335466623306274, -0.07071947306394577, -0.05184439942240715, 0.014071827754378319, 0.12721297144889832, -0.0642823725938797, -0.05630314350128174, -0.016276253387331963, -0.0425577349960804, 0.02175670675933361, 0.001751222531311214, 0.04440046474337578, 0.037901993840932846, -0.008997869677841663, -0.04184732958674431, -0.07972189784049988, 0.0982869416475296, 0.04962852597236633, 0.002958819270133972, -0.028791263699531555, 0.08028280735015869, -0.001573404879309237, 0.002251832513138652, -0.021402914077043533, 0.01973654143512249, -0.044780127704143524, 0.003948821686208248, -0.09746469557285309, -0.004866389557719231, -0.06268095970153809, -0.004585896153002977, -0.04117072373628616, 0.007041244767606258, -0.012984457425773144, 0.007619715761393309, -0.03933682665228844, -0.052420955151319504, -0.06811203807592392, 0.01734660379588604, -0.09754166752099991, -0.05528708174824715, -0.01648802123963833, -0.016526147723197937, 0.058277811855077744, 0.02651742286980152, 0.004563115071505308, 0.0452878512442112, 0.009272392839193344, 0.0724325031042099, 0.028611956164240837, 0.04191944748163223, 0.016914349049329758, -0.042080093175172806, -0.008455398492515087, 0.0320645235478878, -0.01266011968255043, -0.005261947400867939, 0.0008462709956802428, -0.1297704130411148, -0.05800897628068924, -0.05264759808778763, -0.026767853647470474, -0.06323204189538956, 0.09839998185634613, 0.0710894837975502, 0.08074174076318741, 0.1070081889629364, -0.07755061984062195, 0.07616962492465973, -0.15901398658752441, -0.004451202228665352, 0.019162194803357124, -0.03594469651579857, -0.012776319868862629, 0.0007642224663868546, 0.04579061269760132, -0.08169741928577423, 0.14522281289100647, 0.020970717072486877, 0.055888596922159195, 0.0268512275069952, -0.07670220732688904, 0.010301330126821995, 0.02082664892077446, 0.12483876943588257, -0.02707732282578945, -0.022001707926392555, -0.08604154735803604, 0.08677134662866592, -0.001409433432854712, 0.11773580312728882, 0.05506391078233719, 0.12306807935237885, 0.12728777527809143, 0.05274851992726326, 0.02284771203994751, -0.0704101100564003, -0.07682470232248306, 0.05358777195215225, 0.015374444425106049, 0.062373947352170944, -0.006728440057486296, 0.07516083866357803, 0.16541552543640137, -0.17069698870182037, 0.11471829563379288, 0.0038933446630835533, -0.08199407160282135, -0.06303650885820389, -0.1337457001209259, -0.0694926381111145, -0.02984648197889328, -0.029721150174736977, -0.12637460231781006, 0.005407135002315044, 0.029714783653616905, 0.06603698432445526, -0.027339834719896317, 0.10883975774049759, -0.00036965362960472703, 
-0.10979834944009781, 0.06191820278763771, 0.00795144122093916, 0.06503986567258835, 0.007020409684628248, 0.03619629889726639, 0.06152472645044327, 0.009891207329928875, 0.026038771495223045, 0.046901509165763855, -0.009685633704066277, -0.006241153925657272, -0.0011668010847643018, -0.05888177081942558, -0.04057459160685539, 0.0422409288585186, 0.07077853381633759, 0.17873969674110413, 0.06565739959478378, -0.09957554191350937, -0.023848894983530045, 0.17093954980373383, -0.0467362143099308, -0.09856627881526947, -0.10128997266292572, 0.21145382523536682, 0.03543976694345474, 0.05940987169742584, -0.009044886566698551, -0.10243019461631775, -0.003084431169554591, 0.09235969930887222, 0.21242426335811615, -0.01394394040107727, -0.029770812019705772, -0.0002431460452498868, -0.011450368911027908, 0.008339481428265572, 0.02386178821325302, 0.03831970691680908, 0.2585453391075134, -0.06783562898635864, 0.06410257518291473, -0.06798767298460007, 0.012378597632050514, -0.003597910748794675, 0.13909950852394104, 0.0036011438351124525, 0.006043489556759596, -0.04462943598628044, 0.10167191922664642, -0.005956082139164209, -0.17319218814373016, -0.0061425818130373955, -0.09196639060974121, -0.12698523700237274, 0.012538216076791286, -0.026960035786032677, 0.0409676693379879, 0.06337402015924454, 0.0186007097363472, 0.05485319718718529, 0.01607438549399376, 0.015821637585759163, -0.11280766874551773, -0.10706044733524323, 0.00672807777300477, 0.019225578755140305, 0.06346289813518524, 0.010259444825351238, 0.10520324856042862, 0.09571708738803864, 0.02863251604139805, -0.07255309820175171, 0.10988426953554153, 0.03246024623513222, 0.02196955680847168, 0.07630991190671921, 0.10970926284790039, -0.015220396220684052, 0.04318800941109657, 0.0406571589410305, -0.07342909276485443, 0.016738159582018852, -0.030585838481783867, -0.03223661333322525, -0.11514250189065933, 0.09901387244462967, -0.03659184277057648, 0.1363077461719513, 0.17896269261837006, -0.0056197321973741055, -0.006801992654800415, -0.06177268549799919, 0.026863139122724533, -0.008131188340485096, 0.07690176367759705, -0.007419855333864689, -0.19708514213562012, 0.020577453076839447, -0.04140095412731171, 0.02122334949672222, -0.23402278125286102, -0.03810025379061699, 0.024822702631354332, -0.04072488844394684, -0.016013875603675842, 0.07963239401578903, 0.06044430285692215, 0.02268790453672409, -0.032629482448101044, -0.0639011338353157, 0.006315303035080433, 0.1025901585817337, -0.11534025520086288, -0.11853472143411636 ]
null
null
transformers
# legal_t5_small_multitask_sv_en model

Model for translating legal text from Swedish to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_sv_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to English.

### How to use

Here is how to use this model to translate legal text from Swedish to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_en"),
tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_multitask_sv_en", do_lower_case=False, 
                                            skip_special_tokens=True),
    device=0
)

sv_text = "inlämnat av följande ledamöter:"

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_en model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which data from all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_en | 36.195|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
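The preprocessing note above says the vocabulary comes from a unigram model trained on 88M lines of the parallel corpus. Below is a minimal sketch of that step with the `sentencepiece` package; the corpus file name and the vocabulary size are assumptions, as the card does not specify them.

```python
# A minimal sketch under stated assumptions, not the authors' exact setup:
# parallel_corpus.txt is a hypothetical file with one sentence per line,
# and vocab_size is illustrative (the card does not state a size).
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_vocab",
    model_type="unigram",        # the unigram model the card describes
    vocab_size=32_000,
    character_coverage=1.0,      # keep all characters across the language pairs
)

# Tokenize the widget example from this card with the resulting vocabulary.
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("inlämnat av följande ledamöter:", out_type=str))
```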
{"language": "Swedish English", "tags": ["translation Swedish English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "inl\u00e4mnat av f\u00f6ljande ledam\u00f6ter:"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_en model
=========================================

Model for translating legal text from Swedish to English. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_en model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to English.

### How to use

Here is how to use this model to translate legal text from Swedish to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_en model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which data from all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_en model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.058688051998615265, 0.11715979129076004, -0.0037083602510392666, 0.08419007807970047, 0.052181217819452286, -0.005152542144060135, 0.020043058320879936, 0.09307496249675751, -0.08041208982467651, 0.06310098618268967, 0.05751346796751022, 0.007723903749138117, 0.07753987610340118, 0.024035435169935226, 0.03942069038748741, -0.2066059410572052, 0.0001444102672394365, -0.032697103917598724, -0.024659689515829086, 0.10325582325458527, 0.09550075232982635, -0.05763886123895645, 0.03239911422133446, -0.025252947583794594, -0.04213035851716995, 0.025465957820415497, -0.09502387791872025, -0.0327773317694664, 0.10199187695980072, 0.07298673689365387, 0.09131024032831192, 0.006047661881893873, 0.0794818103313446, -0.1846768707036972, -0.012852702289819717, 0.04962785169482231, -0.0037501072511076927, 0.03359777480363846, 0.10278481245040894, 0.03398708626627922, 0.18194223940372467, -0.0466373935341835, 0.02958553098142147, 0.035583265125751495, -0.0817786157131195, -0.15161123871803284, -0.04780540242791176, -0.00011541655840119347, 0.09630590677261353, 0.13572104275226593, -0.050963904708623886, 0.0469764769077301, -0.010035494342446327, 0.09288126230239868, 0.06569908559322357, -0.23240821063518524, -0.03186893090605736, 0.03676082193851471, 0.06606528162956238, 0.10952599346637726, -0.04442529380321503, 0.002686994383111596, 0.07238903641700745, 0.07443543523550034, 0.07459928095340729, -0.04919527843594551, -0.047934435307979584, -0.022019939497113228, -0.14031864702701569, -0.038044802844524384, 0.18014290928840637, 0.014585311524569988, -0.033929456025362015, -0.10457135736942291, -0.05108696222305298, -0.05076606199145317, 0.01850716955959797, -0.0437941737473011, 0.010603194124996662, -0.0063352324068546295, 0.046872880309820175, -0.06970512121915817, -0.13182751834392548, -0.04864468425512314, -0.046787079423666, 0.06040402501821518, 0.04347209259867668, 0.017590157687664032, 0.04266918823122978, 0.06659458577632904, -0.12187802791595459, -0.08230967074632645, 0.009634200483560562, -0.002166193909943104, -0.091018445789814, 0.006985196378082037, -0.002737029455602169, -0.2497435361146927, -0.010653531178832054, -0.039344850927591324, -0.07351671159267426, 0.029621398076415062, 0.05471938103437424, 0.04540853202342987, 0.06171885505318642, 0.12779340147972107, -0.08669652044773102, -0.11067336052656174, -0.0441901758313179, -0.019735140725970268, -0.008210665546357632, 0.013298902660608292, -0.06302476674318314, -0.03164452686905861, 0.0022719588596373796, 0.02680455520749092, -0.004637647420167923, 0.006880593951791525, -0.01847013272345066, -0.008101179264485836, 0.07722041755914688, -0.09501573443412781, 0.008010595105588436, 0.0026773351710289717, -0.09106428921222687, -0.046196695417165756, 0.06621019542217255, -0.008879447355866432, -0.11723805218935013, 0.07201457768678665, -0.012134717777371407, -0.00765352975577116, -0.09480713307857513, -0.19735172390937805, 0.008620012551546097, -0.021324550732970238, -0.05156939476728439, -0.08532047271728516, -0.09926425665616989, -0.08852969855070114, 0.03652234002947807, -0.05152052640914917, -0.002008723095059395, -0.11123053729534149, -0.002637465950101614, 0.030672172084450722, -0.029583830386400223, 0.08581209182739258, -0.047600068151950836, 0.038209863007068634, -0.013696091249585152, 0.0709150955080986, 0.010934770107269287, 0.02748151868581772, -0.11049740761518478, 0.022906672209501266, -0.08175743371248245, 0.14188748598098755, -0.03998569771647453, -0.004012447781860828, -0.1298695206642151, -0.06429492682218552, 
-0.06757227331399918, 0.044602081179618835, 0.08003684878349304, 0.12517738342285156, -0.21964289247989655, -0.021413635462522507, 0.20362651348114014, -0.08106411248445511, -0.06694617867469788, 0.12963755428791046, -0.020518256351351738, 0.05186295881867409, 0.07944189012050629, 0.09217402338981628, 0.04158893600106239, -0.0439363494515419, -0.06188168749213219, 0.021402286365628242, 0.01263942290097475, 0.03554858639836311, 0.09252133965492249, -0.07243070006370544, 0.09130896627902985, 0.00914096087217331, 0.04428765922784805, 0.012548534199595451, -0.028610197827219963, -0.038758210837841034, 0.005749085918068886, -0.03998648747801781, -0.022678690031170845, 0.032390695065259933, 0.008672434836626053, -0.08401665091514587, -0.07851676642894745, -0.002339498372748494, 0.07144714891910553, -0.07080735266208649, 0.0437077060341835, 0.0384414941072464, -0.0692586675286293, -0.11322880536317825, 0.007422289345413446, -0.14405901730060577, -0.012764894403517246, 0.024190887808799744, -0.02006213366985321, 0.09189780056476593, 0.0710604265332222, 0.06564825773239136, 0.1002240926027298, -0.04724588245153427, -0.03143360838294029, -0.0009990745456889272, -0.022394834086298943, -0.10318159312009811, -0.1284184753894806, -0.017208492383360863, -0.020894944667816162, 0.0048698256723582745, -0.14052903652191162, 0.0059027886018157005, -0.04312488064169884, 0.09203055500984192, 0.009215962141752243, -0.027772411704063416, 0.033500414341688156, 0.07973555475473404, -0.038454774767160416, -0.02725485898554325, 0.03491750359535217, -0.020575828850269318, -0.07994930446147919, 0.12745502591133118, -0.07261303812265396, -0.13758917152881622, 0.08374672383069992, -0.0029046309646219015, -0.09283850342035294, 0.006691746413707733, -0.011108576320111752, -0.06479007005691528, -0.06277722865343094, -0.06428568810224533, 0.23623855412006378, 0.05449482798576355, 0.14509375393390656, -0.11544972658157349, -0.047282468527555466, 0.017955144867300987, -0.05212082341313362, -0.0201302208006382, 0.17912410199642181, 0.048076871782541275, -0.18125012516975403, 0.08948502689599991, 0.017773102968931198, -0.021169207990169525, 0.15340472757816315, 0.05794327333569527, -0.1077316403388977, 0.014640920795500278, 0.05219344422221184, -0.011992503888905048, 0.04925479367375374, -0.07980707287788391, -0.00950623955577612, 0.03142866864800453, 0.06261910498142242, 0.06574851274490356, -0.1056145429611206, 0.06538454443216324, 0.06785417348146439, -0.044586651027202606, 0.047215357422828674, -0.04386981949210167, -0.047672469168901443, 0.08455193042755127, -0.004071432631462812, -0.045228682458400726, -0.0358356311917305, -0.03917775675654411, -0.11389637738466263, 0.18423418700695038, -0.09896405786275864, -0.2344352900981903, -0.14533889293670654, 0.017934130504727364, -0.04412566125392914, 0.018034160137176514, 0.055175743997097015, -0.05978536233305931, -0.06269798427820206, -0.10112007707357407, 0.09035620093345642, -0.09321191161870956, -0.06734789162874222, -0.10771021991968155, 0.0513414591550827, -0.0018441386055201292, -0.14393183588981628, 0.03374335542321205, 0.002804832998663187, -0.014791479334235191, 0.007237499579787254, -0.030121076852083206, 0.11346380412578583, 0.1123773604631424, -0.029021456837654114, -0.038710497319698334, 0.004433389753103256, 0.14378874003887177, -0.06170329079031944, 0.04828441143035889, 0.0319577120244503, 0.014842897653579712, 0.04335116595029831, 0.14591020345687866, 0.03739118203520775, -0.03876204416155815, 0.029981471598148346, 0.054432619363069534, -0.01837848126888275, 
-0.27011072635650635, -0.09113512188196182, -0.05596514418721199, -0.01066927146166563, 0.08031601458787918, 0.046469178050756454, -0.08841241151094437, 0.01811077445745468, -0.04203000292181969, 0.004764235578477383, 0.01466672495007515, 0.05146533250808716, 0.017345048487186432, -0.025932524353265762, 0.07885245233774185, -0.05277055874466896, -0.04473258927464485, 0.08792763203382492, 0.021794674918055534, 0.19056275486946106, -0.0560959056019783, 0.2006414830684662, 0.052484702318906784, 0.055904973298311234, -0.004503450356423855, 0.07473083585500717, -0.048294924199581146, 0.028479328379034996, -0.013266653753817081, -0.061275504529476166, -0.012437536381185055, 0.07106287032365799, 0.013636383228003979, 0.013452672399580479, -0.035951703786849976, -0.02855253405869007, 0.06684832274913788, 0.20140080153942108, 0.09014156460762024, -0.17152941226959229, -0.06237750127911568, 0.010604213923215866, -0.07953935116529465, -0.08082130551338196, 0.008334693498909473, 0.17317217588424683, -0.08311903476715088, 0.013683718629181385, 0.011573057621717453, 0.13174697756767273, -0.11190896481275558, -0.022501111030578613, 0.011284437961876392, 0.024186359718441963, -0.018468007445335388, 0.12900619208812714, -0.23114947974681854, 0.17790599167346954, 0.015955578535795212, 0.07171459496021271, -0.045804206281900406, 0.029971538111567497, -0.06462837010622025, 0.004756172653287649, 0.12227021902799606, 0.03439135104417801, -0.07059647887945175, -0.10225693881511688, -0.10118367522954941, -0.01428311038762331, 0.06265178322792053, -0.038758669048547745, 0.09152310341596603, 0.0732332170009613, 0.021084291860461235, -0.021435260772705078, 0.02574581652879715, -0.033490583300590515, -0.15090420842170715, -0.0006075405981391668, -0.008615528233349323, -0.03439146652817726, -0.007664298173040152, -0.047672614455223083, -0.09685540944337845, 0.22157005965709686, -0.1392504721879959, -0.1092766672372818, -0.07250002026557922, 0.01524206530302763, 0.12372998148202896, -0.06734275072813034, 0.005830423440784216, 0.02907605841755867, 0.028309620916843414, -0.057087793946266174, -0.008031150326132774, 0.0705079659819603, -0.053535595536231995, -0.06728480011224747, -0.051973987370729446, 0.12700273096561432, 0.06577595323324203, 0.041166309267282486, -0.019527504220604897, 0.04515533521771431, -0.009794835932552814, -0.09602871537208557, 0.0020681004971265793, 0.039729874581098557, 0.15909698605537415, 0.03460484370589256, -0.05176183953881264, -0.07140938937664032, -0.07341832667589188, -0.08298273384571075, 0.15588858723640442, 0.17278575897216797, -0.047658588737249374, 0.056480128318071365, 0.1931377351284027, -0.11877953261137009, -0.17632170021533966, -0.05968962237238884, 0.09880535304546356, 0.09621431678533554, -0.011094097048044205, -0.16445232927799225, 0.022780809551477432, 0.10103881359100342, 0.002903078915551305, 0.017906375229358673, -0.37039342522621155, -0.14936023950576782, 0.016907399520277977, 0.041082218289375305, -0.0016897700261324644, -0.0858784094452858, -0.04761013016104698, -0.04463668167591095, -0.10036060214042664, 0.06352382153272629, -0.016325265169143677, 0.10204417258501053, 0.012531264685094357, 0.032965462654829025, 0.05264989286661148, -0.04235139489173889, 0.12953810393810272, -0.007606859318912029, 0.01731904037296772, -0.07241855561733246, 0.08196111768484116, 0.012081778608262539, -0.014477410353720188, 0.17248573899269104, -0.06863788515329361, 0.04652462527155876, -0.16420432925224304, -0.050867244601249695, -0.055053722113370895, 0.02819630317389965, 
-0.03699805215001106, -0.07969129830598831, -0.05003666132688522, 0.031132789328694344, 0.0655437782406807, -0.016478870064020157, 0.042740870267152786, -0.06963218003511429, 0.03079136461019516, 0.17270797491073608, 0.10812186449766159, 0.027242375537753105, -0.09031201899051666, 0.015847301110625267, -0.00016282776778098196, 0.061956629157066345, -0.1298617422580719, 0.00625648582354188, 0.15425242483615875, 0.017805220559239388, 0.11157166957855225, -0.031206533312797546, -0.13235029578208923, 0.01772458106279373, 0.061880923807621, -0.10677561163902283, -0.1364673376083374, -0.019786568358540535, -0.012004591524600983, -0.049806538969278336, -0.011880316771566868, 0.10688092559576035, -0.09993737936019897, -0.01744912751019001, -0.013851093128323555, 0.04101259633898735, -0.06032467260956764, 0.2197609692811966, 0.01551881805062294, 0.05721185728907585, -0.05739694461226463, 0.1219749003648758, 0.13170225918293, -0.1276196986436844, 0.046313825994729996, 0.18246901035308838, -0.06883088499307632, -0.043857723474502563, 0.051289886236190796, 0.130074143409729, -0.007680115755647421, -0.06315995752811432, -0.010932693257927895, -0.037487324327230453, 0.0059203654527664185, -0.009044031612575054, 0.03362127020955086, 0.03121436946094036, -0.005338636692613363, -0.04517481476068497, -0.09450498223304749, 0.10636363178491592, 0.06921079754829407, 0.008405925706028938, -0.03194068744778633, 0.11277290433645248, -0.0006079526501707733, -0.008698527701199055, -0.017562588676810265, 0.02961675450205803, -0.03523699566721916, 0.014781205914914608, -0.07083979994058609, -0.002194801578298211, -0.049811314791440964, -0.006797668524086475, -0.03467880189418793, -0.005804222542792559, -0.008695988915860653, 0.009359543211758137, -0.04452057555317879, -0.03557872399687767, -0.04972238093614578, 0.024998877197504044, -0.0913100317120552, -0.053126607090234756, 0.00030206292285583913, -0.017322666943073273, 0.056189410388469696, 0.012362874113023281, -0.01565289869904518, 0.03268668055534363, -0.0012157809687778354, 0.07421151548624039, 0.02641075663268566, 0.042694367468357086, 0.016309553757309914, -0.05547323822975159, -0.005123920738697052, 0.03125529736280441, -0.019649872556328773, -0.009528950788080692, 0.015989990904927254, -0.1262984275817871, -0.048835184425115585, -0.03235344961285591, -0.04152251034975052, -0.066457100212574, 0.10388896614313126, 0.06292222440242767, 0.07781647890806198, 0.13307788968086243, -0.07297304272651672, 0.07800191640853882, -0.14841555058956146, 0.000014804081729380414, 0.03937854990363121, -0.042462389916181564, -0.012015830725431442, -0.012333947233855724, 0.04145645350217819, -0.09832202643156052, 0.12881767749786377, 0.017040349543094635, 0.06147340312600136, 0.013859999366104603, -0.08579147607088089, -0.007391530554741621, 0.0188292246311903, 0.1077415868639946, -0.038345467299222946, -0.021109089255332947, -0.08477387577295303, 0.07783151417970657, -0.002145149279385805, 0.10171698778867722, 0.04285164549946785, 0.10627599060535431, 0.14468806982040405, 0.05814565718173981, 0.008700148202478886, -0.07447998970746994, -0.07706226408481598, 0.085719533264637, 0.007698405534029007, 0.06663467735052109, -0.01734336093068123, 0.1145424172282219, 0.15002277493476868, -0.1529874950647354, 0.10643456876277924, 0.006371861323714256, -0.09370951354503632, -0.07108240574598312, -0.1278310865163803, -0.0656433254480362, -0.02691471017897129, -0.023009195923805237, -0.12833483517169952, 0.011793245561420918, 0.007848629727959633, 0.059128858149051666, 
-0.02776762843132019, 0.10263152420520782, 0.014729178510606289, -0.11119524389505386, 0.06714335083961487, 0.00020528517779894173, 0.09528668969869614, -0.009927209466695786, 0.022983798757195473, 0.06723666191101074, 0.0017033049371093512, 0.031699880957603455, 0.047483813017606735, -0.01168280653655529, -0.0011671364773064852, 0.0005264203064143658, -0.06450267136096954, -0.04355648159980774, 0.03647051006555557, 0.08251166343688965, 0.16011369228363037, 0.0691259354352951, -0.06692100316286087, -0.03832358121871948, 0.1760147213935852, -0.05470942705869675, -0.08699323982000351, -0.10920966416597366, 0.1935621052980423, 0.02118440344929695, 0.04440135881304741, 0.021145613864064217, -0.10292935371398926, 0.0008639013394713402, 0.12077482789754868, 0.18760739266872406, -0.010039973072707653, -0.0333140604197979, 0.006520758382976055, -0.014154821634292603, 0.0035899668000638485, 0.03466179966926575, 0.03714435175061226, 0.23819392919540405, -0.07933301478624344, 0.061991747468709946, -0.06936302036046982, 0.03125220537185669, -0.01449151523411274, 0.14098604023456573, 0.0015138833550736308, 0.0029344982467591763, -0.05363587290048599, 0.10030143707990646, 0.000057360666687600315, -0.1480698436498642, -0.012938755564391613, -0.08062667399644852, -0.12419945001602173, 0.022212866693735123, 0.03363875672221184, 0.036585431545972824, 0.0617009662091732, 0.013735764659941196, 0.04508211836218834, 0.03771873936057091, 0.01285817101597786, -0.09156377613544464, -0.08893541991710663, -0.0044736783020198345, -0.0588098019361496, 0.08348970860242844, 0.023513268679380417, 0.1366250365972519, 0.09274882078170776, 0.020265670493245125, -0.07497538626194, 0.10379956662654877, 0.03026784025132656, 0.02285068854689598, 0.07907876372337341, 0.12101282179355621, -0.009352818131446838, 0.08051260560750961, 0.04108927398920059, -0.06408048421144485, 0.037047382444143295, -0.044069960713386536, -0.020801987498998642, -0.0975121259689331, 0.09955951571464539, -0.034448545426130295, 0.13465215265750885, 0.1890856772661209, -0.0032577887177467346, -0.010691662319004536, -0.060964975506067276, 0.016678016632795334, -0.024982232600450516, 0.08411423861980438, -0.009331827983260155, -0.20011202991008759, 0.014661576598882675, -0.021105708554387093, 0.03635562211275101, -0.2133045196533203, -0.02451794594526291, 0.025891082361340523, -0.04718612879514694, -0.02745538391172886, 0.07771021127700806, 0.07793185859918594, 0.01680978201329708, -0.02381509728729725, -0.08223274350166321, 0.013264353387057781, 0.10170887410640717, -0.10269570350646973, -0.10673514753580093 ]
null
null
transformers
# legal_t5_small_multitask_sv_es model

Model for translating legal text from Swedish to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_sv_es model; rather, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to Spanish.

### How to use

Here is how to use this model to translate legal text from Swedish to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# AutoModelWithLMHead is the API used at release time; recent transformers
# versions favour AutoModelForSeq2SeqLM for T5-style models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_es"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_sv_es",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

sv_text = "med beaktande av sin resolution av den 14 april 2005 om torkan i Portugal,"

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_es | 35.506 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
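A short usage note on the snippet above: `TranslationPipeline` returns one dictionary per input, keyed by `translation_text`, so the translated string can be pulled out as sketched below (continuing the `pipeline` and `sv_text` variables from the card's example).

```python
# Each input yields a dict with a "translation_text" field.
results = pipeline([sv_text], max_length=512)
print(results[0]["translation_text"])
```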
{"language": "Swedish Spanish", "tags": ["translation Swedish Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "med beaktande av sin resolution av den 14 april 2005 om torkan i Portugal,"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_es model
=========================================

Model for translating legal text from Swedish to Spanish. It was first released in this repository. The model is trained in parallel on the three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_es model; rather, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to Spanish.

### How to use

Here is how to use this model to translate legal text from Swedish to Spanish in PyTorch (see the snippet in the full card above).

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_es model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model (a training sketch follows this card).

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
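The preprocessing step above describes a unigram vocabulary model trained on the parallel corpus; a minimal sketch of how such a model could be built with the SentencePiece library (the file name and vocabulary size are illustrative assumptions, not values from the card):

```python
import sentencepiece as spm

# Train a unigram model on the parallel corpus; "corpus.txt" and the
# vocabulary size are placeholders, not values taken from the card.
spm.SentencePieceTrainer.train(
    input="corpus.txt",
    model_prefix="legal_t5_spm",
    vocab_size=32000,
    model_type="unigram",
)

# Load the trained model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_spm.model")
print(sp.encode("med beaktande av sin resolution", out_type=str))
```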
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_es model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06830596178770065, 0.12060891091823578, -0.004060277249664068, 0.08788666874170303, 0.05731939151883125, -0.0065931761637330055, 0.012617374770343304, 0.09354101866483688, -0.0806465893983841, 0.06987113505601883, 0.04934593290090561, 0.02039637789130211, 0.07345244288444519, 0.012455209158360958, 0.029985614120960236, -0.20272721350193024, 0.001125606126151979, -0.034015290439128876, -0.027827808633446693, 0.09849519282579422, 0.08933444321155548, -0.0538477748632431, 0.03192627802491188, -0.03038197010755539, -0.043790243566036224, 0.03233392536640167, -0.08793962746858597, -0.04856400564312935, 0.10481923073530197, 0.07470142096281052, 0.09589439630508423, 0.01087114680558443, 0.0757465586066246, -0.16974981129169464, -0.015352125279605389, 0.04861823469400406, -0.008463773876428604, 0.03808984532952309, 0.11352881044149399, 0.024552999064326286, 0.1836031973361969, -0.05261421576142311, 0.028606442734599113, 0.03695737197995186, -0.09981529414653778, -0.1534852832555771, -0.05345133692026138, -0.016947321593761444, 0.09553239494562149, 0.1307573765516281, -0.045563727617263794, 0.04476809874176979, -0.009205284528434277, 0.08047961443662643, 0.06624191254377365, -0.2273702472448349, -0.03540133312344551, 0.03095812350511551, 0.06480025500059128, 0.11328864842653275, -0.0321967713534832, 0.008566606789827347, 0.07934127748012543, 0.06689995527267456, 0.0681285709142685, -0.056967586278915405, -0.05724242702126503, -0.023102808743715286, -0.13574020564556122, -0.049684178084135056, 0.16822506487369537, 0.008285412564873695, -0.03009001724421978, -0.10133780539035797, -0.058597832918167114, -0.04549724981188774, 0.02089397981762886, -0.0435449592769146, 0.014599869959056377, -0.01007017120718956, 0.0537782683968544, -0.05841881036758423, -0.12783686816692352, -0.0504874661564827, -0.04854171350598335, 0.06085438281297684, 0.040909890085458755, 0.011227983981370926, 0.05060587823390961, 0.07221738994121552, -0.10938217490911484, -0.0873490646481514, 0.01769143156707287, 0.008074046112596989, -0.08932867646217346, 0.01221485435962677, -0.0013430765829980373, -0.24885159730911255, -0.004797991830855608, -0.05201729014515877, -0.09331975877285004, 0.026613648980855942, 0.05464794859290123, 0.04380827769637108, 0.05900368094444275, 0.12484152615070343, -0.08182445168495178, -0.10459649562835693, -0.047595150768756866, -0.02763570100069046, -0.008687338791787624, 0.012800103053450584, -0.07364876568317413, -0.039473868906497955, -0.004995338153094053, 0.03865567222237587, 0.005549910478293896, 0.0003052256943192333, -0.02342114970088005, -0.016334805637598038, 0.07870576530694962, -0.08995147049427032, 0.014224372804164886, 0.004114153794944286, -0.09156633168458939, -0.041132088750600815, 0.05450578033924103, -0.015994910150766373, -0.11630488187074661, 0.061947453767061234, -0.012252973392605782, -0.01579216495156288, -0.10535591095685959, -0.1967996507883072, 0.015331602655351162, -0.025265906006097794, -0.05103515833616257, -0.08890101313591003, -0.09050489962100983, -0.09881594032049179, 0.0350186713039875, -0.06704273819923401, 0.0055795833468437195, -0.11621226370334625, 0.007104826625436544, 0.03152381256222725, -0.03142433613538742, 0.08457568287849426, -0.04850294440984726, 0.0447176918387413, 0.0005276167648844421, 0.0734521746635437, 0.009962517768144608, 0.025960957631468773, -0.10643259435892105, 0.03055090270936489, -0.08927594870328903, 0.15281648933887482, -0.03513924777507782, -0.003950396552681923, -0.13656963407993317, -0.06083421781659126, -0.07080821692943573, 
0.04842807352542877, 0.08212108165025711, 0.13506090641021729, -0.21595817804336548, -0.02580896019935608, 0.21248967945575714, -0.07204650342464447, -0.06485502421855927, 0.12350524961948395, -0.018874390050768852, 0.05912052467465401, 0.07889324426651001, 0.09582564979791641, 0.037589963525533676, -0.04577960446476936, -0.05178984999656677, 0.0165456160902977, 0.02059554122388363, 0.025739343836903572, 0.09825947880744934, -0.0790153369307518, 0.09179062396287918, 0.009655731730163097, 0.02885083109140396, 0.011169914156198502, -0.026837950572371483, -0.03651032969355583, 0.01296135876327753, -0.030575424432754517, -0.022341648116707802, 0.03160467743873596, 0.014529057778418064, -0.07915981858968735, -0.07413458824157715, -0.0017418528441339731, 0.06390076130628586, -0.06679575890302658, 0.03857401758432388, 0.03640316054224968, -0.05886583402752876, -0.12295649945735931, 0.007157137617468834, -0.14632809162139893, -0.010260920971632004, 0.02473733015358448, -0.011622581630945206, 0.08776003867387772, 0.06552774459123611, 0.06184401735663414, 0.09795689582824707, -0.044471289962530136, -0.031282227486371994, -0.004907897673547268, -0.024940716102719307, -0.0963793396949768, -0.1297106295824051, -0.015877192839980125, -0.01857919432222843, 0.0004728507192339748, -0.1420564502477646, 0.004789819475263357, -0.03971157222986221, 0.08286328613758087, 0.002199470531195402, -0.020923813804984093, 0.02404879964888096, 0.08921342343091965, -0.03876705840229988, -0.03410806134343147, 0.041885919868946075, -0.015084897167980671, -0.06878165155649185, 0.12114465236663818, -0.07997044175863266, -0.13435319066047668, 0.08084479719400406, -0.003819916630163789, -0.09367556869983673, -0.005739937070757151, -0.00635830033570528, -0.058761462569236755, -0.0644611120223999, -0.062312569469213486, 0.2424929440021515, 0.05096714198589325, 0.15215303003787994, -0.12825468182563782, -0.04119468852877617, 0.02110840566456318, -0.05323800444602966, -0.02528017945587635, 0.18198665976524353, 0.05316825211048126, -0.17602068185806274, 0.09636338800191879, 0.009081540629267693, -0.015786731615662575, 0.15147079527378082, 0.05971960723400116, -0.10906606912612915, 0.012526912614703178, 0.062209825962781906, 0.0015479783760383725, 0.049629759043455124, -0.08260927349328995, -0.008572133257985115, 0.029570147395133972, 0.06348346918821335, 0.07296407967805862, -0.10951147228479385, 0.06407051533460617, 0.06831497699022293, -0.04338385537266731, 0.043610312044620514, -0.042861782014369965, -0.048259470611810684, 0.08805553615093231, 0.0050230929628014565, -0.04877142608165741, -0.0393504723906517, -0.03718498721718788, -0.11478640139102936, 0.18492108583450317, -0.09635429084300995, -0.24476739764213562, -0.1532047688961029, 0.01922566257417202, -0.039671868085861206, 0.032261885702610016, 0.0572543740272522, -0.062319040298461914, -0.05115192010998726, -0.08125132322311401, 0.0956542044878006, -0.08586089313030243, -0.0733024924993515, -0.1063610166311264, 0.05909993499517441, -0.008953922428190708, -0.1418684720993042, 0.03272641822695732, 0.006914944387972355, -0.020749276503920555, 0.001708409981802106, -0.0415562205016613, 0.127066969871521, 0.12239688634872437, -0.023546066135168076, -0.04053447023034096, 0.002838047221302986, 0.13418415188789368, -0.06716408580541611, 0.0390440970659256, 0.045008864253759384, 0.02569086104631424, 0.03759715333580971, 0.14044028520584106, 0.03697913885116577, -0.04348217695951462, 0.020473888143897057, 0.04883619025349617, -0.022585401311516762, -0.2729889452457428, 
-0.0974966287612915, -0.057873185724020004, -0.022373806685209274, 0.08636002987623215, 0.0416463166475296, -0.08013253659009933, 0.029023297131061554, -0.041233427822589874, 0.009129897691309452, 0.007734143640846014, 0.05308723449707031, 0.023585064336657524, -0.02574004791676998, 0.06951167434453964, -0.054486505687236786, -0.0556744821369648, 0.09328458458185196, 0.035855237394571304, 0.18880237638950348, -0.054656270891427994, 0.21192318201065063, 0.054363034665584564, 0.05740249529480934, -0.011119389906525612, 0.07274918258190155, -0.04625093564391136, 0.02978701703250408, -0.021592387929558754, -0.06435506045818329, -0.01666027493774891, 0.06077902391552925, 0.005200496409088373, 0.01143776997923851, -0.053128864616155624, -0.03945024311542511, 0.0671907365322113, 0.19571606814861298, 0.08012568950653076, -0.18100231885910034, -0.05824197456240654, 0.008307627402245998, -0.06605732440948486, -0.08410036563873291, 0.0016025988152250648, 0.1769375056028366, -0.0858905166387558, 0.013831469230353832, 0.008694712072610855, 0.13501125574111938, -0.11612467467784882, -0.01927613466978073, 0.007593284361064434, 0.025843467563390732, -0.013650333508849144, 0.13483184576034546, -0.23041699826717377, 0.18696625530719757, 0.017736656591296196, 0.06891009956598282, -0.04734160751104355, 0.03063419833779335, -0.07615050673484802, -0.0011183647438883781, 0.12128946930170059, 0.032417964190244675, -0.06495874375104904, -0.10036812722682953, -0.09788724035024643, -0.01642230525612831, 0.05485454201698303, -0.04383260756731033, 0.09383184462785721, 0.07572041451931, 0.01959046721458435, -0.022532718256115913, 0.02210536226630211, -0.019891804084181786, -0.16143213212490082, -0.0014502262929454446, -0.01686367765069008, -0.033154360949993134, -0.007695061154663563, -0.043180521577596664, -0.09071944653987885, 0.21733488142490387, -0.13624466955661774, -0.10364001989364624, -0.07492105662822723, 0.015647288411855698, 0.12679380178451538, -0.06719176471233368, 0.008421151898801327, 0.0302752573043108, 0.023591110482811928, -0.05371275544166565, -0.005907867569476366, 0.08511137217283249, -0.06105984374880791, -0.06195804476737976, -0.058883439749479294, 0.12457016110420227, 0.0608551949262619, 0.038328204303979874, -0.011862685903906822, 0.03997268155217171, -0.0029382011853158474, -0.09620560705661774, -0.006700527388602495, 0.03139449283480644, 0.16141754388809204, 0.0308636873960495, -0.05960459262132645, -0.0761595070362091, -0.0707930400967598, -0.08425337821245193, 0.14912304282188416, 0.16526904702186584, -0.04792889207601547, 0.057222820818424225, 0.19192850589752197, -0.12455589324235916, -0.16633081436157227, -0.05670534074306488, 0.10821168124675751, 0.09488126635551453, -0.013495378196239471, -0.17259372770786285, 0.0010655260412022471, 0.11047331243753433, 0.004100974649190903, 0.013986443169414997, -0.3976093828678131, -0.1406215876340866, 0.005263202358037233, 0.040928374975919724, 0.005204757675528526, -0.08774521201848984, -0.05933985859155655, -0.05246041342616081, -0.09399645775556564, 0.06698327511548996, -0.015612705610692501, 0.09959937632083893, 0.014607778750360012, 0.025376345962285995, 0.05801652744412422, -0.03846493735909462, 0.14311833679676056, -0.015882177278399467, 0.01468588039278984, -0.06614513695240021, 0.07708917558193207, 0.01553670596331358, -0.012313703075051308, 0.1575441211462021, -0.05459902435541153, 0.042593371123075485, -0.16708305478096008, -0.053101323544979095, -0.053834110498428345, 0.026690881699323654, -0.03943752124905586, -0.0716472640633583, 
-0.04116535931825638, 0.023122766986489296, 0.0683821514248848, -0.01648680679500103, 0.039375849068164825, -0.06659817695617676, 0.03496996685862541, 0.17736287415027618, 0.10557010024785995, 0.030526379123330116, -0.0934571772813797, 0.013527107425034046, 0.009053744375705719, 0.0640692189335823, -0.1188000962138176, 0.004754409659653902, 0.15362931787967682, 0.015308050438761711, 0.1011405661702156, -0.02964225970208645, -0.13342317938804626, 0.022143328562378883, 0.0734875500202179, -0.0915103480219841, -0.13243718445301056, -0.021949149668216705, -0.0029615096282213926, -0.04093865677714348, -0.014618190936744213, 0.1064402163028717, -0.08641090244054794, -0.02663728967308998, -0.015748372301459312, 0.031839340925216675, -0.05769537389278412, 0.226358100771904, 0.012021038681268692, 0.05570254102349281, -0.05869130417704582, 0.11740949749946594, 0.13722972571849823, -0.13768567144870758, 0.04121233522891998, 0.181283101439476, -0.06544677913188934, -0.040238283574581146, 0.06115751713514328, 0.13273321092128754, -0.03768504410982132, -0.06974924355745316, -0.026921458542346954, -0.03566691651940346, 0.0036260890774428844, -0.005359876435250044, 0.0317305363714695, 0.029521802440285683, -0.0027881150599569082, -0.04399075359106064, -0.08701494336128235, 0.09148465096950531, 0.07000307738780975, 0.008912705816328526, -0.034393493086099625, 0.11515314131975174, 0.0005000345990993083, -0.01667053811252117, -0.01751495897769928, 0.030083969235420227, -0.04934920743107796, 0.019331661984324455, -0.06927772611379623, 0.006676523480564356, -0.047740232199430466, -0.010214969515800476, -0.037457335740327835, 0.0010145774576812983, -0.0055648647248744965, 0.0053071873262524605, -0.044500913470983505, -0.031367793679237366, -0.0544753298163414, 0.030147798359394073, -0.08440252393484116, -0.045203231275081635, -0.005353278946131468, -0.019456196576356888, 0.04957621544599533, 0.005739250220358372, -0.014010925777256489, 0.03416476026177406, -0.02118673361837864, 0.0788145437836647, 0.03282908722758293, 0.046664655208587646, 0.02341587282717228, -0.053985144942998886, 0.007627895101904869, 0.038438521325588226, -0.01914471946656704, -0.011461419053375721, 0.014725332148373127, -0.12903599441051483, -0.04502470791339874, -0.026950784027576447, -0.05338044464588165, -0.060363996773958206, 0.11705505847930908, 0.06667632609605789, 0.06927014887332916, 0.12347937375307083, -0.07800373435020447, 0.07483627647161484, -0.14495010673999786, -0.005856842268258333, 0.035647157579660416, -0.03530549257993698, -0.011408216319978237, -0.009146149270236492, 0.04544832929968834, -0.08408868312835693, 0.12601174414157867, 0.03252207860350609, 0.08482243865728378, 0.0064316364005208015, -0.08245502412319183, -0.0003789950569625944, 0.01712055504322052, 0.1097349226474762, -0.03249870240688324, -0.012611174955964088, -0.08963058888912201, 0.09339979290962219, -0.00053874944569543, 0.10718106478452682, 0.030875224620103836, 0.11107286065816879, 0.14865455031394958, 0.0578235425055027, 0.01526291761547327, -0.076629139482975, -0.07278630882501602, 0.08613582700490952, 0.013656132854521275, 0.06133725494146347, -0.018292946740984917, 0.09525782614946365, 0.14407677948474884, -0.1554936319589615, 0.11185091733932495, 0.009683242999017239, -0.08820649981498718, -0.07026635110378265, -0.12869495153427124, -0.059395089745521545, -0.03782082721590996, -0.029823943972587585, -0.12447231262922287, 0.014790302142500877, 0.0005486115114763379, 0.05112549290060997, -0.032826632261276245, 0.10065209865570068, 
0.015447377227246761, -0.12280574440956116, 0.06583385169506073, 0.0095227574929595, 0.11650415509939194, -0.015778668224811554, 0.029590031132102013, 0.0632556676864624, 0.015403186902403831, 0.03268866986036301, 0.05438452586531639, -0.005034966394305229, -0.0021853523794561625, 0.004289625678211451, -0.05767497792840004, -0.042001791298389435, 0.03515537455677986, 0.08563604950904846, 0.17264653742313385, 0.06339798867702484, -0.06877610832452774, -0.036584749817848206, 0.18962562084197998, -0.0545993447303772, -0.08070048689842224, -0.10785199701786041, 0.19675689935684204, 0.020795507356524467, 0.05170055106282234, 0.01765976846218109, -0.10612274706363678, -0.001827942207455635, 0.12126274406909943, 0.1817510724067688, -0.00898819137364626, -0.0379878394305706, 0.006158304866403341, -0.010507920756936073, 0.013440551236271858, 0.042320575565099716, 0.031244726851582527, 0.2633861005306244, -0.0807407796382904, 0.055806685239076614, -0.06638786196708679, 0.04409107193350792, -0.01905697025358677, 0.14923958480358124, -0.005480750929564238, 0.0005851021269336343, -0.04627937823534012, 0.10896386206150055, 0.004362525884062052, -0.15710614621639252, -0.002880813553929329, -0.08640719205141068, -0.12753580510616302, 0.015007952228188515, 0.037674665451049805, 0.03780025988817215, 0.07013589888811111, 0.015998870134353638, 0.03951549157500267, 0.03517179191112518, 0.01872505433857441, -0.09142297506332397, -0.09813940525054932, -0.00599297322332859, -0.05694388970732689, 0.0905771404504776, 0.016435114666819572, 0.1327013224363327, 0.09497211128473282, 0.019664591178297997, -0.07771647721529007, 0.10060416162014008, 0.02747504599392414, 0.02198624610900879, 0.0973757952451706, 0.09388764202594757, -0.009542210027575493, 0.07162512838840485, 0.0378958061337471, -0.07088663429021835, 0.04221845045685768, -0.0328274667263031, -0.009697556495666504, -0.10137953609228134, 0.10078587383031845, -0.03776506707072258, 0.12564541399478912, 0.1890227198600769, -0.005765608511865139, -0.0028823965694755316, -0.059767331928014755, 0.02774936705827713, -0.018765117973089218, 0.09969673305749893, -0.010333919897675514, -0.20950263738632202, 0.0112680122256279, -0.0199672132730484, 0.03376239538192749, -0.21483100950717926, -0.015131599269807339, 0.029362887144088745, -0.05088597163558006, -0.02973056770861149, 0.07112076878547668, 0.06805144995450974, 0.022891562432050705, -0.021992728114128113, -0.08741816133260727, 0.006577504798769951, 0.1027635931968689, -0.09421694278717041, -0.10185649245977402 ]
null
null
transformers
# legal_t5_small_multitask_sv_fr model

Model for translating legal text from Swedish to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on the three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_sv_fr model; rather, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to French.

### How to use

Here is how to use this model to translate legal text from Swedish to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# AutoModelWithLMHead is the API used at release time; recent transformers
# versions favour AutoModelForSeq2SeqLM for T5-style models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_fr"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_sv_fr",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

sv_text = "Europaparlamentet understryker att det stora antalet kvinnor och barn bland flyktingar och internt fördrivna som registrerats av internationella organ som resultat av väpnade konflikter och inbördeskrig är mycket oroväckande."

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_fr | 45.790 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
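The training procedure above names AdaFactor with an inverse square root learning-rate schedule; a minimal sketch of configuring that optimizer with the Hugging Face transformers implementation (the checkpoint name is taken from this card, but the fine-tuning setup itself is an illustration, not the authors' training script):

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_fr")

# In relative-step mode Adafactor follows an inverse square root schedule
# internally; AdafactorSchedule exposes the implied learning rate.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```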
{"language": "Swedish French", "tags": ["translation Swedish French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Europaparlamentet understryker att det stora antalet kvinnor och barn bland flyktingar och internt f\u00f6rdrivna som registrerats av internationella organ som resultat av v\u00e4pnade konflikter och inb\u00f6rdeskrig \u00e4r mycket orov\u00e4ckande."}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_fr model
=========================================

Model for translating legal text from Swedish to French. It was first released in this repository. The model is trained in parallel on the three parallel corpora covering 42 language pairs from jrc-acquis, europarl and dcep, along with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_fr model; rather, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to French.

### How to use

Here is how to use this model to translate legal text from Swedish to French in PyTorch (see the snippet in the full card above).

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_fr model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results (a BLEU scoring sketch follows this card):

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
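The card reports BLEU on a held-out translation test set; a minimal sketch of computing a corpus-level BLEU score with the sacrebleu library (the sentences are placeholders, and sacrebleu itself is an assumption, since the card does not say which BLEU implementation was used):

```python
import sacrebleu

# Hypotheses are model translations; references hold one gold translation
# per sentence (sacrebleu expects a list of reference streams).
hypotheses = ["the committee adopted the report without amendment"]
references = [["the committee adopted the report without amendment"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```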
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_fr model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0610625259578228, 0.12564188241958618, -0.0038357337471097708, 0.09074143320322037, 0.042380593717098236, -0.016781991347670555, 0.02426631562411785, 0.08393581956624985, -0.07806786149740219, 0.062182165682315826, 0.06193497031927109, -0.00898397620767355, 0.06620588898658752, 0.027112001553177834, 0.04668082296848297, -0.194296196103096, 0.0036284946836531162, -0.034347083419561386, -0.03436526283621788, 0.10264860093593597, 0.09931443631649017, -0.046963244676589966, 0.03985828161239624, -0.013970945961773396, -0.04171030968427658, 0.03985542804002762, -0.09240242093801498, -0.032635994255542755, 0.10777119547128677, 0.07716593146324158, 0.08902234584093094, 0.008890902623534203, 0.08055666089057922, -0.18202684819698334, -0.014837791211903095, 0.05175306648015976, -0.015439716167747974, 0.03414064273238182, 0.11094430088996887, 0.023283690214157104, 0.1780758649110794, -0.052498940378427505, 0.026745567098259926, 0.037814222276210785, -0.07703462988138199, -0.14782382547855377, -0.05451817065477371, -0.006567865144461393, 0.07861240208148956, 0.13774538040161133, -0.04310193285346031, 0.039941590279340744, -0.024373339489102364, 0.08594705164432526, 0.0711832121014595, -0.2301793098449707, -0.03367841616272926, 0.026270708069205284, 0.06800895929336548, 0.10177363455295563, -0.04896106943488121, 0.010203687474131584, 0.068107008934021, 0.0672765001654625, 0.07726884633302689, -0.05601074546575546, -0.042845189571380615, -0.025328418239951134, -0.13784191012382507, -0.04143519327044487, 0.17815543711185455, 0.016834046691656113, -0.040042512118816376, -0.10268222540616989, -0.05233216658234596, -0.05231695622205734, 0.002723005134612322, -0.05212685093283653, 0.025097934529185295, -0.008782727643847466, 0.051567353308200836, -0.06379537284374237, -0.12220515310764313, -0.05380867049098015, -0.04704605042934418, 0.050436995923519135, 0.03999745100736618, 0.015061222948133945, 0.053516704589128494, 0.07050783932209015, -0.14361107349395752, -0.07006223499774933, 0.011650021187961102, -0.0009560506441630423, -0.09374987334012985, 0.009630754590034485, 0.001849057269282639, -0.22700370848178864, -0.0031767357140779495, -0.021559273824095726, -0.09511454403400421, 0.029067957773804665, 0.04905851185321808, 0.04462788626551628, 0.05247245728969574, 0.12905804812908173, -0.08720231801271439, -0.12077779322862625, -0.049970369786024094, -0.017842639237642288, -0.02427017316222191, 0.024895649403333664, -0.05936277285218239, -0.042238205671310425, -0.006487656384706497, 0.02354130893945694, -0.0034461598843336105, -0.007743736729025841, -0.0150859784334898, -0.005956096574664116, 0.0898870974779129, -0.09246188402175903, 0.016833283007144928, 0.001981621142476797, -0.0971313938498497, -0.036394841969013214, 0.06074022874236107, -0.007235878612846136, -0.12138132005929947, 0.0676732063293457, -0.012648559175431728, -0.01012607291340828, -0.1009521484375, -0.19569870829582214, 0.002983385929837823, -0.0019989877473562956, -0.06252811849117279, -0.08116798847913742, -0.08540218323469162, -0.08956804871559143, 0.04936227574944496, -0.059517934918403625, 0.003922801464796066, -0.11761274188756943, -0.007359699346125126, 0.03864054009318352, -0.015373053960502148, 0.07782337069511414, -0.05219155550003052, 0.02367692068219185, -0.025091690942645073, 0.06901425868272781, 0.00440462538972497, 0.02649473212659359, -0.0976995900273323, 0.03142422437667847, -0.09066292643547058, 0.15051594376564026, -0.041835520416498184, -0.02754152938723564, -0.13833998143672943, -0.07397522777318954, 
-0.0776645615696907, 0.03794378787279129, 0.08873633295297623, 0.13677601516246796, -0.2173326462507248, -0.0295493695884943, 0.2197733372449875, -0.08009327203035355, -0.06390652805566788, 0.15453577041625977, -0.02319624088704586, 0.03458964452147484, 0.07081109285354614, 0.0921018198132515, 0.05007467791438103, -0.052539531141519547, -0.059387944638729095, 0.03593727946281433, 0.019395336508750916, 0.03414475545287132, 0.10591167211532593, -0.07250611484050751, 0.08060996979475021, -0.0014230662491172552, 0.040288329124450684, 0.008875938132405281, -0.033566758036613464, -0.03852255642414093, 0.005685529671609402, -0.03134399279952049, -0.003757996717467904, 0.03310931473970413, 0.007702263537794352, -0.08239612728357315, -0.08424511551856995, -0.020680053159594536, 0.0729246735572815, -0.07221043854951859, 0.038115452975034714, 0.04728180915117264, -0.0667773112654686, -0.09669507294893265, 0.004390186630189419, -0.1380995810031891, -0.019051572307944298, 0.027246063575148582, -0.03165791556239128, 0.08681514859199524, 0.07071395963430405, 0.06978602707386017, 0.10998167097568512, -0.05104929208755493, -0.03027300350368023, -0.01630072109401226, -0.028567729517817497, -0.09003905206918716, -0.1333709955215454, -0.008550051599740982, -0.01952449604868889, 0.016869153827428818, -0.14625750482082367, 0.00530576054006815, -0.03959362208843231, 0.08945712447166443, 0.0052201636135578156, -0.01992795802652836, 0.007544477935880423, 0.0777919813990593, -0.03792586550116539, -0.0230605099350214, 0.03432149440050125, -0.025877825915813446, -0.05308270826935768, 0.12635324895381927, -0.053707607090473175, -0.12345007807016373, 0.08414103835821152, -0.0012292279861867428, -0.10159018635749817, -0.0035632250364869833, -0.017337119206786156, -0.05892770737409592, -0.05624667555093765, -0.05774979293346405, 0.23500113189220428, 0.05707155540585518, 0.15755391120910645, -0.11853817850351334, -0.049471478909254074, 0.02541189454495907, -0.04505544155836105, -0.023890623822808266, 0.18264256417751312, 0.05689966306090355, -0.1621202677488327, 0.09311211854219437, 0.023625168949365616, -0.019418783485889435, 0.14672566950321198, 0.05680099129676819, -0.10720507055521011, 0.011940503492951393, 0.06681083887815475, -0.006157637108117342, 0.05125168338418007, -0.07757940143346786, -0.013895859010517597, 0.027230331674218178, 0.05841466039419174, 0.06569242477416992, -0.10873640328645706, 0.07182682305574417, 0.06025811657309532, -0.05046788603067398, 0.05115790292620659, -0.03811066970229149, -0.05095848813652992, 0.09458832442760468, 0.005965542513877153, -0.07191039621829987, -0.04220639541745186, -0.04102661460638046, -0.11194533109664917, 0.19312122464179993, -0.09703031927347183, -0.2343311458826065, -0.13772578537464142, 0.020965058356523514, -0.06351838260889053, 0.02043445222079754, 0.05255797877907753, -0.05778473615646362, -0.05650718882679939, -0.10190854221582413, 0.07013525813817978, -0.09569526463747025, -0.06262122839689255, -0.11067860573530197, 0.0539296418428421, -0.00700934324413538, -0.14981307089328766, 0.02758011221885681, 0.0008926272275857627, -0.021171998232603073, -0.006417729426175356, -0.033000215888023376, 0.11747869104146957, 0.11241993308067322, -0.04217395931482315, -0.03917301073670387, 0.009151943027973175, 0.14827728271484375, -0.06324844807386398, 0.04295746237039566, 0.027816271409392357, 0.037096090614795685, 0.05083085224032402, 0.1422736942768097, 0.040484003722667694, -0.03148294612765312, 0.027759579941630363, 0.061886850744485855, -0.016899311915040016, 
-0.25749751925468445, -0.10752110928297043, -0.05844984948635101, -0.012712167575955391, 0.08405786007642746, 0.04532637819647789, -0.07454656809568405, 0.013614206574857235, -0.044194817543029785, 0.009394416585564613, 0.02163713425397873, 0.05039879307150841, 0.02609984576702118, -0.0223691463470459, 0.07630941271781921, -0.056501682847738266, -0.06080591306090355, 0.09470098465681076, 0.0207775067538023, 0.1846884936094284, -0.05902935937047005, 0.20228058099746704, 0.05460672453045845, 0.05680461972951889, -0.012824309058487415, 0.07456241548061371, -0.04267377406358719, 0.0270162932574749, -0.01749763824045658, -0.06319320946931839, 0.00009739722008816898, 0.06578607857227325, 0.005984777584671974, 0.0027484262827783823, -0.052983783185482025, -0.029457025229930878, 0.07561679929494858, 0.20168103277683258, 0.08457645773887634, -0.1766921430826187, -0.055630214512348175, -0.004968622699379921, -0.07500462234020233, -0.08516275137662888, 0.014616168104112148, 0.18137675523757935, -0.09230095148086548, 0.005903593730181456, 0.012773461639881134, 0.1301865130662918, -0.09989636391401291, -0.017973745241761208, 0.02779572270810604, 0.03286351263523102, -0.01867462880909443, 0.12713128328323364, -0.2243233174085617, 0.18230828642845154, 0.012398233637213707, 0.05815310776233673, -0.043589890003204346, 0.03393487632274628, -0.06198093667626381, 0.02209550514817238, 0.13638055324554443, 0.0396290197968483, -0.06423380225896835, -0.0837542861700058, -0.09244105219841003, -0.020212024450302124, 0.06776247173547745, -0.042089372873306274, 0.08064870536327362, 0.06971500813961029, 0.0056297024711966515, -0.023215798661112785, 0.023372851312160492, -0.04004869982600212, -0.16195149719715118, 0.003880434902384877, -0.009231111966073513, -0.04150790348649025, -0.0067543648183345795, -0.045893751084804535, -0.09877409785985947, 0.2337261438369751, -0.12633657455444336, -0.09007095545530319, -0.07002676278352737, 0.0005084319855086505, 0.12573201954364777, -0.06728318333625793, 0.024230925366282463, 0.022226829081773758, 0.035226013511419296, -0.06401928514242172, -0.006731603294610977, 0.0735836774110794, -0.07061778753995895, -0.048104774206876755, -0.055392827838659286, 0.128891259431839, 0.06463633477687836, 0.03660334274172783, -0.010972542688250542, 0.04144768416881561, -0.014274564571678638, -0.10925669223070145, -0.009633993729948997, 0.013711778447031975, 0.15647846460342407, 0.035505056381225586, -0.059987012296915054, -0.08196862041950226, -0.06717032939195633, -0.06625665724277496, 0.1598319113254547, 0.17258571088314056, -0.05721713975071907, 0.06332127749919891, 0.19308270514011383, -0.11237361282110214, -0.18670420348644257, -0.05105402693152428, 0.11127226054668427, 0.08143477141857147, -0.023727422580122948, -0.16886475682258606, 0.014757026918232441, 0.09562445431947708, 0.003851301735267043, 0.018248556181788445, -0.3824717700481415, -0.1452961266040802, 0.004972186405211687, 0.03121357411146164, 0.011554330587387085, -0.07491736114025116, -0.03876384347677231, -0.05075858533382416, -0.09658768028020859, 0.07165451347827911, -0.02057058922946453, 0.09172821789979935, 0.015155251137912273, 0.011350844986736774, 0.04962468892335892, -0.04052722081542015, 0.13518989086151123, 0.005788746755570173, 0.021187253296375275, -0.05831511691212654, 0.08493545651435852, 0.017781782895326614, -0.011020650155842304, 0.16677401959896088, -0.05669165402650833, 0.04753714054822922, -0.16371726989746094, -0.04541783779859543, -0.05536044389009476, 0.03517589718103409, -0.04088449105620384, 
-0.07086549699306488, -0.049890100955963135, 0.033711012452840805, 0.07715555280447006, -0.010613694787025452, 0.010659223422408104, -0.05424073711037636, 0.02246876247227192, 0.18491274118423462, 0.09709244966506958, 0.032930873334407806, -0.09995199739933014, 0.03379189968109131, 0.00826936960220337, 0.05898395925760269, -0.12079659849405289, 0.01685522496700287, 0.15427842736244202, 0.01411768514662981, 0.11092256009578705, -0.02787228673696518, -0.1249777302145958, 0.0012336549116298556, 0.063276007771492, -0.11040189862251282, -0.13849514722824097, -0.02880532667040825, -0.010413089767098427, -0.04002509266138077, -0.016232367604970932, 0.09960494190454483, -0.09908726811408997, -0.01754808984696865, -0.018038442358374596, 0.03373510017991066, -0.06839896738529205, 0.21124167740345, 0.005185525398701429, 0.06089670956134796, -0.05769389495253563, 0.10705632716417313, 0.1277928203344345, -0.1394730806350708, 0.03739451244473457, 0.18644966185092926, -0.06650484353303909, -0.044188130646944046, 0.05885776877403259, 0.13446427881717682, -0.02048412896692753, -0.06599143147468567, -0.020064007490873337, -0.03393000736832619, 0.009888029657304287, -0.0067606293596327305, 0.03552120923995972, 0.024150412529706955, 0.0054008858278393745, -0.05711545795202255, -0.0975399911403656, 0.10906801372766495, 0.07789844274520874, 0.0031072136480361223, -0.012569996528327465, 0.09653894603252411, -0.005176525563001633, -0.00171191047411412, -0.017109332606196404, 0.02984805218875408, -0.03878520429134369, 0.0067043909803032875, -0.06571773439645767, -0.0004441516939550638, -0.039685726165771484, 0.0006738508818671107, -0.038366373628377914, -0.002541065914556384, -0.006310778204351664, 0.009956629946827888, -0.049952562898397446, -0.03625953570008278, -0.043206989765167236, 0.031287167221307755, -0.09707199037075043, -0.043498601764440536, 0.006844778545200825, -0.024551158770918846, 0.056372594088315964, 0.01257320586591959, -0.007901093922555447, 0.024522140622138977, -0.014295714907348156, 0.0586933009326458, 0.006085652858018875, 0.050082772970199585, 0.014154342003166676, -0.06264868378639221, 0.010763181373476982, 0.03291987255215645, -0.014863048680126667, -0.014427335932850838, 0.009416515938937664, -0.11982783675193787, -0.04744786024093628, -0.03925188630819321, -0.055372294038534164, -0.06677018105983734, 0.11989609897136688, 0.05723074451088905, 0.08341310173273087, 0.12258069962263107, -0.0736079141497612, 0.07059108465909958, -0.13695505261421204, -0.005711663980036974, 0.039873577654361725, -0.04093341529369354, -0.007073065731674433, -0.018102701753377914, 0.0367857851088047, -0.08625683188438416, 0.13767410814762115, 0.03770465776324272, 0.08118691295385361, 0.013690811581909657, -0.10163295269012451, -0.016789404675364494, 0.024137675762176514, 0.08737538009881973, -0.03899445757269859, -0.01688832975924015, -0.09189116954803467, 0.08061806857585907, 0.004684134386479855, 0.11806277185678482, 0.04113908112049103, 0.11021685600280762, 0.13889753818511963, 0.056342706084251404, -0.001691288547590375, -0.08086160570383072, -0.07287663966417313, 0.09253276139497757, 0.01190160308033228, 0.06633008271455765, -0.03532261773943901, 0.10779337584972382, 0.13481181859970093, -0.15682248771190643, 0.11779054254293442, 0.016825953498482704, -0.09538374841213226, -0.076067253947258, -0.10989882051944733, -0.061653975397348404, -0.03609814867377281, -0.02754979021847248, -0.12193484604358673, 0.027771631255745888, 0.0009382748394273221, 0.07086886465549469, -0.021011672914028168, 
0.10192723572254181, -0.016140226274728775, -0.11049025505781174, 0.07522116601467133, 0.002557696308940649, 0.10415171831846237, -0.01313056144863367, 0.026462294161319733, 0.06144049018621445, -0.0046119168400764465, 0.030914215371012688, 0.05079575255513191, -0.0035924650728702545, 0.00010070374264614657, 0.006556015461683273, -0.05577109009027481, -0.04315567761659622, 0.029435791075229645, 0.08548450469970703, 0.1694885492324829, 0.06713777780532837, -0.08004408329725266, -0.032566267997026443, 0.17460396885871887, -0.05692731589078903, -0.08773712813854218, -0.11654551327228546, 0.19571620225906372, 0.023935582488775253, 0.052291642874479294, 0.013162578456103802, -0.09354793280363083, -0.0022109525743871927, 0.11312199383974075, 0.19345705211162567, -0.007153528276830912, -0.03384619951248169, 0.013260346837341785, -0.013446557335555553, 0.023512305691838264, 0.025000765919685364, 0.040420517325401306, 0.2509223520755768, -0.08837343752384186, 0.05856180191040039, -0.06956233829259872, 0.03847470134496689, -0.012454859912395477, 0.1514686793088913, 0.002566239330917597, 0.001747254398651421, -0.053345222026109695, 0.09991326183080673, 0.01311869453638792, -0.15344341099262238, 0.0010885644005611539, -0.08430373668670654, -0.1188991516828537, 0.021658821031451225, 0.014751577749848366, 0.04008342698216438, 0.06553765386343002, 0.012349932454526424, 0.03343715891242027, 0.04348568245768547, 0.013116463087499142, -0.08683597296476364, -0.09923240542411804, -0.012607724405825138, -0.053511690348386765, 0.10445327311754227, 0.021319283172488213, 0.14384028315544128, 0.0940714031457901, 0.010446920990943909, -0.0703982263803482, 0.08261183649301529, 0.033530957996845245, 0.022957606241106987, 0.0736892893910408, 0.10952331125736237, -0.015996331349015236, 0.07619233429431915, 0.022407490760087967, -0.05581250041723251, 0.04579189419746399, -0.054761964827775955, -0.026740608736872673, -0.09879636019468307, 0.10893069952726364, -0.038286175578832626, 0.12983198463916779, 0.19361408054828644, 0.001335982233285904, 0.004154517315328121, -0.06767716258764267, 0.018366187810897827, -0.024541275575757027, 0.07853327691555023, -0.006826550234109163, -0.19566147029399872, 0.03322794288396835, -0.023586006835103035, 0.03856564313173294, -0.21328726410865784, -0.020008575171232224, 0.024736817926168442, -0.03572973236441612, -0.02223309502005577, 0.07490185648202896, 0.07415514439344406, 0.012136540375649929, -0.025497345253825188, -0.09080985188484192, 0.0054985275492072105, 0.09725184738636017, -0.07944222539663315, -0.10107339173555374 ]
null
null
transformers
# legal_t5_small_multitask_sv_it model

Model for translating legal text from Swedish to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained in parallel on three parallel corpora covering 42 language pairs, drawn from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

## Model description

No pretraining is involved in the case of the legal_t5_small_multitask_sv_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

## Intended uses & limitations

The model could be used for translation of legal texts from Swedish to Italian.

### How to use

Here is how to use this model to translate legal text from Swedish to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Build a translation pipeline around the released checkpoint.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_multitask_sv_it"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_multitask_sv_it",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # GPU 0; set device=-1 to run on CPU
)

sv_text = "De nationella tillsynsmyndigheterna får använda"

pipeline([sv_text], max_length=512)
```

## Training data

The legal_t5_small_multitask_sv_it model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where all of the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_multitask_sv_it | 44.242 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
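The training procedure above names AdaFactor with an inverse square root learning rate schedule. A minimal sketch of that optimizer setup using the `transformers` library follows; it is an illustration, not the authors' released training code, and every setting beyond the optimizer family and schedule named in the card (relative-step mode, warmup, parameter scaling) is an assumption:

```python
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor, AdafactorSchedule

model = T5ForConditionalGeneration.from_pretrained("SEBIS/legal_t5_small_multitask_sv_it")

# In relative-step mode, Adafactor applies its internal inverse-square-root
# step size, which matches the schedule named in the card.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,  # lr must be None when relative_step=True
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule, e.g. for Trainer/logging
```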
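The BLEU figure in the table above can be checked with `sacrebleu`, assuming you hold the model's Italian outputs and the reference translations as aligned lists; the strings below are placeholders rather than items from the actual test set:

```python
import sacrebleu

# Placeholder data: one hypothesis and one aligned reference stream.
hypotheses = ["Le autorità nazionali di vigilanza possono utilizzare"]
references = [["Le autorità nazionali di vigilanza possono utilizzare"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.3f}")
```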
{"language": "Swedish Italian", "tags": ["translation Swedish Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "De nationella tillsynsmyndigheterna f\u00e5r anv\u00e4nda"}]}
text2text-generation
SEBIS/legal_t5_small_multitask_sv_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Swedish Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_multitask\_sv\_it model
=========================================

Model for translating legal text from Swedish to Italian. It was first released in this repository. The model is trained in parallel on three parallel corpora covering 42 language pairs, drawn from jrc-acquis, europarl and dcep, together with an unsupervised task in which the model performs masked-language-model prediction.

Model description
-----------------

No pretraining is involved in the case of the legal\_t5\_small\_multitask\_sv\_it model; instead, the unsupervised task is added to all the translation tasks to realize the multitask learning scenario.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Swedish to Italian.

### How to use

Here is how to use this model to translate legal text from Swedish to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_multitask\_sv\_it model (the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where all of the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 198, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Swedish Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Swedish to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_multitask\\_sv\\_it model (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0704728439450264, 0.13008534908294678, -0.003477026242762804, 0.09007397294044495, 0.057215820997953415, -0.00018154157442040741, 0.03142113238573074, 0.1043696179986, -0.06541020423173904, 0.07621956616640091, 0.04608243703842163, 0.002250349847599864, 0.08178292959928513, 0.03149985894560814, 0.026956424117088318, -0.2038758248090744, 0.007579237688332796, -0.03354642540216446, -0.023733945563435555, 0.10652101039886475, 0.09786159545183182, -0.06111367419362068, 0.03421735763549805, -0.034340985119342804, -0.06012086942791939, 0.034382414072752, -0.09209980815649033, -0.02960742637515068, 0.10351268202066422, 0.07744897156953812, 0.09293253719806671, 0.0061495499685406685, 0.08700032532215118, -0.17049449682235718, -0.014030812308192253, 0.05940255895256996, -0.006539561320096254, 0.02781447395682335, 0.11622899025678635, 0.031813398003578186, 0.1826857030391693, -0.0356648787856102, 0.01793922856450081, 0.03350290283560753, -0.07883060723543167, -0.13514132797718048, -0.04869069159030914, -0.003943603951483965, 0.08552181720733643, 0.13849464058876038, -0.04791099578142166, 0.05708414688706398, -0.021996811032295227, 0.08815592527389526, 0.06706267595291138, -0.21888096630573273, -0.04011353850364685, 0.021558338776230812, 0.05591554194688797, 0.11571863293647766, -0.03690463304519653, 0.010316099040210247, 0.06969432532787323, 0.06392066180706024, 0.05399996414780617, -0.04039374366402626, -0.03569139540195465, -0.031197018921375275, -0.13937842845916748, -0.039555586874485016, 0.18381249904632568, 0.009832609444856644, -0.03769968822598457, -0.09953904896974564, -0.051895834505558014, -0.045529890805482864, 0.016266262158751488, -0.04731599614024162, 0.0061997962184250355, -0.0017347497632727027, 0.06262628734111786, -0.04989929124712944, -0.1297377496957779, -0.05843321606516838, -0.03694537281990051, 0.08581778407096863, 0.051460735499858856, 0.013055160641670227, 0.050438567996025085, 0.07044211775064468, -0.12532538175582886, -0.07833311706781387, 0.015696004033088684, 0.0082007497549057, -0.08767633885145187, 0.0053512705489993095, -0.003568066516891122, -0.2480652779340744, -0.00469741178676486, -0.03482860326766968, -0.059482645243406296, 0.020918618887662888, 0.05298329517245293, 0.05119020491838455, 0.052036676555871964, 0.12023938447237015, -0.09740229696035385, -0.11686228960752487, -0.046117063611745834, -0.020660672336816788, -0.010010316036641598, 0.011708435602486134, -0.0727548748254776, -0.04480281472206116, -0.010430053807795048, 0.03377551957964897, 0.007108520250767469, 0.0021594036370515823, -0.02525050938129425, -0.01755443587899208, 0.06941860169172287, -0.10220129787921906, 0.01607147417962551, 0.004405987914651632, -0.10182391852140427, -0.02764788083732128, 0.046491172164678574, -0.015659429132938385, -0.12597718834877014, 0.0767781138420105, -0.015470484271645546, -0.011319627054035664, -0.10960588604211807, -0.19458036124706268, 0.013971981592476368, -0.027455363422632217, -0.056048788130283356, -0.08578924089670181, -0.08899223804473877, -0.08625662326812744, 0.03692390397191048, -0.06552508473396301, 0.011996460147202015, -0.10679759830236435, 0.007609723601490259, 0.025517335161566734, -0.023716973140835762, 0.07691384851932526, -0.04823694005608559, 0.041021596640348434, -0.015813106670975685, 0.07125179469585419, 0.0018802519189193845, 0.0274712685495615, -0.10974685847759247, 0.024045083671808243, -0.07903918623924255, 0.13519538938999176, -0.03606989607214928, -0.010975643992424011, -0.12972387671470642, -0.06346046924591064, 
-0.0754270851612091, 0.03706322982907295, 0.09313274174928665, 0.14536453783512115, -0.22528010606765747, -0.02092732861638069, 0.20831382274627686, -0.09144891053438187, -0.06809429079294205, 0.14740751683712006, -0.015492056496441364, 0.05855236202478409, 0.07652296125888824, 0.09704501181840897, 0.04151298478245735, -0.05242849141359329, -0.06493714451789856, 0.014408450573682785, 0.012662974186241627, 0.028236331418156624, 0.09760373085737228, -0.06120382994413376, 0.11296478658914566, 0.007866168394684792, 0.02533808723092079, 0.008269330486655235, -0.036288753151893616, -0.04111060872673988, 0.0012702184030786157, -0.047536179423332214, -0.02179734781384468, 0.03557196259498596, 0.013228897005319595, -0.0829862505197525, -0.08626636862754822, -0.038133908063173294, 0.08179345726966858, -0.08068244904279709, 0.042125288397073746, 0.030832752585411072, -0.03811696544289589, -0.0998709425330162, 0.007993267849087715, -0.13609842956066132, -0.01677289791405201, 0.03796479105949402, -0.01947571150958538, 0.0815943032503128, 0.062015317380428314, 0.06806271523237228, 0.10340123623609543, -0.0494016595184803, -0.030560625717043877, -0.005287668667733669, -0.026933742687106133, -0.09734606742858887, -0.1361255645751953, -0.025824472308158875, -0.013838971965014935, 0.014707185328006744, -0.1445290446281433, 0.011267622001469135, -0.02822933904826641, 0.10266601294279099, -0.0006118636229075491, -0.025781773030757904, 0.005100666545331478, 0.073525071144104, -0.04627479985356331, -0.03236746788024902, 0.03336919844150543, -0.01920497417449951, -0.06465891003608704, 0.1155422180891037, -0.06702345609664917, -0.11796180158853531, 0.0795663595199585, -0.008703303523361683, -0.09317104518413544, -0.007608743850141764, -0.011350183747708797, -0.06581521779298782, -0.05911017209291458, -0.0640021339058876, 0.2168429046869278, 0.05118652433156967, 0.13697749376296997, -0.1184619590640068, -0.03088030032813549, 0.028591861948370934, -0.046539824455976486, -0.030855171382427216, 0.1694166660308838, 0.06566140055656433, -0.17734190821647644, 0.09730660170316696, 0.02245011366903782, -0.029661066830158234, 0.1536809206008911, 0.06445768475532532, -0.11889061331748962, 0.01853073202073574, 0.07060561329126358, -0.00232996279373765, 0.04549453407526016, -0.09129583835601807, -0.002102167345583439, 0.02629929780960083, 0.059299178421497345, 0.07300174236297607, -0.10717512667179108, 0.06093859300017357, 0.05890343710780144, -0.04031411185860634, 0.0454118512570858, -0.05238854140043259, -0.061040040105581284, 0.09799285233020782, 0.00404557166621089, -0.06646738201379776, -0.03535100072622299, -0.03882712498307228, -0.10613130033016205, 0.18109473586082458, -0.08938419818878174, -0.23579396307468414, -0.14239530265331268, 0.020429426804184914, -0.049874577671289444, 0.025847474113106728, 0.043755631893873215, -0.054175447672605515, -0.05622517317533493, -0.0889890268445015, 0.0767643079161644, -0.09317267686128616, -0.06465734541416168, -0.11140579730272293, 0.061506565660238266, -0.01452560629695654, -0.13353917002677917, 0.030876772478222847, 0.0013173776678740978, -0.02827327512204647, -0.002779569011181593, -0.05010461062192917, 0.13296784460544586, 0.12309692800045013, -0.03563593327999115, -0.027979137375950813, 0.010053125210106373, 0.12101136893033981, -0.0784073919057846, 0.050925422459840775, 0.04681741073727608, 0.017164960503578186, 0.04122168943285942, 0.14416907727718353, 0.03659863397479057, -0.041649460792541504, 0.022541068494319916, 0.054314274340867996, -0.01719462312757969, 
-0.2619282305240631, -0.10011249780654907, -0.06146709620952606, -0.010617866180837154, 0.07685837894678116, 0.0426224023103714, -0.08406562358140945, 0.023924661800265312, -0.04851318895816803, -0.013311382383108139, 0.021632395684719086, 0.049208831042051315, 0.004573356825858355, -0.025729253888130188, 0.07931259274482727, -0.05272170528769493, -0.052906326949596405, 0.09781990945339203, 0.04069405421614647, 0.20371760427951813, -0.05840791389346123, 0.2138783484697342, 0.046679891645908356, 0.06821057200431824, -0.007575550582259893, 0.07417352497577667, -0.03495882824063301, 0.019455401226878166, -0.01090486254543066, -0.06532681733369827, -0.006869879085570574, 0.0720723569393158, 0.00830979272723198, -0.0033776198979467154, -0.041700080037117004, -0.02457868680357933, 0.07491441071033478, 0.21738597750663757, 0.07956171780824661, -0.18498247861862183, -0.05475173890590668, -0.0008745430968701839, -0.061631545424461365, -0.08244815468788147, 0.0009017492993734777, 0.18240971863269806, -0.0787288248538971, 0.0027005018200725317, 0.012584568932652473, 0.13091562688350677, -0.1219983622431755, -0.02502770721912384, 0.019204571843147278, 0.03281651437282562, -0.021248789504170418, 0.13914388418197632, -0.2263336032629013, 0.1838901937007904, 0.016083501279354095, 0.08160118013620377, -0.061590198427438736, 0.030117489397525787, -0.06540663540363312, 0.024643059819936752, 0.13174769282341003, 0.03213176503777504, -0.062374260276556015, -0.10970956832170486, -0.10693550854921341, -0.020719939842820168, 0.08395721763372421, -0.03450830280780792, 0.08489882200956345, 0.0651070848107338, 0.01884179189801216, -0.01754123345017433, 0.037044115364551544, -0.020452182739973068, -0.1563413143157959, 0.01517523918300867, 0.0039010720793157816, -0.04919564723968506, -0.004945353139191866, -0.054859161376953125, -0.08126647025346756, 0.23608426749706268, -0.12709781527519226, -0.08887657523155212, -0.07648394256830215, 0.02170661650598049, 0.1180679202079773, -0.06754402071237564, 0.007865672931075096, 0.02527899295091629, 0.03645983338356018, -0.058670684695243835, -0.00228691753000021, 0.08300530165433884, -0.061121415346860886, -0.05958738178014755, -0.059504490345716476, 0.11508005857467651, 0.06802491843700409, 0.034268248826265335, -0.008574938401579857, 0.036483872681856155, -0.0045712790451943874, -0.09304817020893097, 0.0003607483231462538, 0.024711884558200836, 0.15542523562908173, 0.04382682964205742, -0.06831412017345428, -0.07589951157569885, -0.07185574620962143, -0.08112964034080505, 0.14913839101791382, 0.1689581722021103, -0.05314493179321289, 0.036531608551740646, 0.1862679421901703, -0.11946653574705124, -0.18661488592624664, -0.035597216337919235, 0.10026717931032181, 0.09241405874490738, -0.02202301286160946, -0.16680946946144104, 0.0106812110170722, 0.11781871318817139, -0.0015193758299574256, 0.030007420107722282, -0.385427325963974, -0.13636711239814758, -0.0007929837447591126, 0.043844785541296005, 0.009765208698809147, -0.07967979460954666, -0.0378214530646801, -0.04655938223004341, -0.09315912425518036, 0.04604579508304596, -0.009241742081940174, 0.09902441501617432, 0.016665881499648094, 0.007494653575122356, 0.056049492210149765, -0.045533351600170135, 0.13075004518032074, -0.0010509316343814135, 0.01618553325533867, -0.06911724805831909, 0.07879214733839035, 0.01617061346769333, -0.0038715656846761703, 0.16330477595329285, -0.059699442237615585, 0.03358845412731171, -0.1494952142238617, -0.05544668808579445, -0.05162252113223076, 0.036347098648548126, 
-0.03771942853927612, -0.07786447554826736, -0.040738046169281006, 0.038754258304834366, 0.06564228981733322, -0.00366887915879488, 0.02358120307326317, -0.0721949115395546, 0.025414247065782547, 0.18080583214759827, 0.09760604798793793, 0.02590600959956646, -0.08521555364131927, 0.014556290581822395, 0.008179918862879276, 0.05709705501794815, -0.09931183606386185, 0.012901149690151215, 0.1568390280008316, 0.012168694287538528, 0.10593161731958389, -0.025278210639953613, -0.13218936324119568, 0.010759122669696808, 0.07351407408714294, -0.10310140252113342, -0.13568860292434692, -0.02516760490834713, -0.0053175282664597034, -0.046120304614305496, 0.005145789124071598, 0.10222910344600677, -0.1000434011220932, -0.014947379939258099, -0.021931886672973633, 0.04377305507659912, -0.0652557909488678, 0.22822819650173187, 0.00949263945221901, 0.048440683633089066, -0.057395610958337784, 0.1319401115179062, 0.12792372703552246, -0.1359795331954956, 0.03798515722155571, 0.18333885073661804, -0.0670342743396759, -0.04392234608530998, 0.05839892476797104, 0.11926978826522827, -0.017821744084358215, -0.0745343491435051, -0.034171510487794876, -0.03169061616063118, 0.0031867625657469034, -0.02547086775302887, 0.037787988781929016, 0.029579076915979385, -0.011605825275182724, -0.05039265379309654, -0.10957341641187668, 0.1025981679558754, 0.08525484800338745, 0.007026979234069586, -0.028870869427919388, 0.1150050163269043, 0.0029956086073070765, -0.027950461953878403, -0.016974950209259987, 0.02378893829882145, -0.03426132723689079, 0.018445506691932678, -0.0560574010014534, -0.01146136224269867, -0.04107163846492767, -0.0077738333493471146, -0.04292260855436325, -0.005687818396836519, -0.016143791377544403, 0.011016602627933025, -0.05560727044939995, -0.034440409392118454, -0.051668450236320496, 0.01993170566856861, -0.09023479372262955, -0.04305282235145569, -0.0015056566335260868, -0.022486858069896698, 0.060606442391872406, 0.016636623069643974, -0.012516398914158344, 0.024225130677223206, -0.01680060476064682, 0.07619278877973557, 0.012761395424604416, 0.04756108298897743, 0.013934871181845665, -0.04875718057155609, 0.012113897129893303, 0.04514089971780777, -0.025925161316990852, -0.011795595288276672, 0.014603224582970142, -0.11803606152534485, -0.033041127026081085, -0.03171269968152046, -0.05257316678762436, -0.06238045170903206, 0.12458539009094238, 0.06557536125183105, 0.07942000776529312, 0.11012176424264908, -0.0652201771736145, 0.07772590219974518, -0.13752016425132751, 0.000739866984076798, 0.045894939452409744, -0.050673361867666245, -0.0076768663711845875, -0.010219895280897617, 0.045008908957242966, -0.0866299495100975, 0.1191970705986023, 0.03070099465548992, 0.06989865750074387, 0.009933735243976116, -0.08304460346698761, -0.0026720997411757708, 0.020359022542834282, 0.10532600432634354, -0.04101227596402168, -0.014744575135409832, -0.08442334085702896, 0.08014274388551712, -0.008302075788378716, 0.12158125638961792, 0.034964412450790405, 0.12006845325231552, 0.13982684910297394, 0.05882207304239273, 0.019804805517196655, -0.08391018211841583, -0.08602137863636017, 0.08902767300605774, -0.0037803351879119873, 0.06975476443767548, -0.02683958411216736, 0.11406541615724564, 0.1441272795200348, -0.1561136543750763, 0.09664740413427353, 0.003777419915422797, -0.10030626505613327, -0.06840149313211441, -0.1412552148103714, -0.05943305417895317, -0.03580373153090477, -0.022978290915489197, -0.12890970706939697, 0.02553153596818447, 0.0033770783338695765, 0.0573037713766098, 
-0.04263759404420853, 0.11105406284332275, 0.006645101122558117, -0.10800837725400925, 0.07826132327318192, 0.002591631142422557, 0.10710925608873367, -0.01548849232494831, 0.002471397165209055, 0.05451590195298195, -0.003598658600822091, 0.037038713693618774, 0.0438360869884491, 0.0001614080829313025, -0.007574808783829212, 0.0051122792065143585, -0.05639466643333435, -0.04434405639767647, 0.023519262671470642, 0.0825800821185112, 0.16702960431575775, 0.053237903863191605, -0.06699242442846298, -0.037672385573387146, 0.17377513647079468, -0.0515952967107296, -0.07261287420988083, -0.11057823151350021, 0.1871521919965744, 0.023155439645051956, 0.04969044402241707, 0.013683670200407505, -0.10091532766819, 0.0006172105786390603, 0.12037205696105957, 0.18026870489120483, -0.0181015245616436, -0.03816898912191391, 0.014163754880428314, -0.011349871754646301, 0.010166964493691921, 0.04030805826187134, 0.030261564999818802, 0.23809446394443512, -0.08337970077991486, 0.0751994252204895, -0.0658087283372879, 0.04226989671587944, -0.015069548040628433, 0.15008428692817688, 0.0027449470944702625, 0.007133250590413809, -0.0688151940703392, 0.10522749274969101, 0.02307482808828354, -0.14122949540615082, -0.003427117597311735, -0.09410631656646729, -0.13073235750198364, 0.021244261413812637, 0.024028869345784187, 0.028054717928171158, 0.0818709060549736, 0.00390968332067132, 0.040413662791252136, 0.035181302577257156, 0.0173952654004097, -0.08767154067754745, -0.11640230566263199, -0.009411989711225033, -0.06525620818138123, 0.10295646637678146, 0.014539032243192196, 0.13850857317447662, 0.0962345227599144, 0.02335585653781891, -0.07623929530382156, 0.09466640651226044, 0.029129747301340103, 0.01406905334442854, 0.07409881800413132, 0.10665693134069443, -0.029393916949629784, 0.08281446993350983, 0.0374845452606678, -0.06749029457569122, 0.04833808168768883, -0.06215386465191841, -0.02002578042447567, -0.09871278703212738, 0.09837029874324799, -0.03881463408470154, 0.13774476945400238, 0.19323404133319855, -0.003798539750277996, -0.011708905920386314, -0.06417760252952576, 0.026925839483737946, -0.012391924858093262, 0.09671709686517715, -0.0067178369499742985, -0.22021722793579102, 0.013541243970394135, -0.02327621541917324, 0.041237931698560715, -0.19425074756145477, -0.02242477610707283, 0.022526226937770844, -0.0464199036359787, -0.033883705735206604, 0.07041334360837936, 0.08240281045436859, 0.01453208178281784, -0.02343134395778179, -0.11048537492752075, 0.011456163600087166, 0.09790673851966858, -0.08810603618621826, -0.1042010709643364 ]
null
null
transformers
# legal_t5_small_summ_cs model

Model for summarization of legal text written in Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on a parallel corpus from jrc-acquis.

## Model description

legal_t5_small_summ_cs is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for summarization of legal texts written in Czech.

### How to use

Here is how to use this model to summarize legal text written in Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# The checkpoint maps a full Czech legal text to its summary; the card wraps
# it in a TranslationPipeline, treating summarization as text-to-text generation.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_cs"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_summ_cs",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # GPU 0; set device=-1 to run on CPU
)

cs_text = "(2006/C 67/15) (Text s významem pro EHP) Dne 10. března 2006 se Komise rozhodla nevznést námitky proti výše uvedenému spojení a prohlásit ho za slučitelné se společným trhem. Toto rozhodnutí je založeno na čl. 6 odst. 1 písm. b) nařízení Rady (ES) č. 139/2004. Celý text rozhodnutí je přístupný pouze v angličtině a bude uveřejněn poté, co bude zbaven obchodního tajemství, které může případně obsahovat. Text bude dosažitelný: - na webové stránce Europa – hospodářská soutěž (http://europa.eu.int/comm/competition/mergers/cases/). Tato webová stránka umožňuje vyhledat jednotlivá rozhodnutí o spojení, a to včetně společnosti, čísla případu, data a indexu odvětví hospodářství. - v elektronické podobě na webové stránce EUR-Lex, pod dokumentem č. 32006M4093. EUR-Lex umožňuje přístup k Evropskému právu přes Internet. (http://europa.eu.int/eur-lex/lex) -------------------------------------------------- "

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_summ_cs model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset consisting of 18 thousand texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters (consistent with the t5-small scaling above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the summarization test dataset, it achieves the following results:

Test results :

| Model | Rouge1 | Rouge2 | Rouge Lsum |
|:-----:|:-----:|:-----:|:-----:|
| legal_t5_small_summ_cs | 75.86 | 65.82 | 74.95 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
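The model description above spells out the scaled-down dimensions (dmodel = 512, dff = 2,048, 8 attention heads, 6 layers per stack). As a sketch, those numbers can be expressed as a Hugging Face `T5Config`; the vocabulary size is an assumption (the t5-small default), since the card does not state it:

```python
from transformers import T5Config, T5ForConditionalGeneration

config = T5Config(
    d_model=512,       # dmodel from the card
    d_ff=2048,         # dff from the card
    num_heads=8,       # 8-headed attention
    num_layers=6,      # 6 encoder layers; the decoder defaults to the same depth
    vocab_size=32128,  # assumption: t5-small default, not stated in the card
)
model = T5ForConditionalGeneration(config)

# Roughly 60 million parameters, matching the figure in the description.
print(sum(p.numel() for p in model.parameters()))
```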
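The preprocessing section describes a unigram vocabulary model trained on the parallel corpus. A minimal `sentencepiece` sketch of that step is shown below; the corpus path and vocabulary size are assumptions, as the card states neither:

```python
import sentencepiece as spm

# Hypothetical corpus file; the 88M-line parallel corpus itself is not shipped here.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",  # assumption: path to one-sentence-per-line text
    model_prefix="legal_t5_vocab",
    model_type="unigram",         # unigram model, as described in the card
    vocab_size=32000,             # assumption: the card does not state the size
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Toto rozhodnutí je založeno na čl. 6 odst. 1", out_type=str))
```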
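The Rouge1/Rouge2/Rouge Lsum scores in the table above can be computed with the `rouge_score` package. A minimal sketch with placeholder Czech strings follows; stemming is disabled because the package's stemmer targets English:

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeLsum"], use_stemmer=False)

# Placeholder texts; rougeLsum expects sentences separated by newlines.
reference = "První věta shrnutí.\nDruhá věta shrnutí."
prediction = "První věta shrnutí.\nJiná druhá věta."

scores = scorer.score(target=reference, prediction=prediction)
for name, s in scores.items():
    print(name, round(s.fmeasure, 4))
```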
{"language": "Cszech", "tags": ["summarization Cszech model"], "datasets": ["jrc-acquis"], "widget": [{"text": "(2006/C 67/15) (Text s v\u00fdznamem pro EHP) Dne 10. b\u0159ezna 2006 se Komise rozhodla nevzn\u00e9st n\u00e1mitky proti v\u00fd\u0161e uveden\u00e9mu spojen\u00ed a prohl\u00e1sit ho za slu\u010diteln\u00e9 se spole\u010dn\u00fdm trhem. Toto rozhodnut\u00ed je zalo\u017eeno na \u010dl. 6 odst. 1 p\u00edsm. b) na\u0159\u00edzen\u00ed Rady (ES) \u010d. 139/2004. Cel\u00fd text rozhodnut\u00ed je p\u0159\u00edstupn\u00fd pouze v angli\u010dtin\u011b a bude uve\u0159ejn\u011bn pot\u00e9, co bude zbaven obchodn\u00edho tajemstv\u00ed, kter\u00e9 m\u016f\u017ee p\u0159\u00edpadn\u011b obsahovat. Text bude dosa\u017eiteln\u00fd: - na webov\u00e9 str\u00e1nce Europa \u2013 hospod\u00e1\u0159sk\u00e1 sout\u011b\u017e (http://europa.eu.int/comm/competition/mergers/cases/). Tato webov\u00e1 str\u00e1nka umo\u017e\u0148uje vyhledat jednotliv\u00e1 rozhodnut\u00ed o spojen\u00ed, a to v\u010detn\u011b spole\u010dnosti, \u010d\u00edsla p\u0159\u00edpadu, data a indexu odv\u011btv\u00ed hospod\u00e1\u0159stv\u00ed. - v elektronick\u00e9 podob\u011b na webov\u00e9 str\u00e1nce EUR-Lex, pod dokumentem \u010d. 32006M4093. EUR-Lex umo\u017e\u0148uje p\u0159\u00edstup k Evropsk\u00e9mu pr\u00e1vu p\u0159es Internet. (http://europa.eu.int/eur-lex/lex) -------------------------------------------------- "}]}
text2text-generation
SEBIS/legal_t5_small_summ_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization Cszech model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_cs model
================================

Model for summarization of legal text written in Czech. It was first released in this repository. This model is trained on a parallel corpus from jrc-acquis.

Model description
-----------------

legal\_t5\_small\_summ\_cs is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for summarization of legal texts written in Czech.

### How to use

Here is how to use this model to summarize legal text written in Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_summ\_cs model was trained on the JRC-ACQUIS dataset consisting of 18 thousand texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the summarization test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 68, 155, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Cszech model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_cs model was trained on JRC-ACQUIS dataset consisting of 18 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.11413479596376419, 0.09702921658754349, -0.0023106213193386793, 0.054319217801094055, 0.08987230062484741, 0.027030261233448982, 0.07465356588363647, 0.10320234298706055, -0.07461323589086533, 0.07481224089860916, 0.07188806682825089, 0.025608250871300697, 0.08497494459152222, 0.12635642290115356, 0.02365574985742569, -0.20617368817329407, 0.012681893073022366, -0.01747109368443489, -0.008912530727684498, 0.15632832050323486, 0.12271346151828766, -0.10082854330539703, 0.030124615877866745, -0.032982949167490005, -0.14001090824604034, -0.025458648800849915, -0.0512952022254467, -0.06802263855934143, 0.059586331248283386, 0.024989310652017593, 0.12248896062374115, 0.022736085578799248, 0.0851958766579628, -0.17290320992469788, 0.0005183478933759034, 0.08606654405593872, 0.05185587331652641, 0.05458473041653633, 0.0574970468878746, -0.017418203875422478, 0.15060041844844818, -0.021733803674578667, 0.08340433984994888, 0.02565756067633629, -0.12965446710586548, -0.11339446157217026, -0.06576795130968094, 0.05238301679491997, 0.15769687294960022, 0.14021503925323486, -0.059183232486248016, 0.09128323942422867, -0.127201646566391, 0.061785127967596054, 0.0618506595492363, -0.23679864406585693, -0.07709234952926636, 0.052629876881837845, 0.04004225134849548, 0.06870601326227188, -0.07958818227052689, -0.06019056588411331, 0.02984590083360672, 0.018472256138920784, 0.031259141862392426, -0.0222180113196373, -0.001963662449270487, 0.0069035934284329414, -0.17542611062526703, -0.09886182844638824, 0.20103690028190613, 0.0073538911528885365, -0.06455571204423904, -0.08375957608222961, -0.027909882366657257, -0.15444079041481018, 0.019707228988409042, -0.036977581679821014, 0.03330142796039581, -0.008399595506489277, -0.00322935008443892, 0.01276534516364336, -0.12263616174459457, -0.10967507213354111, 0.046336155384778976, 0.06978531181812286, 0.0731482282280922, 0.000942109851166606, -0.009755203500390053, 0.15724562108516693, 0.02661694586277008, -0.0971599891781807, -0.03035127744078636, -0.002057662233710289, -0.1552180051803589, -0.05277077108621597, -0.03446874022483826, -0.1202666237950325, -0.057706840336322784, 0.12458749115467072, -0.008665647357702255, 0.07363367825746536, 0.04594597965478897, 0.032294586300849915, 0.03097485937178135, 0.14494943618774414, -0.05383221060037613, -0.045351553708314896, -0.05821702629327774, 0.0678257942199707, -0.06754262000322342, -0.0033743451349437237, 0.005980365443974733, 0.023550312966108322, 0.07889214158058167, 0.058422330766916275, -0.08042921125888824, 0.019275782629847527, -0.0528150238096714, -0.040300436317920685, 0.016845621168613434, -0.12023842334747314, -0.03576013073325157, 0.011661242693662643, -0.08935514837503433, -0.058781370520591736, 0.07959539443254471, -0.016532158479094505, -0.10606871545314789, 0.07686275243759155, -0.04852214828133583, -0.027965912595391273, -0.13187053799629211, -0.0725288987159729, -0.024810126051306725, -0.060244832187891006, -0.04846075549721718, -0.05178011208772659, -0.1578138768672943, -0.11058459430932999, 0.06803889572620392, -0.03425467386841774, -0.06396204978227615, -0.0665443167090416, -0.026830948889255524, 0.005668530240654945, -0.01931522600352764, 0.13588134944438934, -0.01956690475344658, 0.09869038313627243, 0.011736585758626461, 0.03148658573627472, 0.1619848608970642, 0.07421885430812836, -0.10471971333026886, 0.02453429065644741, -0.08669265359640121, 0.1433783918619156, -0.019967420026659966, 0.011600647121667862, -0.152593731880188, -0.07695489376783371, -0.05854712054133415, 
0.03766413405537605, 0.07742622494697571, 0.10806877166032791, -0.14917229115962982, -0.0048997364938259125, 0.18642662465572357, -0.09412059932947159, -0.05702192708849907, 0.08900745958089828, -0.059006381779909134, 0.13517262041568756, 0.07783826440572739, 0.15811696648597717, 0.08031269162893295, -0.07186374068260193, -0.008055135607719421, -0.03461984917521477, 0.005064389668405056, 0.014736880548298359, 0.0879867821931839, -0.027652254328131676, -0.10588858276605606, -0.02135610580444336, -0.10068970173597336, 0.0016188176814466715, -0.0670163705945015, -0.06792687624692917, 0.008132671006023884, -0.05984032526612282, -0.04344332218170166, 0.06294236332178116, 0.028605258092284203, -0.04185272008180618, -0.12649311125278473, -0.008621284738183022, 0.11019095778465271, -0.06620236486196518, 0.009323843754827976, -0.07466662675142288, -0.04621926695108414, -0.05261221528053284, -0.015103096142411232, -0.17763979732990265, 0.03232031688094139, 0.04872005060315132, -0.005703098140656948, 0.028133917599916458, 0.047660358250141144, 0.02658616006374359, 0.03672829642891884, -0.00750702666118741, -0.05138799548149109, -0.04907465726137161, -0.03339739143848419, -0.12625561654567719, -0.13320665061473846, -0.0505276583135128, -0.023900415748357773, 0.14980967342853546, -0.2007913589477539, 0.036429598927497864, -0.06444409489631653, 0.03608952835202217, -0.011885667219758034, -0.06731090694665909, 0.04306731000542641, 0.019692150875926018, 0.019846389070153236, -0.06697319447994232, 0.03502700477838516, 0.04981222003698349, 0.013361039571464062, 0.0324215367436409, -0.125682532787323, -0.16499266028404236, 0.0765947550535202, 0.06271221488714218, -0.16525299847126007, -0.0006432259106077254, -0.05138105899095535, -0.06513604521751404, -0.07450307160615921, 0.008107601664960384, 0.21675850450992584, 0.004701925907284021, 0.12299615889787674, -0.09992807358503342, -0.060491595417261124, -0.009195110760629177, -0.012923061847686768, 0.03088648058474064, 0.12524834275245667, 0.07813775539398193, -0.11029431968927383, 0.04786341264843941, 0.021014925092458725, -0.02127641998231411, 0.12570735812187195, -0.01050772424787283, -0.12034996598958969, -0.007746416609734297, 0.06082510948181152, -0.023041781038045883, 0.10072298347949982, -0.16232219338417053, 0.011128305457532406, 0.019193824380636215, 0.022416789084672928, 0.034007906913757324, -0.1555584818124771, 0.020532842725515366, 0.0552135668694973, -0.030301235616207123, -0.0003468140203040093, -0.03326542675495148, -0.04118485748767853, 0.07125174254179001, 0.022873828187584877, -0.006736003328114748, -0.007997945882380009, -0.04293471947312355, -0.14948542416095734, 0.21582342684268951, -0.05004235729575157, -0.14074884355068207, -0.08169959485530853, 0.10148561745882034, 0.079693503677845, -0.012325840070843697, 0.028465276584029198, -0.0670761913061142, -0.050184790045022964, -0.1008625403046608, 0.07891349494457245, -0.05680660903453827, -0.02434246428310871, -0.06753066927194595, -0.0022147155832499266, 0.00897125992923975, -0.11533726006746292, 0.03684423491358757, -0.04209526255726814, -0.0869242399930954, 0.020380226895213127, -0.03646007180213928, 0.07109875977039337, 0.1788429617881775, -0.00538612948730588, 0.02815653197467327, -0.009725713171064854, 0.18949727714061737, -0.13007843494415283, 0.012040968053042889, 0.06515322625637054, 0.005293176043778658, 0.004950985312461853, 0.09328389912843704, -0.006500956602394581, -0.09258389472961426, 0.06719905138015747, 0.05851082131266594, -0.03759031370282173, -0.2733004093170166, 
-0.01450863666832447, -0.02965141087770462, -0.04313783347606659, 0.12403237819671631, 0.034903425723314285, 0.026532737538218498, 0.05086590722203255, -0.029638027772307396, -0.012425633147358894, 0.023100925609469414, 0.05593137815594673, -0.04945249482989311, 0.012983245775103569, 0.08188380300998688, -0.052558328956365585, -0.002865638816729188, 0.04197164624929428, -0.012855841778218746, 0.25632426142692566, -0.05063565447926521, 0.10793214291334152, 0.09103046357631683, 0.11679167300462723, 0.017358174547553062, 0.052213747054338455, -0.02694503776729107, 0.02156449854373932, 0.004367974121123552, -0.032068051397800446, -0.08375818282365799, 0.04995599016547203, 0.009467609226703644, 0.0271714199334383, -0.09754010289907455, -0.0047411322593688965, 0.007093570660799742, 0.34706565737724304, 0.0685652419924736, -0.2414221465587616, -0.07771897315979004, 0.006739291828125715, -0.06106667220592499, -0.0903710424900055, 0.055737122893333435, 0.08152022957801819, -0.12973624467849731, 0.0008075074874795973, -0.043560586869716644, 0.09317434579133987, -0.08923113346099854, -0.045414697378873825, 0.08142291009426117, 0.04389616847038269, -0.01949373446404934, 0.09018582850694656, -0.27327245473861694, 0.1841052770614624, -0.011591644957661629, 0.12095652520656586, -0.03342920169234276, 0.020366204902529716, -0.052614759653806686, -0.0014429670991376042, 0.15340040624141693, -0.004272916819900274, -0.004858078900724649, -0.07023320347070694, -0.09766162186861038, 0.02343493327498436, 0.0501682311296463, -0.08414189517498016, 0.10587407648563385, 0.005607612896710634, 0.032850757241249084, 0.012476980686187744, -0.10849382728338242, -0.11974430829286575, -0.09936682879924774, 0.012010959908366203, -0.09356088191270828, 0.03555356711149216, -0.05751383677124977, -0.042857732623815536, 0.024726783856749535, 0.15424981713294983, -0.13927242159843445, -0.07216938585042953, -0.10013126581907272, 0.04512719810009003, 0.09066733717918396, -0.03681434690952301, -0.013190160505473614, 0.020089177414774895, 0.044646136462688446, 0.01635652221739292, 0.0056469193659722805, 0.08333522081375122, -0.05392517149448395, -0.15950767695903778, -0.04797165095806122, 0.13144521415233612, 0.13158412277698517, 0.06724349409341812, -0.03327327221632004, 0.03028210811316967, -0.010129313915967941, -0.07768721133470535, 0.015889795497059822, 0.01010839082300663, 0.03105342388153076, 0.02463715709745884, -0.03829379379749298, 0.0150025375187397, -0.08600848913192749, -0.041527941823005676, 0.08480363339185715, 0.1264144331216812, -0.04873272776603699, 0.06798284500837326, 0.14244277775287628, -0.11040772497653961, -0.1705111712217331, 0.05074041709303856, 0.09611654281616211, 0.06218569353222847, -0.06835410743951797, -0.21938788890838623, 0.06160195544362068, 0.08945267647504807, 0.0027209892868995667, -0.037097834050655365, -0.3981952965259552, -0.12203119695186615, 0.09873227775096893, 0.0814092829823494, -0.07085756957530975, -0.10466507077217102, -0.01998608559370041, 0.05164759233593941, -0.0017738197930157185, 0.10897348821163177, -0.04490191116929054, 0.09195363521575928, 0.01181385014206171, -0.06124788895249367, 0.045393507927656174, -0.07060309499502182, 0.10660277307033539, 0.07654014974832535, 0.06681640446186066, -0.04390469193458557, 0.011879806406795979, 0.036289360374212265, -0.03967464715242386, 0.1381491869688034, 0.043397314846515656, 0.052996810525655746, -0.2084338665008545, -0.05954949930310249, -0.08793172985315323, -0.004224024713039398, -0.07829970866441727, -0.05349120870232582, 
-0.05234227329492569, 0.09435897320508957, 0.05736139789223671, -0.01859603077173233, 0.012941326946020126, -0.06176254525780678, 0.015184082090854645, 0.08815747499465942, 0.11563417315483093, 0.07808371633291245, -0.10133690387010574, 0.025494834408164024, 0.044036753475666046, 0.08904928714036942, -0.18875767290592194, -0.016736838966608047, 0.11577900499105453, 0.020520396530628204, 0.14618638157844543, 0.004024524241685867, -0.14050926268100739, -0.006373337935656309, 0.0373629592359066, -0.1078791618347168, -0.11210352182388306, -0.010393760167062283, -0.018362408503890038, -0.11440970003604889, -0.061388902366161346, 0.058507028967142105, -0.10295801609754562, -0.00662977434694767, -0.013410878367722034, 0.049869321286678314, -0.058730702847242355, 0.2119527906179428, 0.05289545655250549, 0.07333105057477951, -0.07244695723056793, 0.12025512754917145, 0.12246407568454742, -0.10389626771211624, 0.020126845687627792, 0.16383428871631622, -0.0900362879037857, -0.04777112230658531, 0.017941178753972054, 0.12544302642345428, 0.004759030416607857, -0.05290674790740013, -0.05102837085723877, -0.06225043162703514, 0.07534319162368774, 0.04014243558049202, 0.04327140375971794, 0.04421532154083252, -0.023943960666656494, 0.006795233581215143, -0.14922472834587097, 0.07910130172967911, 0.08239449560642242, 0.014456290751695633, -0.02266331948339939, 0.17629411816596985, 0.04985152557492256, 0.04235346242785454, -0.00679054856300354, -0.0502607636153698, -0.06182819977402687, 0.06686031818389893, 0.009626046754419804, -0.013427047058939934, -0.05722586438059807, -0.006808225065469742, -0.010891088284552097, -0.001486124936491251, -0.0019100407371297479, 0.02006932534277439, -0.05594499036669731, -0.029381388798356056, -0.026818232610821724, 0.05085643380880356, -0.07763485610485077, -0.006921646185219288, 0.006814422085881233, -0.07908589392900467, 0.07532655447721481, 0.01428248081356287, 0.0055665625259280205, -0.007228286936879158, -0.02483453042805195, 0.06637351214885712, -0.01860690303146839, 0.007695562206208706, -0.028799040243029594, -0.1194777637720108, 0.035554759204387665, -0.02703053504228592, -0.024346109479665756, 0.000845064059831202, 0.049277644604444504, -0.13600203394889832, 0.06363481283187866, -0.023019712418317795, -0.010053894482553005, -0.08064211905002594, 0.09858909249305725, 0.027006933465600014, 0.07723017781972885, 0.11012386530637741, -0.052228864282369614, 0.07105185091495514, -0.16638784110546112, -0.04649801179766655, 0.024855710566043854, 0.027713818475604057, -0.07015478610992432, -0.0510319247841835, 0.07607612013816833, -0.055199943482875824, 0.05441705137491226, 0.047895848751068115, 0.028859537094831467, 0.04522910341620445, -0.08799615502357483, 0.011123833246529102, 0.054215822368860245, 0.06578073650598526, 0.0013074349844828248, -0.0062685455195605755, 0.0450107604265213, 0.0332217738032341, -0.023010121658444405, 0.05682633817195892, 0.1564783900976181, 0.21214672923088074, 0.07910232990980148, 0.07008355855941772, -0.042442284524440765, -0.1114540547132492, -0.07604984194040298, 0.16398975253105164, -0.02379327081143856, 0.030764726921916008, -0.012747995555400848, 0.1141451895236969, 0.08151819556951523, -0.16358205676078796, 0.06702058017253876, -0.017983179539442062, -0.11349068582057953, -0.08427415043115616, -0.0651138499379158, -0.03891396522521973, -0.04383133351802826, -0.004760608542710543, -0.10502446442842484, 0.022266913205385208, 0.09151183813810349, 0.04938710853457451, -0.02731260284781456, 0.12963005900382996, 0.017121179029345512, 
-0.04182141274213791, 0.09155269712209702, -0.000813654565718025, 0.07775703072547913, -0.09456807374954224, 0.0050230263732373714, 0.040975943207740784, -0.006694403477013111, 0.06781857460737228, -0.002900527324527502, -0.015580672770738602, 0.03978186473250389, 0.04078008607029915, -0.0700380951166153, 0.017305292189121246, 0.0191186610609293, 0.10813634842634201, 0.06808207929134369, 0.06683345884084702, -0.025558538734912872, -0.019139250740408897, 0.2117396742105484, -0.04353836551308632, -0.08109646290540695, -0.17712445557117462, 0.17136822640895844, 0.05868876352906227, -0.006905431393533945, 0.05358860641717911, -0.09989762306213379, 0.017557241022586823, 0.2176143079996109, 0.11621227115392685, -0.03587877005338669, -0.029782351106405258, 0.025510616600513458, -0.006370241288095713, 0.013754000887274742, 0.11099788546562195, 0.04098252207040787, 0.15812262892723083, -0.09586431086063385, 0.03341948613524437, -0.06734566390514374, -0.0740121379494667, 0.016379568725824356, 0.13668392598628998, 0.025748861953616142, -0.004746475722640753, -0.05458642914891243, 0.10515079647302628, -0.007210275623947382, -0.19683977961540222, 0.08081167191267014, -0.07256797701120377, -0.13249683380126953, -0.0247181486338377, -0.010214956477284431, -0.013054977171123028, 0.04341394081711769, -0.004914619028568268, -0.010730801150202751, 0.12326309084892273, 0.03621971234679222, -0.06277836859226227, -0.14449766278266907, 0.09667473286390305, -0.026434483006596565, 0.16838636994361877, -0.008122062310576439, 0.07777407020330429, 0.08174309134483337, 0.017368467524647713, -0.11406118422746658, 0.06792358309030533, 0.059178903698921204, 0.019530074670910835, 0.043850503861904144, 0.13862451910972595, -0.027443112805485725, 0.1433175951242447, 0.04370693489909172, -0.11238549649715424, 0.034975603222846985, -0.14555864036083221, -0.036851666867733, -0.1772392988204956, 0.02293754555284977, -0.03617721050977707, 0.1407616138458252, 0.22195976972579956, -0.041879281401634216, 0.005430946126580238, -0.05623047798871994, 0.03267015889286995, -0.02417348325252533, 0.1499016135931015, 0.009972791187465191, -0.1834338754415512, 0.014641747809946537, -0.08113568276166916, 0.04057399928569794, -0.21721242368221283, -0.03591421619057655, 0.029662679880857468, -0.0846327394247055, -0.01637653075158596, 0.1324852854013443, 0.033397749066352844, 0.06302421540021896, -0.04775508865714073, 0.0037552393041551113, -0.012876057997345924, 0.15124624967575073, -0.13227181136608124, -0.0976957380771637 ]
null
null
transformers
# legal_t5_small_summ_de model

Model for summarization of legal text written in Deutsch. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis.

## Model description

legal_t5_small_summ_de is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for summarization of legal texts written in Deutsch.

### How to use

Here is how to use this model to summarize legal text written in Deutsch in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_de"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_summ_de", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "(90/365/EWG) DER RAT DER EUROPÄISCHEN GEMEINSCHAFTEN - gestützt auf den Vertrag zur Gründung der Europäischen Wirtschaftsgemeinschaft, insbesondere auf Artikel 235, auf Vorschlag der Kommission (1), nach Stellungnahme des Europäischen Parlaments (2), nach Stellungnahme des Wirtschafts- und Sozialausschusses (3), in Erwägung nachstehender Gründe: Gemäß Artikel 3 Buchstabe c) des Vertrages umfasst die Tätigkeit der Gemeinschaft, nach Maßgabe des Vertrages, die Beseitigung der Hindernisse für den freien Personenverkehr zwischen den Mitgliedstaaten. Artikel 8a des Vertrages sieht vor, daß der Binnenmarkt bis zum 31. Dezember 1992 zu verwirklichen ist. Der Binnenmarkt umfasst einen Raum ohne Binnengrenzen, in dem der freie Verkehr von Waren, Personen, Dienstleistungen und Kapital gemäß den Bestimmungen des Vertrages gewährleistet ist. Die Artikel 48 und 52 des Vertrages sehen die Freizuegigkeit der Arbeitnehmer und selbständig Erwerbstätigen vor, was ein Recht auf Aufenthalt in dem Mitgliedstaat beinhaltet, in dem sie ihr Berufsleben verbringen. Es empfiehlt sich, dieses Aufenthaltsrecht auch Personen zu gewähren, die aus dem Erwerbsleben ausgeschieden sind, auch wenn sie während ihres Berufslebens von dem Recht auf Freizuegigkeit keinen Gebrauch gemacht haben. Die Aufenthaltsberechtigten dürfen die öffentlichen Finanzen des Aufnahmemitgliedstaates nicht über Gebühr belasten. Nach Artikel 10 der Verordnung (EWG) Nr. 1408/71 (4) in der Fassung der Verordnung (EWG) Nr. 1390/81 (5) haben die Empfänger von Geldleistungen bei Invalidität und Alter und die Bezieher von Renten bei Arbeitsunfällen oder Berufskrankheiten auch dann weiterhin Anspruch auf diese Leistungen und Renten, wenn sie im Gebiet eines anderen Mitgliedstaates als des Staates wohnen, auf dessen Gebiet der zur Zahlung verpflichtete Träger seinen Sitz hat. Die Ausübung des Aufenthaltsrechts wird erst dann eine reale Möglichkeit, wenn es auch den Familienangehörigen zugestanden wird. Für die von dieser Richtlinie Begünstigten sollte eine Verwaltungsregelung entsprechend der insbesondere in der Richtlinie 68/360/EWG (6) und in der Richtlinie 64/221/EWG (7) vorgesehenen Regelung gelten. 
Der Vertrag enthält Befugnisse für den Erlaß der vorliegenden Richtlinie nur in Artikel 235 - HAT FOLGENDE RICHTLINIE ERLASSEN: Artikel 1 (1) Die Mitgliedstaaten gewähren den Angehörigen der Mitgliedstaaten, die in der Gemeinschaft eine Tätigkeit als Arbeitnehmer oder als Selbständige ausgeuebt haben, sowie deren Familienangehörigen nach der Definition von Absatz 2 unter der Bedingung das Aufenthaltsrecht, daß sie eine Invaliditäts-, Vorruhestands- oder Altersrente oder eine Rente wegen Arbeitsunfalls oder Berufskrankheit in einer solchen Höhe beziehen, daß sie während ihres Aufenthalts nicht die Sozialhilfe des Aufnahmemitgliedstaats in Anspruch nehmen müssen, und einen Krankenversicherungsschutz genießen, der im Aufnahmemitgliedstaat alle Risiken abdeckt. Die Existenzmittel des Antragstellers gelten als ausreichend, wenn sie einen Betrag übersteigen, unterhalb dessen der Aufnahmemitgliedstaat seinen Staatsangehörigen aufgrund der persönlichen Situation des Antragstellers und gegebenenfalls der Situation der nach Absatz 2 aufgenommenen Personen Sozialhilfe gewähren kann. Ist Unterabsatz 2 in einem Mitgliedstaat nicht anwendbar, so gelten die Existenzmittel des Antragstellers als ausreichend, wenn sie den Betrag der Grundrente der Sozialversicherung übersteigen, die der Aufnahmemitgliedstaat zahlt. (2) Bei dem Aufenthaltsberechtigten dürfen folgende Personen ungeachtet ihrer Staatsangehörigkeit in einem anderen Mitgliedstaat Wohnung nehmen: a) sein Ehegatte sowie die Verwandten in absteigender Linie, denen Unterhalt gewährt wird; b) seine Verwandten und die Verwandten seines Ehegatten in aufsteigender Linie, denen er Unterhalt gewährt. Artikel 2 (1) Zum Nachweis des Aufenthaltsrechts wird eine Bescheinigung, die »Aufenthaltserlaubnis für Staatsangehörige eines EWG-Mitgliedstaates%quot%, erteilt, deren Gültigkeit auf fünf Jahre mit Verlängerungsmöglichkeit begrenzt werden kann. Die Mitgliedstaaten können jedoch die Erneuerung der Aufenthaltserlaubnis nach den ersten zwei Aufenthaltsjahren verlangen, wenn sie dies für erforderlich halten. Einem Familienmitglied, das nicht die Staatsangehörigkeit eines Mitgliedstaats besitzt, wird ein Aufenthaltsdokument mit der gleichen Gültigkeitsdauer ausgestellt wie dem Staatsangehörigen, von dem es seine Rechte herleitet. Für die Erteilung der Aufenthaltserlaubnis oder des Aufenthaltsdokuments darf der Mitgliedstaat vom Antragsteller nur die Vorlage eines gültigen Personalausweises bzw. Reisepasses sowie den Nachweis verlangen, daß er die Voraussetzungen des Artikels 1 erfuellt. (2) Die Artikel 2 und 3, Artikel 6 Absatz 1 Buchstabe a) und Absatz 2 sowie Artikel 9 der Richtlinie 68/360/EWG finden auf die von dieser Richtlinie Begünstigten entsprechende Anwendung. Der Ehegatte eines Staatsangehörigen eines Mitgliedstaats, der im Hoheitsgebiet eines Mitgliedstaats aufenthaltsberechtigt ist, sowie die Kinder dieses Staatsangehörigen, denen er Unterhalt gewährt, haben, auch wenn sie die Staatsangehörigkeit eines Mitgliedstaats nicht besitzen, das Recht, im gesamten Hoheitsgebiet dieses Mitgliedstaats jedwede Tätigkeit im Lohn- oder Gehaltsverhältnis oder jedwede selbständige Erwerbstätigkeit auszuüben. Die Mitgliedstaaten dürfen nur aus Gründen der öffentlichen Ordnung, der öffentlichen Sicherheit oder der Volksgesundheit von den Bestimmungen dieser Richtlinie abweichen. In diesem Fall findet die Richtlinie 64/221/EWG Anwendung. (3) Die vorliegende Richtlinie berührt nicht die geltenden Rechtsvorschriften für den Erwerb von Zweitwohnungen. 
Artikel 3 Das Aufenthaltsrecht besteht, solange die Berechtigten die Bedingungen des Artikels 1 erfuellen. Artikel 4 Die Kommission arbeitet spätestens drei Jahre nach dem Beginn der Anwendung dieser Richtlinie und anschließend alle drei Jahre einen Bericht über ihre Anwendung aus und legt ihn dem Europäischen Parlament und dem Rat vor. Artikel 5 Die Mitgliedstaaten setzen die erforderlichen Rechts- und Verwaltungsvorschriften in Kraft, um dieser Richtlinie bis spätestens 30. Juni 1992 nachzukommen. Sie setzen die Kommission unverzueglich davon in Kenntnis. Artikel 6 Diese Richtlinie ist an die Mitgliedstaaten gerichtet. Geschehen zu Luxemburg am 28. Juni 1990. Im Namen des Rates Der Präsident M. GEOGHEGAN-QUINN (1) ABl. Nr. C 191 vom 28. 7. 1989, S. 3 und ABl. Nr. C 26 vom 3. 2. 1990, S. 19. (2) Stellungnahme vom 13. Juni 1990 (noch nicht im Amtsblatt veröffentlicht). (3) ABl. Nr. C 329 vom 30. 12. 1989, S. 25. (4) ABl. Nr. L 149 vom 5. 7. 1971, S. 2. (5) ABl. Nr. L 143 vom 29. 5. 1981, S. 1. (6) ABl. Nr. L 257 vom 19. 10. 1968, S. 13. (7) ABl. Nr. 56 vom 4. 4. 1964, S. 850/64. "

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_summ_de model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60 million parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the summarization test dataset, it achieves the following results:

Test results:

| Model | Rouge1 | Rouge2 | Rouge Lsum |
|:-----:|:------:|:------:|:----------:|
| legal_t5_small_summ_de | 78.03 | 68.84 | 76.95 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
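The vocabulary described under Preprocessing above can be reproduced in spirit with the SentencePiece library; a minimal sketch, where the corpus file name and the 32k vocabulary size are assumptions, not values stated in the card:

```python
import sentencepiece as spm

# Train a unigram subword model on the parallel corpus.
# "parallel_corpus.txt" and vocab_size=32000 are illustrative assumptions.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_vocab",
    model_type="unigram",
    vocab_size=32000,
)

# Load the trained model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Die Mitgliedstaaten setzen die Kommission unverzueglich davon in Kenntnis.", out_type=str))
```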
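The Rouge1/Rouge2/Rouge-Lsum figures in the evaluation table above can be computed with the `rouge_score` package; a minimal sketch (the reference and generated summaries here are placeholders, not drawn from the actual test split):

```python
from rouge_score import rouge_scorer

# The same three metrics reported in the table above.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeLsum"], use_stemmer=True)

reference = "Richtlinie ueber das Aufenthaltsrecht der aus dem Erwerbsleben ausgeschiedenen Arbeitnehmer."  # placeholder gold summary
generated = "Richtlinie ueber das Aufenthaltsrecht ausgeschiedener Arbeitnehmer und Selbstaendiger."  # placeholder model output

# score(target, prediction) returns precision/recall/F1 per Rouge type.
scores = scorer.score(reference, generated)
for name, score in scores.items():
    print(f"{name}: F1 = {score.fmeasure:.4f}")
```

In practice the corpus-level numbers in the table would be averages of these per-document F1 scores over the whole test set.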
{"language": "Deustch", "tags": ["summarization Deustch model"], "datasets": ["jrc-acquis"], "widget": [{"text": "(90/365/EWG) DER RAT DER EUROP\u00c4ISCHEN GEMEINSCHAFTEN - gest\u00fctzt auf den Vertrag zur Gr\u00fcndung der Europ\u00e4ischen Wirtschaftsgemeinschaft, insbesondere auf Artikel 235, auf Vorschlag der Kommission (1), nach Stellungnahme des Europ\u00e4ischen Parlaments (2), nach Stellungnahme des Wirtschafts- und Sozialausschusses (3), in Erw\u00e4gung nachstehender Gr\u00fcnde: Gem\u00e4\u00df Artikel 3 Buchstabe c) des Vertrages umfasst die T\u00e4tigkeit der Gemeinschaft, nach Ma\u00dfgabe des Vertrages, die Beseitigung der Hindernisse f\u00fcr den freien Personenverkehr zwischen den Mitgliedstaaten. Artikel 8a des Vertrages sieht vor, da\u00df der Binnenmarkt bis zum 31. Dezember 1992 zu verwirklichen ist. Der Binnenmarkt umfasst einen Raum ohne Binnengrenzen, in dem der freie Verkehr von Waren, Personen, Dienstleistungen und Kapital gem\u00e4\u00df den Bestimmungen des Vertrages gew\u00e4hrleistet ist. Die Artikel 48 und 52 des Vertrages sehen die Freizuegigkeit der Arbeitnehmer und selbst\u00e4ndig Erwerbst\u00e4tigen vor, was ein Recht auf Aufenthalt in dem Mitgliedstaat beinhaltet, in dem sie ihr Berufsleben verbringen. Es empfiehlt sich, dieses Aufenthaltsrecht auch Personen zu gew\u00e4hren, die aus dem Erwerbsleben ausgeschieden sind, auch wenn sie w\u00e4hrend ihres Berufslebens von dem Recht auf Freizuegigkeit keinen Gebrauch gemacht haben. Die Aufenthaltsberechtigten d\u00fcrfen die \u00f6ffentlichen Finanzen des Aufnahmemitgliedstaates nicht \u00fcber Geb\u00fchr belasten. Nach Artikel 10 der Verordnung (EWG) Nr. 1408/71 (4) in der Fassung der Verordnung (EWG) Nr. 1390/81 (5) haben die Empf\u00e4nger von Geldleistungen bei Invalidit\u00e4t und Alter und die Bezieher von Renten bei Arbeitsunf\u00e4llen oder Berufskrankheiten auch dann weiterhin Anspruch auf diese Leistungen und Renten, wenn sie im Gebiet eines anderen Mitgliedstaates als des Staates wohnen, auf dessen Gebiet der zur Zahlung verpflichtete Tr\u00e4ger seinen Sitz hat. Die Aus\u00fcbung des Aufenthaltsrechts wird erst dann eine reale M\u00f6glichkeit, wenn es auch den Familienangeh\u00f6rigen zugestanden wird. F\u00fcr die von dieser Richtlinie Beg\u00fcnstigten sollte eine Verwaltungsregelung entsprechend der insbesondere in der Richtlinie 68/360/EWG (6) und in der Richtlinie 64/221/EWG (7) vorgesehenen Regelung gelten. Der Vertrag enth\u00e4lt Befugnisse f\u00fcr den Erla\u00df der vorliegenden Richtlinie nur in Artikel 235 - HAT FOLGENDE RICHTLINIE ERLASSEN: Artikel 1 (1) Die Mitgliedstaaten gew\u00e4hren den Angeh\u00f6rigen der Mitgliedstaaten, die in der Gemeinschaft eine T\u00e4tigkeit als Arbeitnehmer oder als Selbst\u00e4ndige ausgeuebt haben, sowie deren Familienangeh\u00f6rigen nach der Definition von Absatz 2 unter der Bedingung das Aufenthaltsrecht, da\u00df sie eine Invalidit\u00e4ts-, Vorruhestands- oder Altersrente oder eine Rente wegen Arbeitsunfalls oder Berufskrankheit in einer solchen H\u00f6he beziehen, da\u00df sie w\u00e4hrend ihres Aufenthalts nicht die Sozialhilfe des Aufnahmemitgliedstaats in Anspruch nehmen m\u00fcssen, und einen Krankenversicherungsschutz genie\u00dfen, der im Aufnahmemitgliedstaat alle Risiken abdeckt. 
Die Existenzmittel des Antragstellers gelten als ausreichend, wenn sie einen Betrag \u00fcbersteigen, unterhalb dessen der Aufnahmemitgliedstaat seinen Staatsangeh\u00f6rigen aufgrund der pers\u00f6nlichen Situation des Antragstellers und gegebenenfalls der Situation der nach Absatz 2 aufgenommenen Personen Sozialhilfe gew\u00e4hren kann. Ist Unterabsatz 2 in einem Mitgliedstaat nicht anwendbar, so gelten die Existenzmittel des Antragstellers als ausreichend, wenn sie den Betrag der Grundrente der Sozialversicherung \u00fcbersteigen, die der Aufnahmemitgliedstaat zahlt. (2) Bei dem Aufenthaltsberechtigten d\u00fcrfen folgende Personen ungeachtet ihrer Staatsangeh\u00f6rigkeit in einem anderen Mitgliedstaat Wohnung nehmen: a) sein Ehegatte sowie die Verwandten in absteigender Linie, denen Unterhalt gew\u00e4hrt wird; b) seine Verwandten und die Verwandten seines Ehegatten in aufsteigender Linie, denen er Unterhalt gew\u00e4hrt. Artikel 2 (1) Zum Nachweis des Aufenthaltsrechts wird eine Bescheinigung, die \u00bbAufenthaltserlaubnis f\u00fcr Staatsangeh\u00f6rige eines EWG-Mitgliedstaates%quot%, erteilt, deren G\u00fcltigkeit auf f\u00fcnf Jahre mit Verl\u00e4ngerungsm\u00f6glichkeit begrenzt werden kann. Die Mitgliedstaaten k\u00f6nnen jedoch die Erneuerung der Aufenthaltserlaubnis nach den ersten zwei Aufenthaltsjahren verlangen, wenn sie dies f\u00fcr erforderlich halten. Einem Familienmitglied, das nicht die Staatsangeh\u00f6rigkeit eines Mitgliedstaats besitzt, wird ein Aufenthaltsdokument mit der gleichen G\u00fcltigkeitsdauer ausgestellt wie dem Staatsangeh\u00f6rigen, von dem es seine Rechte herleitet. F\u00fcr die Erteilung der Aufenthaltserlaubnis oder des Aufenthaltsdokuments darf der Mitgliedstaat vom Antragsteller nur die Vorlage eines g\u00fcltigen Personalausweises bzw. Reisepasses sowie den Nachweis verlangen, da\u00df er die Voraussetzungen des Artikels 1 erfuellt. (2) Die Artikel 2 und 3, Artikel 6 Absatz 1 Buchstabe a) und Absatz 2 sowie Artikel 9 der Richtlinie 68/360/EWG finden auf die von dieser Richtlinie Beg\u00fcnstigten entsprechende Anwendung. Der Ehegatte eines Staatsangeh\u00f6rigen eines Mitgliedstaats, der im Hoheitsgebiet eines Mitgliedstaats aufenthaltsberechtigt ist, sowie die Kinder dieses Staatsangeh\u00f6rigen, denen er Unterhalt gew\u00e4hrt, haben, auch wenn sie die Staatsangeh\u00f6rigkeit eines Mitgliedstaats nicht besitzen, das Recht, im gesamten Hoheitsgebiet dieses Mitgliedstaats jedwede T\u00e4tigkeit im Lohn- oder Gehaltsverh\u00e4ltnis oder jedwede selbst\u00e4ndige Erwerbst\u00e4tigkeit auszu\u00fcben. Die Mitgliedstaaten d\u00fcrfen nur aus Gr\u00fcnden der \u00f6ffentlichen Ordnung, der \u00f6ffentlichen Sicherheit oder der Volksgesundheit von den Bestimmungen dieser Richtlinie abweichen. In diesem Fall findet die Richtlinie 64/221/EWG Anwendung. (3) Die vorliegende Richtlinie ber\u00fchrt nicht die geltenden Rechtsvorschriften f\u00fcr den Erwerb von Zweitwohnungen. Artikel 3 Das Aufenthaltsrecht besteht, solange die Berechtigten die Bedingungen des Artikels 1 erfuellen. Artikel 4 Die Kommission arbeitet sp\u00e4testens drei Jahre nach dem Beginn der Anwendung dieser Richtlinie und anschlie\u00dfend alle drei Jahre einen Bericht \u00fcber ihre Anwendung aus und legt ihn dem Europ\u00e4ischen Parlament und dem Rat vor. Artikel 5 Die Mitgliedstaaten setzen die erforderlichen Rechts- und Verwaltungsvorschriften in Kraft, um dieser Richtlinie bis sp\u00e4testens 30. Juni 1992 nachzukommen. 
Sie setzen die Kommission unverzueglich davon in Kenntnis. Artikel 6 Diese Richtlinie ist an die Mitgliedstaaten gerichtet. Geschehen zu Luxemburg am 28. Juni 1990. Im Namen des Rates Der Pr\u00e4sident M. GEOGHEGAN-QUINN (1) ABl. Nr. C 191 vom 28. 7. 1989, S. 3 und ABl. Nr. C 26 vom 3. 2. 1990, S. 19. (2) Stellungnahme vom 13. Juni 1990 (noch nicht im Amtsblatt ver\u00f6ffentlicht). (3) ABl. Nr. C 329 vom 30. 12. 1989, S. 25. (4) ABl. Nr. L 149 vom 5. 7. 1971, S. 2. (5) ABl. Nr. L 143 vom 29. 5. 1981, S. 1. (6) ABl. Nr. L 257 vom 19. 10. 1968, S. 13. (7) ABl. Nr. 56 vom 4. 4. 1964, S. 850/64. "}]}
text2text-generation
SEBIS/legal_t5_small_summ_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization Deustch model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_de model
================================

Model for summarization of legal text written in Deutsch. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis.

Model description
-----------------

legal\_t5\_small\_summ\_de is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for summarization of legal texts written in Deutsch.

### How to use

Here is how to use this model to summarize legal text written in Deutsch in PyTorch:

Training data
-------------

The legal\_t5\_small\_summ\_de model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60 million parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the summarization test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 68, 155, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Deustch model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_de model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.10963981598615646, 0.09662772715091705, -0.0026180229615420103, 0.06180949881672859, 0.09098587930202484, 0.028346318751573563, 0.07837538421154022, 0.10181918740272522, -0.08768025040626526, 0.06920461356639862, 0.07489604502916336, 0.031937114894390106, 0.08562220633029938, 0.12664644420146942, 0.027291394770145416, -0.2131434977054596, 0.014193277806043625, -0.015702512115240097, -0.014957877807319164, 0.14432281255722046, 0.12945471704006195, -0.09642541408538818, 0.030094284564256668, -0.03827597200870514, -0.12982122600078583, -0.009080750867724419, -0.05703434720635414, -0.06610413640737534, 0.06580769270658493, 0.029052333906292915, 0.11985862255096436, 0.021087799221277237, 0.0847763642668724, -0.17682363092899323, -0.0007806780049577355, 0.09288707375526428, 0.05154876410961151, 0.04414849728345871, 0.05731752887368202, -0.015910739079117775, 0.14361470937728882, -0.028382351621985435, 0.0890301987528801, 0.022255832329392433, -0.1323641687631607, -0.13290797173976898, -0.06461715698242188, 0.04454435408115387, 0.15675070881843567, 0.1435570865869522, -0.06117452308535576, 0.07740531861782074, -0.11982790380716324, 0.05830186977982521, 0.05089421942830086, -0.22001586854457855, -0.0747804269194603, 0.0479280948638916, 0.04369081184267998, 0.06577251106500626, -0.07373806089162827, -0.06488612294197083, 0.03516416624188423, 0.02926069125533104, 0.029409507289528847, -0.020838843658566475, 0.016753876581788063, 0.006600395310670137, -0.1707409918308258, -0.09104707092046738, 0.19062352180480957, 0.004598760977387428, -0.06909167766571045, -0.08434170484542847, -0.025941653177142143, -0.1499360203742981, 0.024635588750243187, -0.0568055585026741, 0.039362844079732895, -0.004467956256121397, 0.0032881589140743017, 0.01258831936866045, -0.11526697129011154, -0.11015888303518295, 0.03872797638177872, 0.061617687344551086, 0.0801347941160202, -0.005458693485707045, -0.020964907482266426, 0.1583230346441269, 0.017331168055534363, -0.09991224855184555, -0.03330296650528908, -0.006458217743784189, -0.14289461076259613, -0.050605565309524536, -0.03824044391512871, -0.12518663704395294, -0.050575826317071915, 0.14356184005737305, -0.01834295690059662, 0.0732707530260086, 0.0395071804523468, 0.022987496107816696, 0.03636555373668671, 0.14148947596549988, -0.056511275470256805, -0.060010358691215515, -0.05328553169965744, 0.0694427341222763, -0.06148746237158775, 0.0029614418745040894, 0.008170036599040031, 0.02205401100218296, 0.08425125479698181, 0.05822719633579254, -0.07744468003511429, 0.012986253947019577, -0.05481838062405586, -0.04108308628201485, 0.020011182874441147, -0.12064837664365768, -0.03721347823739052, 0.006977024022489786, -0.09191795438528061, -0.05494263768196106, 0.07377919554710388, -0.013414643704891205, -0.10451546311378479, 0.07631959021091461, -0.042775414884090424, -0.04007897526025772, -0.1328708976507187, -0.08162227272987366, -0.025745796039700508, -0.0601203627884388, -0.03614088147878647, -0.05992240458726883, -0.17391866445541382, -0.11255406588315964, 0.06819488108158112, -0.04032697528600693, -0.06168185919523239, -0.07047905027866364, -0.02268696203827858, 0.0032708283979445696, -0.020518489181995392, 0.12838420271873474, -0.018788600340485573, 0.0972052589058876, 0.024035470560193062, 0.03377215936779976, 0.155786395072937, 0.08307565748691559, -0.1005709171295166, 0.023700658231973648, -0.08505810797214508, 0.15419967472553253, -0.016304168850183487, 0.01100807823240757, -0.1562996208667755, -0.07643675804138184, -0.06146266683936119, 
0.04439646378159523, 0.08769043534994125, 0.10732214897871017, -0.15948224067687988, -0.005109718535095453, 0.1814502775669098, -0.0958065614104271, -0.05040799826383591, 0.09082331508398056, -0.0639893114566803, 0.137824147939682, 0.08716609328985214, 0.1628485471010208, 0.08104367554187775, -0.06885837763547897, -0.004466813523322344, -0.04127800837159157, 0.00007456611638190225, 0.02950458973646164, 0.08336324244737625, -0.027340702712535858, -0.09976654499769211, -0.02360132336616516, -0.09453689306974411, 0.0032059168443083763, -0.05997886881232262, -0.06258085370063782, 0.016602519899606705, -0.06060582399368286, -0.04584068804979324, 0.061036866158246994, 0.024825671687722206, -0.04522723704576492, -0.12859024107456207, 0.012954172678291798, 0.10822907835245132, -0.06625290960073471, 0.005212347023189068, -0.06912928819656372, -0.05862756446003914, -0.04967100918292999, -0.011587926186621189, -0.1684054583311081, 0.029941504821181297, 0.044249799102544785, -0.026279427111148834, 0.02739408239722252, 0.0531010664999485, 0.027421364560723305, 0.036450646817684174, -0.010503287427127361, -0.05237514525651932, -0.06259091198444366, -0.027602996677160263, -0.118616983294487, -0.13214336335659027, -0.03940390795469284, -0.028949715197086334, 0.1290099173784256, -0.1940620243549347, 0.03753318265080452, -0.07303868979215622, 0.037365496158599854, -0.010941803455352783, -0.06214335560798645, 0.03764437884092331, 0.021107833832502365, 0.015372468158602715, -0.07334179431200027, 0.027745859697461128, 0.04368385300040245, 0.02009923942387104, 0.024413036182522774, -0.12313373386859894, -0.15692491829395294, 0.07920410484075546, 0.06538830697536469, -0.1693236380815506, 0.01620147004723549, -0.04908318445086479, -0.07055927813053131, -0.07464628666639328, -0.003914166241884232, 0.22426845133304596, 0.0002985134196933359, 0.12039528787136078, -0.09610434621572495, -0.060118693858385086, -0.013367334380745888, -0.0034638638608157635, 0.022142227739095688, 0.12137684971094131, 0.07519004493951797, -0.11047892272472382, 0.05380325764417648, 0.023122638463974, -0.0211652722209692, 0.1192149966955185, -0.005609814077615738, -0.11934846639633179, -0.006295698694884777, 0.05583928897976875, -0.02531968057155609, 0.0963490754365921, -0.16379931569099426, 0.014651468023657799, 0.012253714725375175, 0.03194866329431534, 0.03880727291107178, -0.14875300228595734, 0.019508834928274155, 0.05805003270506859, -0.031618136912584305, -0.00618921872228384, -0.031274233013391495, -0.044520989060401917, 0.06849496811628342, 0.02985103242099285, -0.009157671593129635, -0.0009051053202711046, -0.03916807472705841, -0.142979234457016, 0.21481390297412872, -0.05578784644603729, -0.1497415453195572, -0.07644417136907578, 0.10169108211994171, 0.07222113758325577, -0.00903222057968378, 0.0393604077398777, -0.07239066064357758, -0.05350600928068161, -0.09359486401081085, 0.10294747352600098, -0.06587734818458557, -0.025391835719347, -0.06675823777914047, -0.0020435114856809378, 0.013331183232367039, -0.12425951659679413, 0.03494809940457344, -0.03751262277364731, -0.0772065818309784, 0.017544720321893692, -0.032163672149181366, 0.06676620990037918, 0.1632215976715088, -0.0016390318050980568, 0.021455930545926094, -0.009605221450328827, 0.19839060306549072, -0.13287104666233063, 0.01600053533911705, 0.0717490017414093, 0.0090120118111372, 0.00524937966838479, 0.10275331884622574, -0.007745430804789066, -0.10299813747406006, 0.07265489548444748, 0.07073689997196198, -0.032154425978660583, -0.27928146719932556, 
-0.0020707075018435717, -0.024667274206876755, -0.04357852786779404, 0.12191113084554672, 0.03653064742684364, 0.032794978469610214, 0.04251517727971077, -0.02579788863658905, 0.005243991035968065, 0.029045427218079567, 0.05495145916938782, -0.02634429931640625, 0.009861760772764683, 0.08344806730747223, -0.05097409710288048, -0.015005591325461864, 0.04537738859653473, -0.004463511984795332, 0.25133344531059265, -0.04642318934202194, 0.08756943792104721, 0.09100668132305145, 0.09283377230167389, 0.006321187596768141, 0.05006474256515503, -0.02673148550093174, 0.024196108803153038, -0.00242623221129179, -0.0280614010989666, -0.07237647473812103, 0.055277056992053986, 0.0011849801521748304, 0.020770639181137085, -0.09199836850166321, -0.002580894622951746, 0.00567361805588007, 0.3332650661468506, 0.07408422231674194, -0.24744392931461334, -0.07580062001943588, 0.011216969229280949, -0.06347913295030594, -0.09797406941652298, 0.06071646511554718, 0.06312697380781174, -0.11851948499679565, 0.01160157099366188, -0.03836781904101372, 0.09212123602628708, -0.09386599808931351, -0.038639407604932785, 0.08577495813369751, 0.05639372020959854, -0.01794012077152729, 0.08541148155927658, -0.2941383421421051, 0.1672440469264984, -0.014176622964441776, 0.12536530196666718, -0.026763103902339935, 0.02635159343481064, -0.052723478525877, -0.00675577949732542, 0.14988893270492554, -0.0061452132649719715, 0.0007051927968859673, -0.06009770557284355, -0.08909229934215546, 0.02391395904123783, 0.044765327125787735, -0.07977834343910217, 0.09792386740446091, 0.007897747680544853, 0.04002921283245087, 0.016708122566342354, -0.10374446213245392, -0.1349191814661026, -0.09899084270000458, 0.011348060332238674, -0.10355354845523834, 0.048909835517406464, -0.05533019080758095, -0.04179416224360466, 0.020124338567256927, 0.151004359126091, -0.13816948235034943, -0.08164399862289429, -0.09749995917081833, 0.05637441948056221, 0.08658964186906815, -0.03512416034936905, -0.012727731838822365, 0.025065802037715912, 0.03455851972103119, 0.014448885805904865, -0.00025623684632591903, 0.08806129544973373, -0.05092550814151764, -0.1499810367822647, -0.05300748720765114, 0.1180846095085144, 0.1373506784439087, 0.05953262373805046, -0.033741094172000885, 0.025992682203650475, -0.009993528015911579, -0.07464498281478882, 0.020496243610978127, 0.016640305519104004, 0.025828618556261063, 0.013810640200972557, -0.03341718763113022, 0.015449227765202522, -0.08890596777200699, -0.04536363109946251, 0.07513577491044998, 0.12266969680786133, -0.04459860175848007, 0.061669401824474335, 0.14307568967342377, -0.11085325479507446, -0.17188328504562378, 0.052862826734781265, 0.09682147204875946, 0.06323880702257156, -0.06667735427618027, -0.22935445606708527, 0.05838941037654877, 0.09755408763885498, 0.007321988232433796, -0.026380158960819244, -0.3969513475894928, -0.12380999326705933, 0.10530734807252884, 0.07875711470842361, -0.06500083953142166, -0.10584662109613419, -0.015738075599074364, 0.04158726707100868, -0.0007939539500512183, 0.09862668067216873, -0.04214373975992203, 0.08593124896287918, 0.010623275302350521, -0.05657656490802765, 0.04203801229596138, -0.0654812753200531, 0.1117229238152504, 0.07740379869937897, 0.07456783205270767, -0.0464213490486145, 0.02018648386001587, 0.031263403594493866, -0.040030449628829956, 0.14361387491226196, 0.047710057348012924, 0.05261212959885597, -0.19816216826438904, -0.06754473596811295, -0.08032488822937012, -0.0022916579619050026, -0.07882161438465118, -0.06360441446304321, 
-0.05012253671884537, 0.09532103687524796, 0.053372930735349655, -0.024215368553996086, 0.006444776430726051, -0.05541404336690903, 0.0032073576003313065, 0.07238209992647171, 0.10922521352767944, 0.09131602942943573, -0.10112475603818893, 0.030770936980843544, 0.04107992723584175, 0.09287023544311523, -0.19002221524715424, -0.011814269237220287, 0.11868317425251007, 0.010379449464380741, 0.15088683366775513, 0.0041427006945014, -0.14581245183944702, -0.002811686834320426, 0.03672518953680992, -0.09830262511968613, -0.12526844441890717, -0.010556437075138092, -0.03426777571439743, -0.10541324317455292, -0.05273999646306038, 0.06399363279342651, -0.10480207949876785, -0.009471004828810692, -0.013102016411721706, 0.054462071508169174, -0.06410960108041763, 0.2159375101327896, 0.047685615718364716, 0.06589572131633759, -0.06788936257362366, 0.12837789952754974, 0.12118252366781235, -0.1131192147731781, 0.030295662581920624, 0.1606985628604889, -0.0837235301733017, -0.04808070510625839, 0.006833300925791264, 0.1326856017112732, 0.001262639183551073, -0.05700196698307991, -0.05356490612030029, -0.06584702432155609, 0.08007976412773132, 0.022625571116805077, 0.037743669003248215, 0.0467190220952034, -0.025756437331438065, 0.012531610205769539, -0.14785614609718323, 0.07815983891487122, 0.08569623529911041, 0.019575156271457672, -0.02854146622121334, 0.1581801176071167, 0.050323281437158585, 0.03747972846031189, -0.00534455943852663, -0.04432431235909462, -0.06373745948076248, 0.06257051229476929, -0.0167472492903471, -0.0170341394841671, -0.05410786345601082, -0.0003784826258197427, -0.010496692731976509, -0.00043804076267406344, 0.0006998823373578489, 0.02391071990132332, -0.049139417707920074, -0.037052225321531296, -0.024111730977892876, 0.05029720813035965, -0.08123711496591568, -0.01334563922137022, -0.0002877699153032154, -0.07878995686769485, 0.07886789739131927, 0.010357176885008812, 0.009975490160286427, -0.014742174185812473, -0.008806140162050724, 0.07168318331241608, -0.015815889462828636, 0.007691072300076485, -0.029036369174718857, -0.13017085194587708, 0.03293940797448158, -0.027743063867092133, -0.031873516738414764, -0.004164454992860556, 0.04557104781270027, -0.13869349658489227, 0.06789799779653549, -0.0253913477063179, -0.011205682530999184, -0.07829839736223221, 0.09180480986833572, 0.02826441265642643, 0.08039985597133636, 0.11161379516124725, -0.05556464195251465, 0.07003262639045715, -0.16112324595451355, -0.04717542603611946, 0.02536231465637684, 0.031117046251893044, -0.05822550132870674, -0.05029677599668503, 0.076165109872818, -0.05618744343519211, 0.06229979544878006, 0.05075427517294884, 0.02904600463807583, 0.04674592241644859, -0.10549502074718475, 0.005062668118625879, 0.055504269897937775, 0.06499270349740982, -0.0058500804007053375, -0.006341022904962301, 0.022036654874682426, 0.03329137712717056, -0.013814433477818966, 0.06703988462686539, 0.1495261788368225, 0.22796937823295593, 0.0751175731420517, 0.06695432960987091, -0.04090426117181778, -0.1134379431605339, -0.09204075485467911, 0.1627664417028427, -0.009255386888980865, 0.02539382502436638, -0.006939782295376062, 0.09973545372486115, 0.08184090256690979, -0.15962544083595276, 0.0659327507019043, -0.021185725927352905, -0.11028028279542923, -0.08463932573795319, -0.05684379115700722, -0.03786030411720276, -0.04627613723278046, -0.005870350636541843, -0.10370732098817825, 0.0195782333612442, 0.07552900165319443, 0.05243273824453354, -0.0331864170730114, 0.13103991746902466, 0.023475440219044685, 
-0.04809454083442688, 0.08622431010007858, 0.0008132390794344246, 0.07300397008657455, -0.07852459698915482, 0.008953412994742393, 0.036983076483011246, 0.0014428169233724475, 0.061384595930576324, -0.005717314314097166, -0.02092709019780159, 0.03785919025540352, 0.039410028606653214, -0.060159385204315186, 0.01523568108677864, 0.0265448447316885, 0.09299156069755554, 0.07996658235788345, 0.07127237319946289, -0.030892739072442055, -0.011658640578389168, 0.1987643539905548, -0.04041304811835289, -0.08942463994026184, -0.1752311885356903, 0.1847788691520691, 0.07004953175783157, 0.0050992113538086414, 0.05082808807492256, -0.10803062468767166, 0.013180752284824848, 0.20584559440612793, 0.12715379893779755, -0.03053310699760914, -0.03539435192942619, 0.027919188141822815, -0.0062088556587696075, 0.013802075758576393, 0.10674959421157837, 0.050536707043647766, 0.15594805777072906, -0.09031863510608673, 0.03586728498339653, -0.0668482780456543, -0.06661000102758408, 0.018944209441542625, 0.13911305367946625, 0.030345143750309944, -0.0029377092141658068, -0.0520603246986866, 0.10954666137695312, -0.003213572781533003, -0.20746538043022156, 0.0807536318898201, -0.0686815157532692, -0.13226747512817383, -0.02426697313785553, -0.014745955355465412, -0.018169431015849113, 0.04300988093018532, -0.00009463172318646684, -0.005812894552946091, 0.13334515690803528, 0.03370623290538788, -0.06316732615232468, -0.13595393300056458, 0.0937025174498558, -0.0031736039090901613, 0.15563571453094482, -0.009176104329526424, 0.07533062249422073, 0.08636052161455154, 0.023955421522259712, -0.11540446430444717, 0.07414212077856064, 0.055866219103336334, 0.035138629376888275, 0.04129201918840408, 0.12699824571609497, -0.028104448691010475, 0.13432808220386505, 0.044590212404727936, -0.10875328630208969, 0.026625879108905792, -0.1356058269739151, -0.04801061749458313, -0.18204091489315033, 0.033427733927965164, -0.03839724883437157, 0.13700519502162933, 0.2207607924938202, -0.0389707088470459, 0.010166004300117493, -0.058285605162382126, 0.03488299250602722, -0.018865585327148438, 0.15563175082206726, 0.009934104047715664, -0.17889155447483063, 0.016205858439207077, -0.07521519809961319, 0.04315650090575218, -0.22171668708324432, -0.03584441542625427, 0.03355550393462181, -0.08070825785398483, -0.014761368744075298, 0.1308322697877884, 0.030355311930179596, 0.06388464570045471, -0.05239075794816017, -0.0006081960746087134, -0.02253437228500843, 0.15160024166107178, -0.12711699306964874, -0.09892762452363968 ]
null
null
transformers
# legal_t5_small_summ_en model

Model for summarization of legal text written in English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis.

## Model description

legal_t5_small_summ_en is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for summarization of legal texts written in English.

### How to use

Here is how to use this model to summarize legal text written in English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_en"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_summ_en", do_lower_case=False, skip_special_tokens=True),
    device=0
)

en_text = "THE COMMISSION OF THE EUROPEAN COMMUNITIES, Having regard to the Treaty establishing the European Community, Having regard to Council Regulation (EC) No 1255/1999 of 17 May 1999 on the common organisation of the market in milk and milk products [1], and in particular Article 15 thereof, Whereas: (1) Article 7(1) of Commission Regulation (EC) No 2799/1999 [2] fixes the amount of aid for skimmed milk and skimmed-milk powder intended for animal feed taking into account the factors set out in Article 11(2) of Regulation (EC) No 1255/1999. In view of the developments in the market price of skimmed-milk powder, of the increase in the market prices for competing proteins, and of the reduction of the supply of skimmed-milk powder, the amount of aid should be reduced. (2) Regulation (EC) No 2799/1999 should therefore be amended accordingly. (3) The Management Committee for Milk and Milk Products has not delivered an opinion within the time-limit set by its chairman, HAS ADOPTED THIS REGULATION: Article 1 In Article 7 of Regulation (EC) No 2799/1999, paragraph 1 is replaced by the following: \"1. Aid is fixed at: (a) EUR 1,62 per 100 kg of skimmed milk with a protein content of not less than 35,6 % of the non-fatty dry extract; (b) EUR 1,42 per 100 kg of skimmed milk with a protein content of not less than 31,4 % but less than 35,6 % of the non-fatty dry extract; (c) EUR 20,00 per 100 kg of skimmed-milk powder with a protein content of not less than 35,6 % of the non-fatty dry extract; (d) EUR 17,64 per 100 kg of skimmed-milk powder with a protein content of not less than 31,4 % but less than 35,6 % of the non-fatty dry extract.\" Article 2 This Regulation shall enter into force on the day following its publication in the Official Journal of the European Union. This Regulation shall be binding in its entirety and directly applicable in all Member States. Done at Brussels, 19 April 2006. For the Commission Mariann Fischer Boel Member of the Commission [1] OJ L 160, 26.6.1999, p. 48. Regulation as last amended by Regulation (EC) No 1913/2005 (OJ L 307, 25.11.2005, p. 2). [2] OJ L 340, 31.12.1999, p. 3. Regulation as last amended by Regulation (EC) No 1194/2005 (OJ L 194, 26.7.2005, p. 7). 
-------------------------------------------------- "

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_summ_en model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 22 thousand texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60 million parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the summarization test dataset, it achieves the following results:

Test results:

| Model | Rouge1 | Rouge2 | Rouge Lsum |
|:-----:|:------:|:------:|:----------:|
| legal_t5_small_summ_en | 78.11 | 68.78 | 77.0 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
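The pipeline call above can also be unrolled into explicit tokenize/generate/decode steps; a minimal sketch (the shortened input string and the beam-search settings are illustrative assumptions; the card itself only specifies max_length=512):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_summ_en")
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_en")

# Shortened placeholder; in practice pass the full regulation text from the example above.
en_text = "THE COMMISSION OF THE EUROPEAN COMMUNITIES, Having regard to the Treaty establishing the European Community ..."

inputs = tokenizer(en_text, return_tensors="pt", truncation=True, max_length=512)
# num_beams=4 and early_stopping=True are illustrative choices, not documented values.
summary_ids = model.generate(**inputs, max_length=512, num_beams=4, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```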
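The inverse square root learning rate schedule mentioned under Training procedure has a standard closed form; a minimal sketch, where the warmup length and scale factor are assumptions (the card does not state them):

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, scale: float = 0.01) -> float:
    # Constant at scale / sqrt(warmup_steps) for the first warmup_steps updates,
    # then decaying proportionally to 1 / sqrt(step).
    # warmup_steps and scale are illustrative assumptions, not values from the card.
    return scale / max(step, warmup_steps) ** 0.5

# Example: learning rate at a few points over the 250K training steps.
for s in (1, 10_000, 100_000, 250_000):
    print(s, inverse_sqrt_lr(s))
```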
{"language": "English", "tags": ["summarization English model"], "datasets": ["jrc-acquis"], "widget": [{"text": "THE COMMISSION OF THE EUROPEAN COMMUNITIES, Having regard to the Treaty establishing the European Community, Having regard to Council Regulation (EC) No 1255/1999 of 17 May 1999 on the common organisation of the market in milk and milk products [1], and in particular Article 15 thereof, Whereas: (1) Article 7(1) of Commission Regulation (EC) No 2799/1999 [2] fixes the amount of aid for skimmed milk and skimmed-milk powder intended for animal feed taking into account the factors set out in Article 11(2) of Regulation (EC) No 1255/1999. In view of the developments in the market price of skimmed-milk powder, of the increase in the market prices for competing proteins, and of the reduction of the supply of skimmed-milk powder, the amount of aid should be reduced. (2) Regulation (EC) No 2799/1999 should therefore be amended accordingly. (3) The Management Committee for Milk and Milk Products has not delivered an opinion within the time-limit set by its chairman, HAS ADOPTED THIS REGULATION: Article 1 In Article 7 of Regulation (EC) No 2799/1999, paragraph 1 is replaced by the following: \"1. Aid is fixed at: (a) EUR 1,62 per 100 kg of skimmed milk with a protein content of not less than 35,6 % of the non-fatty dry extract; (b) EUR 1,42 per 100 kg of skimmed milk with a protein content of not less than 31,4 % but less than 35,6 % of the non-fatty dry extract; (c) EUR 20,00 per 100 kg of skimmed-milk powder with a protein content of not less than 35,6 % of the non-fatty dry extract; (d) EUR 17,64 per 100 kg of skimmed-milk powder with a protein content of not less than 31,4 % but less than 35,6 % of the non-fatty dry extract.\" Article 2 This Regulation shall enter into force on the day following its publication in the Official Journal of the European Union. This Regulation shall be binding in its entirety and directly applicable in all Member States. Done at Brussels, 19 April 2006. For the Commission Mariann Fischer Boel Member of the Commission [1] OJ L 160, 26.6.1999, p. 48. Regulation as last amended by Regulation (EC) No 1913/2005 (OJ L 307, 25.11.2005, p. 2). [2] OJ L 340, 31.12.1999, p. 3. Regulation as last amended by Regulation (EC) No 1194/2005 (OJ L 194, 26.7.2005, p. 7)."}]}
text2text-generation
SEBIS/legal_t5_small_summ_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization English model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_en model
================================

Model for summarization of legal text written in English. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis.

Model description
-----------------

legal\_t5\_small\_summ\_en is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for summarization of legal texts written in English.

### How to use

Here is how to use this model to summarize legal text written in English in PyTorch:

Training data
-------------

The legal\_t5\_small\_summ\_en model was trained on the JRC-ACQUIS dataset, consisting of 22 thousand texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60 million parameters and uses the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the summarization test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_en model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_en model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 66, 153, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization English model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_en model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09747159481048584, 0.10345157235860825, -0.002616576151922345, 0.061104439198970795, 0.10205326229333878, 0.008708780631422997, 0.07701405137777328, 0.10791750252246857, -0.0726630911231041, 0.05758528783917427, 0.053818389773368835, 0.030006803572177887, 0.08487120270729065, 0.10452990978956223, 0.05287274345755577, -0.23106826841831207, 0.0110228406265378, -0.013224313035607338, -0.01608709618449211, 0.15142829716205597, 0.11930596828460693, -0.09064138680696487, 0.02818179316818714, -0.02047097124159336, -0.13673153519630432, -0.004923319909721613, -0.06767769902944565, -0.05585718899965286, 0.060849886387586594, 0.03399984911084175, 0.10368945449590683, 0.016734568402171135, 0.0709761530160904, -0.18952499330043793, 0.0031903425697237253, 0.08895441889762878, 0.04421886429190636, 0.049399759620428085, 0.07001674175262451, -0.022004136815667152, 0.16811487078666687, -0.012329291552305222, 0.07073040306568146, 0.04118704795837402, -0.12671130895614624, -0.1262313276529312, -0.0559818409383297, 0.08325283974409103, 0.15897861123085022, 0.12951910495758057, -0.060294535011053085, 0.07876711338758469, -0.09253658354282379, 0.058573007583618164, 0.04719661921262741, -0.26041048765182495, -0.0671585351228714, 0.030047787353396416, 0.032184649258852005, 0.0660749301314354, -0.05316067859530449, -0.051191382110118866, 0.04000323638319969, 0.02686605043709278, -0.005200458224862814, -0.013262450695037842, 0.02215522713959217, -0.000025201878088410012, -0.16915683448314667, -0.09862248599529266, 0.17608441412448883, 0.017842138186097145, -0.08712224662303925, -0.0725046843290329, -0.02050306275486946, -0.16079123318195343, 0.03711670637130737, -0.049861542880535126, 0.0321001261472702, -0.0034096448216587305, 0.014583596028387547, 0.0014004746917635202, -0.12791922688484192, -0.1136055663228035, 0.035293545573949814, 0.10180164873600006, 0.08578122407197952, -0.00910803209990263, -0.00035953219048678875, 0.1524893045425415, 0.008455782197415829, -0.07775938510894775, -0.03433344513177872, -0.002267196774482727, -0.14134490489959717, -0.04023192077875137, -0.0388561487197876, -0.13704080879688263, -0.058136578649282455, 0.10601379722356796, -0.009240384213626385, 0.05906815454363823, 0.05886758491396904, 0.049022044986486435, 0.019783904775977135, 0.13563460111618042, -0.061909813433885574, -0.015568727627396584, -0.05744529888033867, 0.04804808646440506, -0.07720443606376648, 0.011509816162288189, -0.00650077685713768, 0.018456514924764633, 0.05717454478144646, 0.04962201789021492, -0.0873868390917778, 0.008086959831416607, -0.06024978682398796, -0.035341255366802216, -0.0020697698928415775, -0.11833257228136063, -0.042614541947841644, -0.013004648499190807, -0.08635228872299194, -0.07149703055620193, 0.10405315458774567, -0.01425002608448267, -0.1118473932147026, 0.07354482263326645, -0.03367732837796211, -0.010363983921706676, -0.11323761194944382, -0.07946083694696426, -0.018781565129756927, -0.07520843297243118, -0.04586348310112953, -0.08065598458051682, -0.159809872508049, -0.1063295379281044, 0.08671726286411285, -0.041637711226940155, -0.044284187257289886, -0.055343061685562134, -0.004173469729721546, -0.007584203500300646, -0.02575276419520378, 0.0972166582942009, -0.010927494615316391, 0.0886794850230217, 0.021819140762090683, 0.04874951392412186, 0.1283978670835495, 0.06573979556560516, -0.09534355998039246, 0.030428389087319374, -0.07723841816186905, 0.135511115193367, -0.007898924872279167, 0.02584652043879032, -0.1593407243490219, -0.06407332420349121, -0.07390888780355453, 
0.049501512199640274, 0.08815418183803558, 0.09511332958936691, -0.14886926114559174, -0.0013538949424400926, 0.17679832875728607, -0.10836980491876602, -0.05910436436533928, 0.09645824134349823, -0.03918890282511711, 0.1670808345079422, 0.06801768392324448, 0.16083797812461853, 0.07339006662368774, -0.051151812076568604, -0.004027524031698704, -0.043666064739227295, 0.0020637395791709423, 0.017703915014863014, 0.09034698456525803, -0.04080675542354584, -0.1097480058670044, -0.025663312524557114, -0.07131105661392212, 0.014701120555400848, -0.07571561634540558, -0.06777373701334, 0.013915584422647953, -0.06443845480680466, -0.05195719748735428, 0.055274598300457, 0.052231479436159134, -0.02924584411084652, -0.12243204563856125, 0.0022179612424224615, 0.09234308451414108, -0.056748464703559875, 0.02176997810602188, -0.0848529189825058, -0.057980217039585114, -0.10062689334154129, -0.01160843763500452, -0.1956528276205063, 0.08336132764816284, 0.03271394222974777, -0.046621859073638916, 0.0622505247592926, 0.030536441132426262, 0.023666687309741974, 0.05518756061792374, 0.004733223933726549, -0.05059970170259476, -0.04401831328868866, -0.02487577497959137, -0.12675166130065918, -0.12602952122688293, -0.04635971412062645, -0.025236958637833595, 0.120980404317379, -0.16558466851711273, 0.04324395954608917, -0.06170252338051796, 0.03675471246242523, -0.012763341888785362, -0.0654599592089653, 0.030888456851243973, 0.03589320182800293, 0.008860306814312935, -0.05762922018766403, 0.050631094723939896, 0.04138142988085747, -0.01301934290677309, 0.037969920784235, -0.11229538172483444, -0.14261673390865326, 0.07693653553724289, 0.04590538144111633, -0.1722460687160492, 0.015670055523514748, -0.05364515632390976, -0.06191439926624298, -0.05209216848015785, 0.00343973096460104, 0.24152922630310059, 0.0028245490975677967, 0.1383410096168518, -0.08827612549066544, -0.06575649231672287, 0.0037187279667705297, -0.02575344406068325, 0.026895960792899132, 0.12221527099609375, 0.07347200810909271, -0.12549588084220886, 0.03649784252047539, 0.028193006291985512, -0.032041024416685104, 0.13597555458545685, 0.00935450755059719, -0.10472305864095688, -0.007123179268091917, 0.0713849663734436, -0.010565023869276047, 0.08068163692951202, -0.18288546800613403, 0.0040674698539078236, 0.00834403932094574, 0.04273654893040657, 0.04574345052242279, -0.16416865587234497, 0.02431187592446804, 0.07376162707805634, -0.024284979328513145, 0.002521555172279477, -0.029360713437199593, -0.06339111924171448, 0.07260448485612869, 0.029462791979312897, -0.0002433309709886089, -0.00022390417871065438, -0.03891633823513985, -0.146606907248497, 0.21076828241348267, -0.050695162266492844, -0.15257862210273743, -0.10395161807537079, 0.08745381981134415, 0.06935378164052963, -0.006106243934482336, 0.045704521238803864, -0.09126082807779312, -0.048206716775894165, -0.10349470376968384, 0.089812733232975, -0.0579976923763752, -0.007922305725514889, -0.05763986334204674, -0.0032558299135416746, 0.03046058863401413, -0.12021257728338242, 0.0315205417573452, -0.028857465833425522, -0.09116734564304352, 0.009742067195475101, -0.04195278510451317, 0.06856807321310043, 0.15866714715957642, -0.02396324649453163, 0.042975932359695435, 0.010067597962915897, 0.18684843182563782, -0.11061933636665344, 0.00937806349247694, 0.07766635715961456, 0.007207974791526794, 0.0025775551330298185, 0.09505046904087067, -0.0033081609290093184, -0.08682289719581604, 0.07252198457717896, 0.05106239393353462, -0.046363864094018936, -0.2781860828399658, 
-0.007594475522637367, -0.015591922216117382, -0.03507806733250618, 0.11826710402965546, 0.05929840728640556, 0.012911232188344002, 0.05746025964617729, -0.011834734119474888, 0.0009580422192811966, 0.019015900790691376, 0.07499691843986511, -0.05593230202794075, 0.00652707926928997, 0.06711883097887039, -0.05395679175853729, -0.013896044343709946, 0.035650480538606644, -0.006410621572285891, 0.24897821247577667, -0.024144509807229042, 0.09820176661014557, 0.11157747358083725, 0.08761600404977798, 0.02055925317108631, 0.08141526579856873, -0.04337942972779274, 0.019137678667902946, 0.007557739038020372, -0.029985932633280754, -0.07644874602556229, 0.06260780990123749, -0.02616490237414837, 0.04462350904941559, -0.1124827191233635, -0.013640760444104671, 0.01472611352801323, 0.3300127685070038, 0.05686923861503601, -0.2440195530653, -0.08037250488996506, -0.0007918130722828209, -0.08265677839517593, -0.0944037064909935, 0.052144937217235565, 0.08015913516283035, -0.1137804463505745, -0.005743334069848061, -0.03907549008727074, 0.09681379795074463, -0.08896808326244354, -0.04577381908893585, 0.09307975322008133, 0.018615925684571266, -0.011073368601500988, 0.09227743744850159, -0.2707692086696625, 0.1754271537065506, -0.013744932599365711, 0.12046422064304352, -0.009931514039635658, 0.02216017246246338, -0.058947306126356125, 0.015565856359899044, 0.1612899750471115, 0.003436097176745534, 0.05826411023736, -0.09937284141778946, -0.08161778748035431, 0.015627983957529068, 0.058610014617443085, -0.09319465607404709, 0.10026200860738754, 0.017773378640413284, 0.021148033440113068, -0.0012976785656064749, -0.058677759021520615, -0.14203737676143646, -0.1056288629770279, -0.002458008471876383, -0.0777982845902443, 0.07955566048622131, -0.0568547286093235, -0.043729934841394424, 0.010674898512661457, 0.17751388251781464, -0.15853431820869446, -0.06884723901748657, -0.08625766634941101, 0.04890092462301254, 0.076400987803936, -0.025540506467223167, -0.01714431494474411, 0.014647018164396286, 0.02437940426170826, 0.003715231316164136, 0.004539173562079668, 0.08574657887220383, -0.05957479402422905, -0.1510210782289505, -0.060190599411726, 0.11897589266300201, 0.1330043077468872, 0.05746251344680786, -0.029094204306602478, 0.010751333087682724, -0.022279076278209686, -0.08527164906263351, 0.018186843022704124, 0.041540488600730896, 0.042743150144815445, 0.01809675805270672, -0.04132237285375595, 0.020502012223005295, -0.08677291870117188, -0.0510902963578701, 0.10078281909227371, 0.1403452455997467, -0.042755261063575745, 0.06213730573654175, 0.13843844830989838, -0.12221445143222809, -0.14500172436237335, 0.03849676623940468, 0.10011589527130127, 0.06914274394512177, -0.06528204679489136, -0.1930191069841385, 0.07133346796035767, 0.08156812191009521, 0.00882054679095745, -0.052418943494558334, -0.3635512590408325, -0.13488377630710602, 0.08886510878801346, 0.09741847217082977, -0.04806813225150108, -0.11277994513511658, -0.019849609583616257, 0.028081266209483147, -0.02727634087204933, 0.11199595034122467, -0.026077821850776672, 0.09089039266109467, 0.01319288183003664, -0.06500253081321716, 0.03699030727148056, -0.05619010329246521, 0.1135484054684639, 0.08577287197113037, 0.05955275148153305, -0.03990798443555832, 0.030459174886345863, 0.018158184364438057, -0.029845917597413063, 0.13834497332572937, 0.039446040987968445, 0.03763469681143761, -0.21442997455596924, -0.06890641897916794, -0.08000073581933975, -0.009399637579917908, -0.06993572413921356, -0.05440953001379967, -0.024850603193044662, 
0.094155453145504, 0.053835030645132065, -0.0047713820822536945, 0.017248107120394707, -0.08546673506498337, 0.03668952360749245, 0.10341886430978775, 0.12117186933755875, 0.06643439829349518, -0.0863407701253891, 0.03267331048846245, 0.03463713079690933, 0.09201239049434662, -0.1893271654844284, -0.016086261719465256, 0.12608779966831207, 0.004849146120250225, 0.15781943500041962, 0.015138286165893078, -0.1434141993522644, -0.0033211857080459595, 0.0506904162466526, -0.10403120517730713, -0.0990186259150505, -0.018488189205527306, 0.011258224956691265, -0.0700930804014206, -0.03160163760185242, 0.053949035704135895, -0.12287317961454391, -0.012299423106014729, -0.005810006521642208, 0.02836737595498562, -0.08755134791135788, 0.20444652438163757, 0.05261490121483803, 0.07483717799186707, -0.05335991084575653, 0.12154576927423477, 0.11369483917951584, -0.13293078541755676, 0.015176724642515182, 0.1535549908876419, -0.09201514720916748, -0.055267155170440674, -0.014496393501758575, 0.11819712817668915, -0.0020392087753862143, -0.07194071263074875, -0.05588562414050102, -0.05374625325202942, 0.06482966989278793, 0.015669099986553192, 0.05002957582473755, 0.040105752646923065, -0.0305501539260149, 0.0037384426686912775, -0.1423526555299759, 0.07463505119085312, 0.08413128554821014, 0.01791437901556492, -0.04752836003899574, 0.19023559987545013, 0.0487976148724556, 0.05230071768164635, -0.000005011117536923848, -0.05298526957631111, -0.04326264187693596, 0.06404921412467957, -0.015650078654289246, -0.010644297115504742, -0.0680832490324974, -0.0137220099568367, -0.0006648491253145039, 0.0016928893746808171, 0.0061975205317139626, 0.02264573983848095, -0.06006374582648277, -0.03182744234800339, -0.025086333975195885, 0.0401274673640728, -0.044501665979623795, -0.0005487086018547416, -0.005720101296901703, -0.06958308070898056, 0.07966259866952896, 0.0015319261001423001, -0.008121021091938019, -0.020652014762163162, -0.029024163261055946, 0.09834853559732437, -0.017725147306919098, -0.01534196175634861, -0.025669537484645844, -0.10937155038118362, 0.03713336959481239, -0.024442346766591072, -0.045367904007434845, -0.008360026404261589, 0.05193096026778221, -0.1442500799894333, 0.05151920020580292, -0.02138402871787548, 0.01127589400857687, -0.08841732889413834, 0.0813986212015152, 0.020939087495207787, 0.09651856869459152, 0.10053730756044388, -0.05959466099739075, 0.06689513474702835, -0.15697123110294342, -0.04033882915973663, 0.030817555263638496, 0.04916369915008545, -0.06143592298030853, -0.060566067695617676, 0.052600495517253876, -0.055073708295822144, 0.062004271894693375, 0.04563545808196068, 0.03619285300374031, 0.04107600823044777, -0.12140724062919617, 0.0006332543562166393, 0.037571679800748825, 0.06168461963534355, -0.014607908204197884, 0.0031490297988057137, 0.015645194798707962, 0.02715773694217205, -0.02165062353014946, 0.06801286339759827, 0.13345076143741608, 0.19658263027668, 0.051242027431726456, 0.07471716403961182, -0.04877972602844238, -0.08644557744264603, -0.08183472603559494, 0.15492132306098938, 0.0045041111297905445, 0.03892077878117561, -0.02935219742357731, 0.09224254637956619, 0.09364531934261322, -0.15429724752902985, 0.05845686048269272, -0.007654075976461172, -0.09155640751123428, -0.0856722891330719, -0.10015620291233063, -0.02992018684744835, -0.0598001666367054, -0.005452244076877832, -0.09182612597942352, 0.021387334913015366, 0.07019086182117462, 0.05864626541733742, -0.01666879653930664, 0.13389922678470612, 0.009941029362380505, -0.040932122617959976, 
0.09044794738292694, -0.0230429507791996, 0.08321603387594223, -0.0785260796546936, 0.014883394353091717, 0.03756484389305115, -0.02149309776723385, 0.0564252994954586, 0.012000803835690022, -0.024378303438425064, 0.03209787234663963, 0.038457904011011124, -0.0679406151175499, 0.006735933944582939, 0.032870110124349594, 0.11507683247327805, 0.06748014688491821, 0.06124177202582359, -0.04098081961274147, -0.019161177799105644, 0.2037026435136795, -0.0460967943072319, -0.07251819968223572, -0.15993915498256683, 0.2028692364692688, 0.08667439967393875, 0.003912712913006544, 0.04261406883597374, -0.11086809635162354, 0.022655563428997993, 0.18569381535053253, 0.11165136098861694, -0.008426654152572155, -0.028107494115829468, 0.03274933993816376, -0.007878211326897144, 0.02024618349969387, 0.099721260368824, 0.059771571308374405, 0.17409950494766235, -0.1020522341132164, 0.058262698352336884, -0.0704398900270462, -0.06223795562982559, 0.030503995716571808, 0.14571796357631683, 0.027029240503907204, 0.002620234852656722, -0.05428965762257576, 0.11168340593576431, -0.02564796805381775, -0.2111632227897644, 0.10201714932918549, -0.07072555273771286, -0.12853023409843445, -0.023491105064749718, 0.010056196711957455, -0.010607537813484669, 0.05097121372818947, -0.021585941314697266, -0.012718410231173038, 0.09419582784175873, 0.036385249346494675, -0.0648711696267128, -0.12992416322231293, 0.0876324325799942, -0.016115082427859306, 0.1445057988166809, -0.0014525860315188766, 0.08775629848241806, 0.07867538928985596, 0.020657550543546677, -0.11781930178403854, 0.08252213895320892, 0.02655547857284546, 0.01563313417136669, 0.04399523884057999, 0.155924990773201, -0.013798641972243786, 0.12877757847309113, 0.029037442058324814, -0.1308923065662384, 0.0396265871822834, -0.1441156417131424, -0.04604969918727875, -0.18603624403476715, 0.03089897334575653, -0.041787806898355484, 0.1515372097492218, 0.21287083625793457, -0.05471150204539299, 0.005331564228981733, -0.07406032830476761, 0.03827473148703575, -0.010690884664654732, 0.15211912989616394, 0.010045202448964119, -0.18477153778076172, 0.022336430847644806, -0.08834698051214218, 0.0428365021944046, -0.23551395535469055, -0.038531508296728134, 0.022849099710583687, -0.08590207993984222, -0.01979946717619896, 0.11919976770877838, 0.05854104086756706, 0.08459898084402084, -0.05532783269882202, -0.023203939199447632, -0.007926000282168388, 0.15547963976860046, -0.13407418131828308, -0.1106070801615715 ]
null
null
transformers
# legal_t5_small_summ_es model Model for Summarization of legal text written in Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis. ## Model description legal_t5_small_summ_es is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for summarization of legal texts written in Spanish. ### How to use Here is how to use this model to summarize legal text written in Spanish in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_es"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_summ_es", do_lower_case=False, skip_special_tokens=True), device=0 ) es_text = "[notificada con el número C(2006) 166] (El texto en lengua portuguesa es el único auténtico) (2006/78/CE) LA COMISIÓN DE LAS COMUNIDADES EUROPEAS, Visto el Tratado constitutivo de la Comunidad Europea, Vista la Decisión 90/424/CEE del Consejo, de 26 de junio de 1990, relativa a determinados gastos en el sector veterinario [1], y, en particular, su artículo 3, apartado 2 bis, Considerando lo siguiente: (1) El 24 de noviembre de 2004 se declararon brotes de fiebre catarral ovina en Portugal. La aparición de esta enfermedad puede representar un grave riesgo para la cabaña ganadera de la Comunidad. (2) Para atajar la propagación de la enfermedad en el plazo más breve, la Comunidad debe participar en los gastos subvencionables que suponen para Portugal la adopción de medidas de urgencia contra la enfermedad, en las condiciones previstas en la Decisión 90/424/CEE. Por ello, el 15 de septiembre de 2005 se adoptó la Decisión 2005/660/CE de la Comisión relativa a una ayuda financiera de la Comunidad para medidas de urgencia contra la fiebre catarral ovina adoptadas en Portugal en 2004 y 2005 [2]. (3) La Comisión ha adoptado varias decisiones para delimitar las zonas de protección y vigilancia y fijar las condiciones que deben cumplir los animales que vayan a salir de esas zonas; la última de ellas es la Decisión 2005/393/CE, de 23 de mayo de 2005, sobre las zonas de protección y vigilancia en relación con la fiebre catarral ovina y las condiciones que se aplican a los traslados de animales desde estas zonas o a través de ellas [3]. (4) Desde el otoño de 2004, la excepcional escasez de lluvias en Portugal ha afectado gravemente al suministro de forraje y, en consecuencia, a las posibilidades de alimentación animal, lo que ha conllevado costes adicionales para los ganaderos. La situación tiene consecuencias particulares en Portugal, pues las explotaciones especializadas en reproducción de bovinos y de ovinos están ubicadas en las zonas afectadas por las restricciones aplicadas a los traslados de animales, mientras que las especializadas en engorde, que constituyen la salida lógica de los animales criados en aquéllas, están localizadas fuera de dichas zonas. 
(5) Portugal, en colaboración con España, puso en marcha otras medidas para controlar la epidemia, como la realización de estudios epidemiológicos y la aplicación de medidas de vigilancia de la enfermedad, incluidas las pruebas de laboratorio para el control serológico y virológico en el marco de las pruebas realizadas a los animales antes de su traslado y en el de la vigilancia entomológica. (6) Portugal y España han presentado pruebas de su cooperación para evitar la propagación de la enfermedad tomando medidas de vigilancia de la misma. (7) De conformidad con el artículo 3, apartado 2, del Reglamento (CE) no 1258/1999 del Consejo, de 17 de mayo de 1999, sobre la financiación de la política agrícola común [4], las medidas veterinarias y fitosanitarias ejecutadas según las normas comunitarias son financiadas por la sección Garantía del Fondo Europeo de Orientación y de Garantía Agrícola. El control financiero de estas acciones debe efectuarse de conformidad con lo dispuesto en los artículos 8 y 9 de dicho Reglamento. (8) El pago de la contribución financiera de la Comunidad se supedita a la realización efectiva de las acciones programadas y a la presentación por parte de las autoridades de toda la información necesaria en los plazos establecidos. (9) El 25 de febrero de 2005, Portugal presentó un primer cálculo de los costes de las demás medidas de urgencia, como las de vigilancia epidemiológica, tomadas para luchar contra la enfermedad. El importe estimado de las medidas de vigilancia epidemiológica se eleva a 4303336 EUR. (10) A la espera de que se efectúen los controles in situ de la Comisión, procede fijar desde ahora el importe de un primer pago de la ayuda financiera de la Comunidad. Este primer pago ha de ser igual al 50 % de la contribución de la Comunidad, establecida sobre la base del gasto subvencionable calculado para las medidas de vigilancia epidemiológica. Procede asimismo determinar los importes máximos que se reembolsarán en concepto de pruebas realizadas y de trampas utilizadas en el marco de dichas medidas. (11) Las autoridades portuguesas han cumplido íntegramente sus obligaciones técnicas y administrativas relacionadas con las medidas previstas en el artículo 3 de la Decisión 90/424/CEE. (12) Las medidas previstas en la presente Decisión se ajustan al dictamen del Comité permanente de la cadena alimentaria y de sanidad animal. HA ADOPTADO LA PRESENTE DECISIÓN: Artículo 1 Concesión de una ayuda financiera de la Comunidad a Portugal 1. En el marco de las medidas de urgencia contra la fiebre catarral ovina adoptadas en Portugal en 2004 y 2005, Portugal tendrá derecho a una contribución comunitaria del 50 % de los importes desembolsados en concepto de pruebas de laboratorio para la vigilancia serológica y virológica, así como en concepto de vigilancia entomológica, incluida la adquisición de trampas. 2. El importe máximo de los gastos que se reembolsarán a Portugal en concepto de las pruebas y las trampas mencionadas en el apartado 1 no excederá de: a) vigilancia serológica, prueba ELISA: 2,5 EUR por prueba; b) vigilancia virológica, reacción en cadena de la polimerasa retrotranscriptásica (RT.PCR): 15 EUR por prueba; c) vigilancia entomológica, trampa: 160 EUR por trampa. 3. El impuesto sobre el valor añadido se excluirá de la participación financiera de la Comunidad. 
Artículo 2 Modalidades de pago A reserva del resultado de los controles in situ llevados a cabo de conformidad con el artículo 9, apartado 1, de la Decisión 90/424/CEE, se efectuará un primer pago de 600000 EUR como parte de la ayuda financiera de la Comunidad prevista en el artículo 1. El pago se llevará a cabo previa presentación por parte de Portugal de justificantes de las pruebas de laboratorio y de la adquisición de las trampas mencionadas en el artículo 1, apartado 1. Artículo 3 Condiciones de pago y documentación justificativa 1. La ayuda financiera de la Comunidad contemplada en el artículo 1 se pagará atendiendo a los siguientes elementos: a) una solicitud que contenga los datos especificados en el anexo, presentada en el plazo establecido en el apartado 2 del presente artículo; b) la documentación justificativa mencionada en el artículo 2, que incluirá un informe epidemiológico y un informe financiero; c) el resultado de cualquiera de los controles in situ llevados a cabo de conformidad con el artículo 9, apartado 1, de la Decisión 90/424/CEE. Los documentos mencionados en la letra b) deberán estar disponibles para los controles in situ mencionados en la letra c). 2. La solicitud mencionada en el apartado 1, letra a), se presentará en formato electrónico en un plazo de 60 días naturales a partir de la fecha de notificación de la presente Decisión. Si no se respeta este plazo, la ayuda financiera comunitaria se reducirá un 25 % por cada mes de retraso. Artículo 4 Destinatario El destinatario de la presente Decisión es la República Portuguesa. Hecho en Bruselas, el 31 de enero de 2006. Por la Comisión Markos Kyprianou Miembro de la Comisión [1] DO L 224 de 18.8.1990, p. 19. Decisión modificada en último lugar por el Reglamento (CE) no 806/2003 (DO L 122 de 16.5.2003, p. 1). [2] DO L 244 de 20.9.2005, p. 28. [3] DO L 130 de 24.5.2005, p. 22. Decisión modificada en último lugar por la Decisión 2005/828/CE (DO L 311 de 26.11.2005, p. 37). [4] DO L 160 de 26.6.1999, p. 103. -------------------------------------------------- ANEXO Datos mencionados en el artículo 3, apartado 1, letra a) Gastos | Naturaleza de los costes | Número | Importe (sin IVA) | Pruebas ELISA | | | Pruebas RT.PCR | | | Otras pruebas virológicas | | | Trampas | | | Total | | -------------------------------------------------- " pipeline([es_text], max_length=512) ``` ## Training data The legal_t5_small_summ_es model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model. ### Pretraining ## Evaluation results When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : | Model | Rouge1 | Rouge2 | Rouge Lsum | |:-----:|:-----:|:-----:|:-----:| | legal_t5_small_summ_es | 80.23 | 70.16 | 78.69 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
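To put the Rouge1/Rouge2/RougeLsum table above into practice, here is a hedged sketch of how such scores can be computed with the Hugging Face `evaluate` library. The card does not document the authors' evaluation script or ROUGE configuration, so the library defaults and the placeholder strings below are assumptions, not the original setup.

```python
# Hedged sketch of computing ROUGE scores like those in the table above.
# Requires: pip install evaluate rouge_score
# The prediction/reference strings are placeholders, and the authors' exact
# ROUGE settings (stemming, tokenization) are not documented in the card.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["Ayuda financiera de la Comunidad a Portugal ..."]  # model outputs (placeholder)
references = ["Decisión sobre medidas de urgencia contra la fiebre catarral ovina ..."]  # gold summaries (placeholder)

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # rouge1, rouge2, rougeL, rougeLsum (F-measures by default)
```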
{"language": "Spanish", "tags": ["summarization Spanish model"], "datasets": ["jrc-acquis"], "widget": [{"text": "[notificada con el n\u00famero C(2006) 166] (El texto en lengua portuguesa es el \u00fanico aut\u00e9ntico) (2006/78/CE) LA COMISI\u00d3N DE LAS COMUNIDADES EUROPEAS, Visto el Tratado constitutivo de la Comunidad Europea, Vista la Decisi\u00f3n 90/424/CEE del Consejo, de 26 de junio de 1990, relativa a determinados gastos en el sector veterinario [1], y, en particular, su art\u00edculo 3, apartado 2 bis, Considerando lo siguiente: (1) El 24 de noviembre de 2004 se declararon brotes de fiebre catarral ovina en Portugal. La aparici\u00f3n de esta enfermedad puede representar un grave riesgo para la caba\u00f1a ganadera de la Comunidad. (2) Para atajar la propagaci\u00f3n de la enfermedad en el plazo m\u00e1s breve, la Comunidad debe participar en los gastos subvencionables que suponen para Portugal la adopci\u00f3n de medidas de urgencia contra la enfermedad, en las condiciones previstas en la Decisi\u00f3n 90/424/CEE. Por ello, el 15 de septiembre de 2005 se adopt\u00f3 la Decisi\u00f3n 2005/660/CE de la Comisi\u00f3n relativa a una ayuda financiera de la Comunidad para medidas de urgencia contra la fiebre catarral ovina adoptadas en Portugal en 2004 y 2005 [2]. (3) La Comisi\u00f3n ha adoptado varias decisiones para delimitar las zonas de protecci\u00f3n y vigilancia y fijar las condiciones que deben cumplir los animales que vayan a salir de esas zonas; la \u00faltima de ellas es la Decisi\u00f3n 2005/393/CE, de 23 de mayo de 2005, sobre las zonas de protecci\u00f3n y vigilancia en relaci\u00f3n con la fiebre catarral ovina y las condiciones que se aplican a los traslados de animales desde estas zonas o a trav\u00e9s de ellas [3]. (4) Desde el oto\u00f1o de 2004, la excepcional escasez de lluvias en Portugal ha afectado gravemente al suministro de forraje y, en consecuencia, a las posibilidades de alimentaci\u00f3n animal, lo que ha conllevado costes adicionales para los ganaderos. La situaci\u00f3n tiene consecuencias particulares en Portugal, pues las explotaciones especializadas en reproducci\u00f3n de bovinos y de ovinos est\u00e1n ubicadas en las zonas afectadas por las restricciones aplicadas a los traslados de animales, mientras que las especializadas en engorde, que constituyen la salida l\u00f3gica de los animales criados en aqu\u00e9llas, est\u00e1n localizadas fuera de dichas zonas. (5) Portugal, en colaboraci\u00f3n con Espa\u00f1a, puso en marcha otras medidas para controlar la epidemia, como la realizaci\u00f3n de estudios epidemiol\u00f3gicos y la aplicaci\u00f3n de medidas de vigilancia de la enfermedad, incluidas las pruebas de laboratorio para el control serol\u00f3gico y virol\u00f3gico en el marco de las pruebas realizadas a los animales antes de su traslado y en el de la vigilancia entomol\u00f3gica. (6) Portugal y Espa\u00f1a han presentado pruebas de su cooperaci\u00f3n para evitar la propagaci\u00f3n de la enfermedad tomando medidas de vigilancia de la misma. (7) De conformidad con el art\u00edculo 3, apartado 2, del Reglamento (CE) no 1258/1999 del Consejo, de 17 de mayo de 1999, sobre la financiaci\u00f3n de la pol\u00edtica agr\u00edcola com\u00fan [4], las medidas veterinarias y fitosanitarias ejecutadas seg\u00fan las normas comunitarias son financiadas por la secci\u00f3n Garant\u00eda del Fondo Europeo de Orientaci\u00f3n y de Garant\u00eda Agr\u00edcola. 
El control financiero de estas acciones debe efectuarse de conformidad con lo dispuesto en los art\u00edculos 8 y 9 de dicho Reglamento. (8) El pago de la contribuci\u00f3n financiera de la Comunidad se supedita a la realizaci\u00f3n efectiva de las acciones programadas y a la presentaci\u00f3n por parte de las autoridades de toda la informaci\u00f3n necesaria en los plazos establecidos. (9) El 25 de febrero de 2005, Portugal present\u00f3 un primer c\u00e1lculo de los costes de las dem\u00e1s medidas de urgencia, como las de vigilancia epidemiol\u00f3gica, tomadas para luchar contra la enfermedad. El importe estimado de las medidas de vigilancia epidemiol\u00f3gica se eleva a 4303336 EUR. (10) A la espera de que se efect\u00faen los controles in situ de la Comisi\u00f3n, procede fijar desde ahora el importe de un primer pago de la ayuda financiera de la Comunidad. Este primer pago ha de ser igual al 50 % de la contribuci\u00f3n de la Comunidad, establecida sobre la base del gasto subvencionable calculado para las medidas de vigilancia epidemiol\u00f3gica. Procede asimismo determinar los importes m\u00e1ximos que se reembolsar\u00e1n en concepto de pruebas realizadas y de trampas utilizadas en el marco de dichas medidas. (11) Las autoridades portuguesas han cumplido \u00edntegramente sus obligaciones t\u00e9cnicas y administrativas relacionadas con las medidas previstas en el art\u00edculo 3 de la Decisi\u00f3n 90/424/CEE. (12) Las medidas previstas en la presente Decisi\u00f3n se ajustan al dictamen del Comit\u00e9 permanente de la cadena alimentaria y de sanidad animal. HA ADOPTADO LA PRESENTE DECISI\u00d3N: Art\u00edculo 1 Concesi\u00f3n de una ayuda financiera de la Comunidad a Portugal 1. En el marco de las medidas de urgencia contra la fiebre catarral ovina adoptadas en Portugal en 2004 y 2005, Portugal tendr\u00e1 derecho a una contribuci\u00f3n comunitaria del 50 % de los importes desembolsados en concepto de pruebas de laboratorio para la vigilancia serol\u00f3gica y virol\u00f3gica, as\u00ed como en concepto de vigilancia entomol\u00f3gica, incluida la adquisici\u00f3n de trampas. 2. El importe m\u00e1ximo de los gastos que se reembolsar\u00e1n a Portugal en concepto de las pruebas y las trampas mencionadas en el apartado 1 no exceder\u00e1 de: a) vigilancia serol\u00f3gica, prueba ELISA: 2,5 EUR por prueba; b) vigilancia virol\u00f3gica, reacci\u00f3n en cadena de la polimerasa retrotranscript\u00e1sica (RT.PCR): 15 EUR por prueba; c) vigilancia entomol\u00f3gica, trampa: 160 EUR por trampa. 3. El impuesto sobre el valor a\u00f1adido se excluir\u00e1 de la participaci\u00f3n financiera de la Comunidad. Art\u00edculo 2 Modalidades de pago A reserva del resultado de los controles in situ llevados a cabo de conformidad con el art\u00edculo 9, apartado 1, de la Decisi\u00f3n 90/424/CEE, se efectuar\u00e1 un primer pago de 600000 EUR como parte de la ayuda financiera de la Comunidad prevista en el art\u00edculo 1. El pago se llevar\u00e1 a cabo previa presentaci\u00f3n por parte de Portugal de justificantes de las pruebas de laboratorio y de la adquisici\u00f3n de las trampas mencionadas en el art\u00edculo 1, apartado 1. Art\u00edculo 3 Condiciones de pago y documentaci\u00f3n justificativa 1. 
La ayuda financiera de la Comunidad contemplada en el art\u00edculo 1 se pagar\u00e1 atendiendo a los siguientes elementos: a) una solicitud que contenga los datos especificados en el anexo, presentada en el plazo establecido en el apartado 2 del presente art\u00edculo; b) la documentaci\u00f3n justificativa mencionada en el art\u00edculo 2, que incluir\u00e1 un informe epidemiol\u00f3gico y un informe financiero; c) el resultado de cualquiera de los controles in situ llevados a cabo de conformidad con el art\u00edculo 9, apartado 1, de la Decisi\u00f3n 90/424/CEE. Los documentos mencionados en la letra b) deber\u00e1n estar disponibles para los controles in situ mencionados en la letra c). 2. La solicitud mencionada en el apartado 1, letra a), se presentar\u00e1 en formato electr\u00f3nico en un plazo de 60 d\u00edas naturales a partir de la fecha de notificaci\u00f3n de la presente Decisi\u00f3n. Si no se respeta este plazo, la ayuda financiera comunitaria se reducir\u00e1 un 25 % por cada mes de retraso. Art\u00edculo 4 Destinatario El destinatario de la presente Decisi\u00f3n es la Rep\u00fablica Portuguesa. Hecho en Bruselas, el 31 de enero de 2006. Por la Comisi\u00f3n Markos Kyprianou Miembro de la Comisi\u00f3n [1] DO L 224 de 18.8.1990, p. 19. Decisi\u00f3n modificada en \u00faltimo lugar por el Reglamento (CE) no 806/2003 (DO L 122 de 16.5.2003, p. 1). [2] DO L 244 de 20.9.2005, p. 28. [3] DO L 130 de 24.5.2005, p. 22. Decisi\u00f3n modificada en \u00faltimo lugar por la Decisi\u00f3n 2005/828/CE (DO L 311 de 26.11.2005, p. 37). [4] DO L 160 de 26.6.1999, p. 103. -------------------------------------------------- ANEXO Datos mencionados en el art\u00edculo 3, apartado 1, letra a) Gastos | Naturaleza de los costes | N\u00famero | Importe (sin IVA) | Pruebas ELISA | | | Pruebas RT.PCR | | | Otras pruebas virol\u00f3gicas | | | Trampas | | | Total | | -------------------------------------------------- "}]}
text2text-generation
SEBIS/legal_t5_small_summ_es
[ "transformers", "pytorch", "tf", "jax", "t5", "text2text-generation", "summarization Spanish model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Spanish" ]
TAGS #transformers #pytorch #tf #jax #t5 #text2text-generation #summarization Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_es model ================================ Model for Summarization of legal text written in Spanish. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis. Model description ----------------- legal\_t5\_small\_summ\_es is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for summarization of legal texts written in Spanish. ### How to use Here is how to use this model to summarize legal text written in Spanish in PyTorch: Training data ------------- The legal\_t5\_small\_summ\_es model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model. ### Pretraining Evaluation results ------------------ When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
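The training procedure above names AdaFactor with an inverse square root learning rate schedule. As a hedged illustration only (the authors trained on a TPU Pod with their own tooling, which the card does not publish), the `Adafactor` implementation shipped with `transformers` reproduces this kind of schedule when run in relative-step mode:

```python
# Hedged sketch of the optimizer setup described above: Adafactor with an
# inverse-square-root learning rate schedule. With relative_step=True and
# warmup_init=True, transformers' Adafactor applies such a schedule
# internally; AdafactorSchedule merely exposes it for logging. The base model
# and hyperparameters here are placeholders, not the authors' training code.
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,   # lr follows ~1/sqrt(step) after warmup
    warmup_init=True,
    lr=None,              # learning rate derived from the schedule
)
lr_scheduler = AdafactorSchedule(optimizer)  # for inspecting/logging the lr
```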
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_es model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #summarization Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_es model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 69, 153, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #summarization Spanish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_es model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12036479264497757, 0.10578545182943344, -0.0029741055332124233, 0.07026373594999313, 0.10379055142402649, 0.011759854853153229, 0.055029962211847305, 0.1011858657002449, -0.0943029448390007, 0.07637492567300797, 0.05350124090909958, 0.04777605086565018, 0.08323019742965698, 0.1031835749745369, 0.023302316665649414, -0.21918104588985443, 0.009837091900408268, -0.02877803146839142, -0.019676506519317627, 0.1406288892030716, 0.11623141169548035, -0.0858205035328865, 0.024137506261467934, -0.0348983071744442, -0.1323617547750473, 0.005755639635026455, -0.04954523220658302, -0.09423113614320755, 0.07185395807027817, 0.049061521887779236, 0.11898345500230789, 0.024763010442256927, 0.07285495102405548, -0.16021662950515747, -0.0022692910861223936, 0.08009665459394455, 0.04078594967722893, 0.05203383043408394, 0.08256211131811142, -0.02841852605342865, 0.16393336653709412, -0.03230547532439232, 0.07120734453201294, 0.0326826237142086, -0.15694661438465118, -0.12100411206483841, -0.06353586912155151, 0.03571740910410881, 0.1503279060125351, 0.1347842961549759, -0.05242450162768364, 0.06345458328723907, -0.10924337804317474, 0.038652028888463974, 0.04905049130320549, -0.23656533658504486, -0.07765108346939087, 0.028616778552532196, 0.028801415115594864, 0.07832033187150955, -0.051974136382341385, -0.04367462173104286, 0.04792912304401398, 0.010264724493026733, 0.009108396247029305, -0.027575071901082993, -0.028300970792770386, -0.0024850741028785706, -0.16443702578544617, -0.10961280018091202, 0.17708507180213928, 0.0020829527638852596, -0.06011831760406494, -0.07921548187732697, -0.02657190151512623, -0.1319604068994522, 0.023436076939105988, -0.04192104563117027, 0.03987957164645195, -0.005778896156698465, 0.02060755155980587, 0.0052542611956596375, -0.11915948241949081, -0.10601507872343063, 0.030121339485049248, 0.08588768541812897, 0.06867445260286331, -0.01678832620382309, 0.004516018554568291, 0.1587386280298233, 0.014487034641206264, -0.09821483492851257, -0.01092847902327776, 0.01046530157327652, -0.13269639015197754, -0.03885951265692711, -0.035312801599502563, -0.13703487813472748, -0.04509914666414261, 0.08200498670339584, -0.05526512488722801, 0.06286494433879852, 0.04410126805305481, 0.03761376067996025, 0.029691316187381744, 0.13360367715358734, -0.05415096506476402, -0.04153304174542427, -0.06633332371711731, 0.047758039087057114, -0.06743399053812027, 0.0045257150195539, -0.01733204536139965, 0.008043582551181316, 0.058632150292396545, 0.07315471768379211, -0.06338777393102646, 0.006224357057362795, -0.05594920739531517, -0.04832487925887108, 0.015319057740271091, -0.11728430539369583, -0.02851366251707077, 0.00746684055775404, -0.09383737295866013, -0.06563159823417664, 0.06388281285762787, -0.01787138544023037, -0.1152261346578598, 0.050679922103881836, -0.0444638729095459, -0.03735543042421341, -0.13573914766311646, -0.08530069887638092, -0.012352779507637024, -0.07518385350704193, -0.04229027032852173, -0.06335209310054779, -0.1371978372335434, -0.12534216046333313, 0.0760735347867012, -0.0695812776684761, -0.04762651398777962, -0.08442474901676178, -0.0016527281841263175, 0.001451094402000308, -0.024671467021107674, 0.13153918087482452, -0.016095926985144615, 0.10158869624137878, 0.029509739950299263, 0.03886866196990013, 0.14391346275806427, 0.07468755543231964, -0.09854284673929214, 0.02521054446697235, -0.10080499202013016, 0.17205144464969635, -0.01293967105448246, 0.012245121411979198, -0.17132475972175598, -0.06652743369340897, -0.07249113917350769, 
0.06188247725367546, 0.08879486471414566, 0.12784990668296814, -0.13515442609786987, -0.020287029445171356, 0.2010446935892105, -0.08691952377557755, -0.04971732944250107, 0.08270859718322754, -0.047302015125751495, 0.1652446985244751, 0.08058154582977295, 0.1724727749824524, 0.06617388874292374, -0.05933786928653717, 0.007579554338008165, -0.035246167331933975, 0.004887248855084181, 0.0018472402589395642, 0.09268121421337128, -0.05382227152585983, -0.10276807099580765, -0.020087027922272682, -0.1051953136920929, 0.013785803690552711, -0.05572490021586418, -0.07020976394414902, 0.028415385633707047, -0.042444467544555664, -0.0547117181122303, 0.06499432772397995, 0.04183642193675041, -0.03431130200624466, -0.11679454147815704, 0.007279144134372473, 0.07810894399881363, -0.05395672097802162, 0.009458743967115879, -0.07274556905031204, -0.04722559079527855, -0.08957140147686005, -0.017476944252848625, -0.17985250055789948, 0.052149176597595215, 0.0314774215221405, 0.004660202655941248, 0.043351080268621445, 0.03485378623008728, 0.026884503662586212, 0.037444762885570526, -0.008656710386276245, -0.04728591814637184, -0.04651998355984688, -0.029392536729574203, -0.11578081548213959, -0.13886529207229614, -0.03354930132627487, -0.025844911113381386, 0.12385299056768417, -0.19434970617294312, 0.032871052622795105, -0.057741351425647736, 0.013455679640173912, -0.02247709222137928, -0.056450653821229935, 0.03767631947994232, 0.039544254541397095, 0.016973203048110008, -0.07336188852787018, 0.057018402963876724, 0.051441606134176254, 0.018245594576001167, 0.034765779972076416, -0.13235247135162354, -0.15251442790031433, 0.0744730532169342, 0.0526672899723053, -0.1636413186788559, -0.019382935017347336, -0.04472430795431137, -0.05289080739021301, -0.07011636346578598, 0.008670106530189514, 0.24316057562828064, 0.001507889828644693, 0.14504599571228027, -0.12060710042715073, -0.051524169743061066, 0.001852732035331428, -0.019002869725227356, 0.003942436072975397, 0.13770927488803864, 0.08467670530080795, -0.1179012656211853, 0.05434476584196091, 0.009118620306253433, -0.013921432197093964, 0.14898863434791565, 0.005598688032478094, -0.1146223247051239, 0.0033953359816223383, 0.07834410667419434, -0.0021068714559078217, 0.08108685910701752, -0.1801246702671051, 0.0057555511593818665, 0.008843050338327885, 0.03326725214719772, 0.05637823045253754, -0.16056998074054718, 0.018549922853708267, 0.06241799145936966, -0.03103889524936676, -0.0005267015076242387, -0.029518328607082367, -0.05074705928564072, 0.0767207145690918, 0.042199861258268356, -0.0164821557700634, -0.008028693497180939, -0.03960007429122925, -0.1487346887588501, 0.2157793641090393, -0.057363804429769516, -0.1686369776725769, -0.10747656971216202, 0.0954788401722908, 0.0765230655670166, 0.01199561171233654, 0.04234844818711281, -0.08900803327560425, -0.03355076164007187, -0.07529398798942566, 0.11220928281545639, -0.05339432507753372, -0.040715865790843964, -0.06790817528963089, 0.014674080535769463, 0.0020027931313961744, -0.12252325564622879, 0.035694293677806854, -0.024666454643011093, -0.1008278951048851, -0.0033271019347012043, -0.05976099520921707, 0.08678773790597916, 0.18501870334148407, -0.0008677743026055396, 0.031037794426083565, -0.00940999947488308, 0.18851661682128906, -0.13213035464286804, 0.00827063899487257, 0.08133497834205627, 0.02526697888970375, -0.0010495125316083431, 0.10108503699302673, 0.0017217097338289022, -0.09638519585132599, 0.04898817092180252, 0.053194787353277206, -0.04421756789088249, -0.2821768820285797, 
-0.022376732900738716, -0.026640204712748528, -0.06336876004934311, 0.1307632327079773, 0.04314007610082626, 0.031790826469659805, 0.06917018443346024, -0.01941256783902645, -0.010417599231004715, 0.023402994498610497, 0.058466970920562744, -0.026299580931663513, 0.006583241280168295, 0.06144390627741814, -0.05304119735956192, -0.025593316182494164, 0.05309167876839638, 0.009743228554725647, 0.23513975739479065, -0.04125205799937248, 0.11113241314888, 0.10173503309488297, 0.08674751222133636, -0.00563209829851985, 0.06627165526151657, -0.03365591540932655, 0.01948453299701214, -0.016467483714222908, -0.03845715522766113, -0.08610799163579941, 0.03948831558227539, -0.010288163088262081, 0.026888445019721985, -0.12223901599645615, -0.023609710857272148, 0.008997754193842411, 0.313247412443161, 0.05452050641179085, -0.24138101935386658, -0.06639032810926437, 0.008422893472015858, -0.051299285143613815, -0.10132671147584915, 0.05418916791677475, 0.08682559430599213, -0.12561118602752686, 0.005706841126084328, -0.04003385826945305, 0.1024070754647255, -0.09258085489273071, -0.04297453537583351, 0.06576192378997803, 0.0516529381275177, -0.01117172185331583, 0.10382966697216034, -0.26405709981918335, 0.20484718680381775, -0.003571212524548173, 0.12179677188396454, -0.023093191906809807, 0.029229948297142982, -0.06816906481981277, -0.0005068174214102328, 0.16567695140838623, -0.003927402663975954, 0.03238296136260033, -0.06772960722446442, -0.08138905465602875, 0.017604243010282516, 0.032086364924907684, -0.0937386006116867, 0.09837783873081207, 0.012117704376578331, 0.028829341754317284, 0.0033621699549257755, -0.09400371462106705, -0.11063792556524277, -0.1303258091211319, -0.0008986399625428021, -0.0979859009385109, 0.06479296833276749, -0.04812980443239212, -0.040874119848012924, 0.01684701070189476, 0.15964744985103607, -0.13821186125278473, -0.0843399167060852, -0.10556701570749283, 0.04917173087596893, 0.09193388372659683, -0.03371336683630943, -0.010627887211740017, 0.018772415816783905, 0.02171470783650875, 0.005371066741645336, 0.013597752898931503, 0.10980697721242905, -0.07088669389486313, -0.13169048726558685, -0.052614726126194, 0.126081645488739, 0.1233731359243393, 0.05859507620334625, -0.01944342441856861, 0.018400078639388084, 0.0011240513995289803, -0.08143176138401031, 0.0018939886940643191, 0.000525402429047972, 0.03681669384241104, 0.010952437296509743, -0.05312002822756767, -0.0014679679879918694, -0.08211290836334229, -0.05363365262746811, 0.09274951368570328, 0.11597799509763718, -0.04601045697927475, 0.06896915286779404, 0.16214032471179962, -0.1304081827402115, -0.15320098400115967, 0.041006214916706085, 0.11270681768655777, 0.06074514985084534, -0.06612012535333633, -0.2288227528333664, 0.030351558700203896, 0.10169697552919388, 0.013847127556800842, -0.04979357123374939, -0.4282871186733246, -0.11454402655363083, 0.082339808344841, 0.08803575485944748, -0.04757019132375717, -0.10082584619522095, -0.038787469267845154, 0.03547871857881546, 0.0023888282012194395, 0.10381737351417542, -0.02915426902472973, 0.08514198660850525, 0.029826048761606216, -0.07151370495557785, 0.04997120797634125, -0.05545634776353836, 0.12553484737873077, 0.060160957276821136, 0.054077718406915665, -0.03783746436238289, 0.009881886653602123, 0.032253049314022064, -0.027449898421764374, 0.12897011637687683, 0.05614578351378441, 0.03398656100034714, -0.2165433168411255, -0.0701979324221611, -0.08761511743068695, -0.00031961745116859674, -0.08066727221012115, -0.047161202877759933, 
-0.025976883247494698, 0.08381498605012894, 0.06384728103876114, -0.014982575550675392, 0.015246854163706303, -0.0724334567785263, 0.02677224949002266, 0.10514333844184875, 0.12874922156333923, 0.07705334573984146, -0.10257071256637573, 0.02486249804496765, 0.0524759367108345, 0.09548566490411758, -0.16807271540164948, -0.017830774188041687, 0.11518681049346924, 0.004135732538998127, 0.13390889763832092, 0.003741547930985689, -0.1407710164785385, 0.002750713611021638, 0.06829831749200821, -0.08139044791460037, -0.11783754080533981, -0.02462342381477356, -0.010205873288214207, -0.07996153831481934, -0.06308215111494064, 0.06212926283478737, -0.09084375947713852, -0.023938894271850586, -0.012119676917791367, 0.034603919833898544, -0.06137995794415474, 0.21493235230445862, 0.05783865600824356, 0.061987753957509995, -0.064935602247715, 0.11561669409275055, 0.12226440012454987, -0.13861238956451416, 0.022275542840361595, 0.16938625276088715, -0.08244162052869797, -0.04552794247865677, 0.02351214364171028, 0.123115673661232, -0.05001789703965187, -0.07660695165395737, -0.061590369790792465, -0.05603112280368805, 0.06835880875587463, 0.03271779417991638, 0.030703887343406677, 0.030258145183324814, -0.025647761300206184, 0.007367835380136967, -0.12711584568023682, 0.05669001117348671, 0.07382884621620178, 0.0012158664176240563, -0.026267578825354576, 0.18190424144268036, 0.049809351563453674, 0.028355048969388008, -0.006175222806632519, -0.047619663178920746, -0.07554551213979721, 0.06964714825153351, 0.005475527606904507, -0.0010054567828774452, -0.061441194266080856, -0.008065884001553059, -0.021320616826415062, 0.0018821625271812081, 0.0035616368986666203, 0.009945902973413467, -0.057509321719408035, -0.023126766085624695, -0.03453386202454567, 0.05674204230308533, -0.06277391314506531, 0.005250321235507727, -0.012525193393230438, -0.06708112359046936, 0.05931398645043373, -0.008866500109434128, 0.0006403574370779097, 0.0000886627021827735, -0.04573782533407211, 0.08497489243745804, -0.008848806843161583, -0.0008797042537480593, -0.011080684140324593, -0.1115686222910881, 0.05200618505477905, -0.017688946798443794, -0.029709292575716972, -0.006233193911612034, 0.047163885086774826, -0.14393608272075653, 0.07765797525644302, -0.009909234941005707, -0.02632821351289749, -0.0711965337395668, 0.12308280169963837, 0.03805115446448326, 0.054399389773607254, 0.09537184983491898, -0.07240395992994308, 0.059068404138088226, -0.15553484857082367, -0.046840064227581024, 0.019738836213946342, 0.04479334130883217, -0.06275184452533722, -0.050668392330408096, 0.06912623345851898, -0.03580114245414734, 0.04913095757365227, 0.08464650809764862, 0.06348737329244614, 0.03617559373378754, -0.1031181588768959, 0.01293138973414898, 0.039164360612630844, 0.07449787110090256, 0.005328888539224863, 0.014192380011081696, 0.013106013648211956, 0.05975509434938431, -0.016916194930672646, 0.06543819606304169, 0.12876524031162262, 0.21033348143100739, 0.08347054570913315, 0.08000513911247253, -0.025193899869918823, -0.10584862530231476, -0.07467775046825409, 0.16101443767547607, 0.0149147380143404, 0.02015243098139763, -0.023222031071782112, 0.06816089153289795, 0.09418710321187973, -0.15952596068382263, 0.07898236066102982, -0.0018576374277472496, -0.09583642333745956, -0.08792340755462646, -0.08724353462457657, -0.02547243982553482, -0.0598738007247448, -0.0152012063190341, -0.1039767935872078, 0.0285140722990036, 0.055595625191926956, 0.05089890956878662, -0.03297875449061394, 0.13229665160179138, 0.012342028319835663, 
-0.07339880615472794, 0.09231364727020264, 0.0034542197827249765, 0.12282740324735641, -0.10018853098154068, 0.029623642563819885, 0.030710311606526375, 0.014464338310062885, 0.05604657903313637, 0.019363535568118095, -0.011829156428575516, 0.030434444546699524, 0.04494652897119522, -0.05473228171467781, 0.006600878667086363, 0.0289150457829237, 0.11987587809562683, 0.09349999576807022, 0.05738063529133797, -0.04114394634962082, -0.018412457779049873, 0.23375436663627625, -0.04396902769804001, -0.07748048007488251, -0.1689349263906479, 0.18947091698646545, 0.07065368443727493, 0.022361597046256065, 0.03804557025432587, -0.11506244540214539, 0.01171442586928606, 0.22525186836719513, 0.12545736134052277, -0.020318875089287758, -0.038127340376377106, 0.023520197719335556, 0.0024389096070080996, 0.03793560341000557, 0.11479911208152771, 0.03193867579102516, 0.2052038013935089, -0.11050281673669815, 0.03806241974234581, -0.07274631410837173, -0.04591739922761917, 0.005959173198789358, 0.15997759997844696, 0.021462004631757736, -0.009405843913555145, -0.0442323163151741, 0.129817396402359, -0.007659995462745428, -0.218862384557724, 0.08227042108774185, -0.07870682328939438, -0.13806718587875366, -0.04396572336554527, 0.004747704602777958, 0.005970395170152187, 0.059346433728933334, 0.0066080293618142605, -0.015753567218780518, 0.10577228665351868, 0.0524681955575943, -0.07285116612911224, -0.15932485461235046, 0.08681169897317886, -0.010122998617589474, 0.16904252767562866, -0.011431358754634857, 0.06118538975715637, 0.08501668274402618, 0.009550387039780617, -0.1159641370177269, 0.07074427604675293, 0.04658528044819832, 0.04192721098661423, 0.07444025576114655, 0.09221498668193817, -0.019105708226561546, 0.11941908299922943, 0.0321243517100811, -0.13106341660022736, 0.045922983437776566, -0.1135827898979187, -0.030543720349669456, -0.17448675632476807, 0.042656801640987396, -0.04490331560373306, 0.13034068048000336, 0.22344106435775757, -0.03892572596669197, 0.012457233853638172, -0.06595532596111298, 0.05465581640601158, -0.014577047899365425, 0.1826401948928833, 0.00988651905208826, -0.1912720501422882, 0.0015911433147266507, -0.08143416792154312, 0.03136149421334267, -0.22616668045520782, -0.017803166061639786, 0.0331055149435997, -0.08232177793979645, -0.024536341428756714, 0.11319948732852936, 0.036352068185806274, 0.07801757752895355, -0.04134737327694893, -0.00936843827366829, -0.020783672109246254, 0.150640606880188, -0.119740791618824, -0.08232459425926208 ]
null
null
transformers
# legal_t5_small_summ_fr model Model for Summarization of legal text written in French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis. ## Model description legal_t5_small_summ_fr is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for summarization of legal texts written in French. ### How to use Here is how to use this model to summarize legal text written in French in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_fr"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_summ_fr", do_lower_case=False, skip_special_tokens=True), device=0 ) fr_text = "LA COMMISSION DES COMMUNAUTÉS EUROPÉENNES, vu le traité instituant la Communauté européenne, vu le règlement (CE) no 1784/2003 du Conseil du 29 septembre 2003 portant organisation commune des marchés dans le secteur des céréales [1], et notamment son article 13, paragraphe 3, vu le règlement (CE) no 1785/2003 du Conseil du 29 septembre 2003 portant organisation commune du marché du riz [2], et notamment son article 14, paragraphe 3, considérant ce qui suit: (1) Conformément à l'article 13, paragraphe 1, du règlement (CE) no 1784/2003 et à l'article 14, paragraphe 1, du règlement (CE) no 1785/2003, la différence entre les cours ou les prix sur le marché mondial des produits visés à l'article 1er de chacun de ces deux règlements et les prix dans la Communauté peut être couverte par une restitution à l'exportation. (2) Le règlement (CE) no 1043/2005 de la Commission du 30 juin 2005 portant application du règlement (CE) no 3448/93 du Conseil en ce qui concerne le système d’octroi des restitutions à l'exportation pour certains produits agricoles exportés sous forme de marchandises ne relevant pas de l'annexe I du traité ainsi que les critères de fixation de leurs montants [3] a spécifié ceux de ces produits pour lesquels il y a lieu de fixer un taux de restitution applicable lors de leur exportation sous forme de marchandises reprises, selon le cas, à l'annexe III du règlement (CE) no 1784/2003 ou à l'annexe IV du règlement (CE) no 1785/2003. (3) Conformément à l'article 14, paragraphe 1, du règlement (CE) no 1043/2005, le taux de la restitution par 100 kilogrammes de chacun des produits de base considérés doit être fixé chaque mois. (4) Les engagements pris en matière de restitutions pouvant être octroyées à l'exportation de produits agricoles incorporés dans des marchandises ne relevant pas de l'annexe I du traité peuvent être mis en péril par la fixation à l'avance de taux de restitution élevés. Il convient, dès lors, de prendre des mesures de sauvegarde dans ces situations sans empêcher pour autant la conclusion de contrats à long terme. La fixation d'un taux de restitution spécifique pour la fixation à l'avance des restitutions est une mesure permettant de rencontrer ces différents objectifs. 
(5) À la suite de l'arrangement entre la Communauté européenne et les États-Unis d'Amérique concernant les exportations de pâtes alimentaires de la Communauté aux États-Unis approuvé par la décision 87/482/CEE du Conseil [4], il est nécessaire de différencier la restitution pour les marchandises relevant des codes NC 19021100 et 190219 selon leur destination. (6) Conformément à l'article 15, paragraphes 2 et 3, du règlement (CE) no 1043/2005, il y a lieu de fixer un taux de restitution à l'exportation réduit, compte tenu du montant de la restitution à la production applicable, en vertu du règlement (CEE) no 1722/93 de la Commission [5], au produit de base mis en œuvre, valable au cours de la période présumée de fabrication des marchandises. (7) Les boissons spiritueuses sont considérées comme moins sensibles au prix des céréales mises en œuvre pour leur fabrication. Toutefois, le protocole 19 du traité d'adhésion du Royaume-Uni, de l'Irlande et du Danemark prévoit que des mesures nécessaires doivent être arrêtées afin de faciliter l'utilisation des céréales communautaires pour la fabrication de boissons spiritueuses obtenues à partir de céréales. Il convient donc d'adapter le taux de restitution applicable aux céréales exportées sous forme de boissons spiritueuses. (8) Le comité de gestion des céréales n'a pas émis d'avis dans le délai imparti par son président, A ARRÊTÉ LE PRÉSENT RÈGLEMENT: Article premier Les taux des restitutions applicables aux produits de base figurant à l'annexe I du règlement (CE) no 1043/2005 et à l'article 1er du règlement (CE) no 1784/2003 ou à l'article 1er du règlement (CE) no 1785/2003 modifié, qui sont exportés sous forme de marchandises reprises respectivement à l'annexe III du règlement (CE) no 1784/2003 ou à l'annexe IV du règlement (CE) no 1785/2003, sont fixés comme indiqué à l'annexe du présent règlement. Article 2 Le présent règlement entre en vigueur le 23 septembre 2005. Le présent règlement est obligatoire dans tous ses éléments et directement applicable dans tout État membre. Fait à Bruxelles, le 22 septembre 2005. Par la Commission Günter Verheugen Vice-président [1] JO L 270 du 21.10.2003, p. 78. [2] JO L 270 du 21.10.2003, p. 96. [3] JO L 172 du 5.7.2005, p. 24. [4] JO L 275 du 29.9.1987, p. 36. [5] JO L 159 du 1.7.1993, p. 112. Règlement modifié en dernier lieu par le règlement (CE) no 1584/2004 (JO L 280 du 31.8.2004, p. 11). 
-------------------------------------------------- ANNEXE Taux des restitutions applicables à compter du 23 septembre 2005 à certains produits des secteurs des céréales et du riz exportés sous forme de marchandises ne relevant pas de l'annexe I du traité [1] (en EUR/100 kg) | Code NC | Désignation des marchandises | Taux de la restitution par 100 kg du produit de base | En cas de fixation à l'avance des restitutions | Autres | 10011000 | Froment (blé) dur: | | | – en cas d'exportation de marchandises relevant des codes NC 190211 et 190219 vers les États-Unis d'Amérique | — | — | – dans les autres cas | — | — | 10019099 | Froment (blé) tendre et méteil: | | | – en cas d'exportation de marchandises relevant des codes NC 190211 et 190219 vers les États-Unis d'Amérique | — | — | – dans les autres cas: | | | – – en cas d'application de l'article 15, paragraphe 3, du règlement (CE) no 1043/2005 | — | — | – – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | — | — | – – dans les autres cas | — | — | 10020000 | Seigle | — | — | 10030090 | Orge | | | – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | — | — | – dans les autres cas | — | — | 10040000 | Avoine | — | — | 10059000 | Maïs, mis en œuvre sous forme de: | | | – amidon: | | | – – en cas d'application de l'article 15, paragraphe 3, du règlement (CE) no 1043/2005 | 2,994 | 3,150 | – – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | – – dans les autres cas | 4,615 | 4,615 | – glucose, sirop de glucose, maltodextrine, sirop de maltodextrine des codes NC 17023051, 17023059, 17023091, 17023099, 17024090, 17029050, 17029075, 17029079, 21069055: | | | – – en cas d'application de l'article 15, paragraphe 3, du règlement (CE) no 1043/2005 | 1,840 | 1,996 | – – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 1,776 | 1,776 | – – dans les autres cas | 3,461 | 3,461 | – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | – autres (y compris en l'état) | 4,615 | 4,615 | Fécule de pommes de terre du code NC 11081300 assimilée à un produit issu de la transformation du maïs: | | | – en cas d'application de l'article 15, paragraphe 3, du règlement (CE) no 1043/2005 | 2,435 | 2,585 | – en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | – dans les autres cas | 4,615 | 4,615 | ex100630 | Riz blanchi: | | | – à grains ronds | — | — | – à grains moyens | — | — | – à grains longs | — | — | 10064000 | Riz en brisures | — | — | 10070090 | Sorgho à grains (à l'excl. du sorgho à grains, hybride, destiné à l'ensemencement) | — | — | [1] Les taux prévus à la présente annexe ne s’appliquent pas avec effet au 1er octobre 2004 aux exportations vers la Bulgarie et avec effet au 1er février 2005 aux marchandises visées aux tableaux I et II du Protocole no 2 de l’Accord entre la Communauté économique européenne et la Confédération suisse du 22 juillet 1972 qui sont exportées vers la Confédération suisse ou la principauté de Liechtenstein. [2] En ce qui concerne les produits agricoles obtenus par transformation d’un produit de base et/ou de produits assimilés, les coefficients fixés à l’annexe V du règlement (CE) no 1043/2005 de la Commission s’appliquent. [3] La marchandise concernée relève du code NC 35051050. [4] Marchandises reprises à l'annexe III du règlement (CE) no 1784/2003 ou visées à l'article 2 du règlement (CEE) no 2825/93 (JO L 258 du 16.10.1993, p. 6). 
[5] Pour les sirops des codes NC 17023099, 17024090 et 17026090, obtenus par mélange de sirops de glucose et fructose, seul le sirop de glucose a droit à la restitution à l'exportation. -------------------------------------------------- " pipeline([fr_text], max_length=512) ``` ## Training data The legal_t5_small_summ_fr model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 23 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : | Model | Rouge1 | Rouge2 | Rouge Lsum | |:-----:|:-----:|:-----:|:-----:| | legal_t5_small_summ_fr | 77.1 | 67.97 | 75.74 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
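The preprocessing and optimization steps above are described only in prose, so two hedged sketches follow. First, a minimal sketch of training a unigram SentencePiece vocabulary of the kind described; the input path, model prefix, and vocabulary size are illustrative assumptions, not values taken from the original setup.

```python
# Hedged sketch: train a unigram SentencePiece model on corpus lines.
# "parallel_corpus.txt" and vocab_size=32000 are assumptions for illustration.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",      # hypothetical file, one sentence per line
    model_prefix="legal_t5_unigram",  # hypothetical output prefix
    vocab_size=32000,                 # assumed size; the card does not state it
    model_type="unigram",
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Le présent règlement entre en vigueur.", out_type=str))
```

Second, a hedged sketch of the AdaFactor setup named in the training procedure, using the implementation shipped with transformers; with `relative_step=True` the internal step size follows an inverse square root decay. This is an illustration, not the authors' training script.

```python
# Hedged sketch: AdaFactor with an inverse-square-root-style step schedule.
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_summ_fr")
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,   # step size decays like 1/sqrt(step)
    warmup_init=True,
    lr=None,              # learning rate is derived from the schedule
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the current rate for logging
```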
{"language": "French", "tags": ["summarization French model"], "datasets": ["jrc-acquis"], "widget": [{"text": "LA COMMISSION DES COMMUNAUT\u00c9S EUROP\u00c9ENNES, vu le trait\u00e9 instituant la Communaut\u00e9 europ\u00e9enne, vu le r\u00e8glement (CE) no 1784/2003 du Conseil du 29 septembre 2003 portant organisation commune des march\u00e9s dans le secteur des c\u00e9r\u00e9ales [1], et notamment son article 13, paragraphe 3, vu le r\u00e8glement (CE) no 1785/2003 du Conseil du 29 septembre 2003 portant organisation commune du march\u00e9 du riz [2], et notamment son article 14, paragraphe 3, consid\u00e9rant ce qui suit: (1) Conform\u00e9ment \u00e0 l'article 13, paragraphe 1, du r\u00e8glement (CE) no 1784/2003 et \u00e0 l'article 14, paragraphe 1, du r\u00e8glement (CE) no 1785/2003, la diff\u00e9rence entre les cours ou les prix sur le march\u00e9 mondial des produits vis\u00e9s \u00e0 l'article 1er de chacun de ces deux r\u00e8glements et les prix dans la Communaut\u00e9 peut \u00eatre couverte par une restitution \u00e0 l'exportation. (2) Le r\u00e8glement (CE) no 1043/2005 de la Commission du 30 juin 2005 portant application du r\u00e8glement (CE) no 3448/93 du Conseil en ce qui concerne le syst\u00e8me d\u2019octroi des restitutions \u00e0 l'exportation pour certains produits agricoles export\u00e9s sous forme de marchandises ne relevant pas de l'annexe I du trait\u00e9 ainsi que les crit\u00e8res de fixation de leurs montants [3] a sp\u00e9cifi\u00e9 ceux de ces produits pour lesquels il y a lieu de fixer un taux de restitution applicable lors de leur exportation sous forme de marchandises reprises, selon le cas, \u00e0 l'annexe III du r\u00e8glement (CE) no 1784/2003 ou \u00e0 l'annexe IV du r\u00e8glement (CE) no 1785/2003. (3) Conform\u00e9ment \u00e0 l'article 14, paragraphe 1, du r\u00e8glement (CE) no 1043/2005, le taux de la restitution par 100 kilogrammes de chacun des produits de base consid\u00e9r\u00e9s doit \u00eatre fix\u00e9 chaque mois. (4) Les engagements pris en mati\u00e8re de restitutions pouvant \u00eatre octroy\u00e9es \u00e0 l'exportation de produits agricoles incorpor\u00e9s dans des marchandises ne relevant pas de l'annexe I du trait\u00e9 peuvent \u00eatre mis en p\u00e9ril par la fixation \u00e0 l'avance de taux de restitution \u00e9lev\u00e9s. Il convient, d\u00e8s lors, de prendre des mesures de sauvegarde dans ces situations sans emp\u00eacher pour autant la conclusion de contrats \u00e0 long terme. La fixation d'un taux de restitution sp\u00e9cifique pour la fixation \u00e0 l'avance des restitutions est une mesure permettant de rencontrer ces diff\u00e9rents objectifs. (5) \u00c0 la suite de l'arrangement entre la Communaut\u00e9 europ\u00e9enne et les \u00c9tats-Unis d'Am\u00e9rique concernant les exportations de p\u00e2tes alimentaires de la Communaut\u00e9 aux \u00c9tats-Unis approuv\u00e9 par la d\u00e9cision 87/482/CEE du Conseil [4], il est n\u00e9cessaire de diff\u00e9rencier la restitution pour les marchandises relevant des codes NC 19021100 et 190219 selon leur destination. (6) Conform\u00e9ment \u00e0 l'article 15, paragraphes 2 et 3, du r\u00e8glement (CE) no 1043/2005, il y a lieu de fixer un taux de restitution \u00e0 l'exportation r\u00e9duit, compte tenu du montant de la restitution \u00e0 la production applicable, en vertu du r\u00e8glement (CEE) no 1722/93 de la Commission [5], au produit de base mis en \u0153uvre, valable au cours de la p\u00e9riode pr\u00e9sum\u00e9e de fabrication des marchandises. 
(7) Les boissons spiritueuses sont consid\u00e9r\u00e9es comme moins sensibles au prix des c\u00e9r\u00e9ales mises en \u0153uvre pour leur fabrication. Toutefois, le protocole 19 du trait\u00e9 d'adh\u00e9sion du Royaume-Uni, de l'Irlande et du Danemark pr\u00e9voit que des mesures n\u00e9cessaires doivent \u00eatre arr\u00eat\u00e9es afin de faciliter l'utilisation des c\u00e9r\u00e9ales communautaires pour la fabrication de boissons spiritueuses obtenues \u00e0 partir de c\u00e9r\u00e9ales. Il convient donc d'adapter le taux de restitution applicable aux c\u00e9r\u00e9ales export\u00e9es sous forme de boissons spiritueuses. (8) Le comit\u00e9 de gestion des c\u00e9r\u00e9ales n'a pas \u00e9mis d'avis dans le d\u00e9lai imparti par son pr\u00e9sident, A ARR\u00caT\u00c9 LE PR\u00c9SENT R\u00c8GLEMENT: Article premier Les taux des restitutions applicables aux produits de base figurant \u00e0 l'annexe I du r\u00e8glement (CE) no 1043/2005 et \u00e0 l'article 1er du r\u00e8glement (CE) no 1784/2003 ou \u00e0 l'article 1er du r\u00e8glement (CE) no 1785/2003 modifi\u00e9, qui sont export\u00e9s sous forme de marchandises reprises respectivement \u00e0 l'annexe III du r\u00e8glement (CE) no 1784/2003 ou \u00e0 l'annexe IV du r\u00e8glement (CE) no 1785/2003, sont fix\u00e9s comme indiqu\u00e9 \u00e0 l'annexe du pr\u00e9sent r\u00e8glement. Article 2 Le pr\u00e9sent r\u00e8glement entre en vigueur le 23 septembre 2005. Le pr\u00e9sent r\u00e8glement est obligatoire dans tous ses \u00e9l\u00e9ments et directement applicable dans tout \u00c9tat membre. Fait \u00e0 Bruxelles, le 22 septembre 2005. Par la Commission G\u00fcnter Verheugen Vice-pr\u00e9sident [1] JO L 270 du 21.10.2003, p. 78. [2] JO L 270 du 21.10.2003, p. 96. [3] JO L 172 du 5.7.2005, p. 24. [4] JO L 275 du 29.9.1987, p. 36. [5] JO L 159 du 1.7.1993, p. 112. R\u00e8glement modifi\u00e9 en dernier lieu par le r\u00e8glement (CE) no 1584/2004 (JO L 280 du 31.8.2004, p. 11). 
-------------------------------------------------- ANNEXE Taux des restitutions applicables \u00e0 compter du 23 septembre 2005 \u00e0 certains produits des secteurs des c\u00e9r\u00e9ales et du riz export\u00e9s sous forme de marchandises ne relevant pas de l'annexe I du trait\u00e9 [1] (en EUR/100 kg) | Code NC | D\u00e9signation des marchandises | Taux de la restitution par 100 kg du produit de base | En cas de fixation \u00e0 l'avance des restitutions | Autres | 10011000 | Froment (bl\u00e9) dur: | | | \u2013 en cas d'exportation de marchandises relevant des codes NC 190211 et 190219 vers les \u00c9tats-Unis d'Am\u00e9rique | \u2014 | \u2014 | \u2013 dans les autres cas | \u2014 | \u2014 | 10019099 | Froment (bl\u00e9) tendre et m\u00e9teil: | | | \u2013 en cas d'exportation de marchandises relevant des codes NC 190211 et 190219 vers les \u00c9tats-Unis d'Am\u00e9rique | \u2014 | \u2014 | \u2013 dans les autres cas: | | | \u2013 \u2013 en cas d'application de l'article 15, paragraphe 3, du r\u00e8glement (CE) no 1043/2005 | \u2014 | \u2014 | \u2013 \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | \u2014 | \u2014 | \u2013 \u2013 dans les autres cas | \u2014 | \u2014 | 10020000 | Seigle | \u2014 | \u2014 | 10030090 | Orge | | | \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | \u2014 | \u2014 | \u2013 dans les autres cas | \u2014 | \u2014 | 10040000 | Avoine | \u2014 | \u2014 | 10059000 | Ma\u00efs, mis en \u0153uvre sous forme de: | | | \u2013 amidon: | | | \u2013 \u2013 en cas d'application de l'article 15, paragraphe 3, du r\u00e8glement (CE) no 1043/2005 | 2,994 | 3,150 | \u2013 \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | \u2013 \u2013 dans les autres cas | 4,615 | 4,615 | \u2013 glucose, sirop de glucose, maltodextrine, sirop de maltodextrine des codes NC 17023051, 17023059, 17023091, 17023099, 17024090, 17029050, 17029075, 17029079, 21069055: | | | \u2013 \u2013 en cas d'application de l'article 15, paragraphe 3, du r\u00e8glement (CE) no 1043/2005 | 1,840 | 1,996 | \u2013 \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 1,776 | 1,776 | \u2013 \u2013 dans les autres cas | 3,461 | 3,461 | \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | \u2013 autres (y compris en l'\u00e9tat) | 4,615 | 4,615 | F\u00e9cule de pommes de terre du code NC 11081300 assimil\u00e9e \u00e0 un produit issu de la transformation du ma\u00efs: | | | \u2013 en cas d'application de l'article 15, paragraphe 3, du r\u00e8glement (CE) no 1043/2005 | 2,435 | 2,585 | \u2013 en cas d'exportation de marchandises relevant du sous-chapitre 2208 | 2,368 | 2,368 | \u2013 dans les autres cas | 4,615 | 4,615 | ex100630 | Riz blanchi: | | | \u2013 \u00e0 grains ronds | \u2014 | \u2014 | \u2013 \u00e0 grains moyens | \u2014 | \u2014 | \u2013 \u00e0 grains longs | \u2014 | \u2014 | 10064000 | Riz en brisures | \u2014 | \u2014 | 10070090 | Sorgho \u00e0 grains (\u00e0 l'excl. 
du sorgho \u00e0 grains, hybride, destin\u00e9 \u00e0 l'ensemencement) | \u2014 | \u2014 | [1] Les taux pr\u00e9vus \u00e0 la pr\u00e9sente annexe ne s\u2019appliquent pas avec effet au 1er octobre 2004 aux exportations vers la Bulgarie et avec effet au 1er f\u00e9vrier 2005 aux marchandises vis\u00e9es aux tableaux I et II du Protocole no 2 de l\u2019Accord entre la Communaut\u00e9 \u00e9conomique europ\u00e9enne et la Conf\u00e9d\u00e9ration suisse du 22 juillet 1972 qui sont export\u00e9es vers la Conf\u00e9d\u00e9ration suisse ou la principaut\u00e9 de Liechtenstein. [2] En ce qui concerne les produits agricoles obtenus par transformation d\u2019un produit de base et/ou de produits assimil\u00e9s, les coefficients fix\u00e9s \u00e0 l\u2019annexe V du r\u00e8glement (CE) no 1043/2005 de la Commission s\u2019appliquent. [3] La marchandise concern\u00e9e rel\u00e8ve du code NC 35051050. [4] Marchandises reprises \u00e0 l'annexe III du r\u00e8glement (CE) no 1784/2003 ou vis\u00e9es \u00e0 l'article 2 du r\u00e8glement (CEE) no 2825/93 (JO L 258 du 16.10.1993, p. 6). [5] Pour les sirops des codes NC 17023099, 17024090 et 17026090, obtenus par m\u00e9lange de sirops de glucose et fructose, seul le sirop de glucose a droit \u00e0 la restitution \u00e0 l'exportation. -------------------------------------------------- "}]}
text2text-generation
SEBIS/legal_t5_small_summ_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization French model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_fr model ================================ Model for Summarization of legal text written in French. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis. Model description ----------------- legal\_t5\_small\_summ\_fr is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for summarization of legal texts written in French. ### How to use Here is how to use this model to summarize legal text written in French in PyTorch: Training data ------------- The legal\_t5\_small\_summ\_fr model was trained on the JRC-ACQUIS dataset, consisting of 23 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_fr model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_fr model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 66, 153, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization French model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_fr model was trained on JRC-ACQUIS dataset consisting of 23 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09912589192390442, 0.11335726082324982, -0.0027333982288837433, 0.07412704080343246, 0.0885249674320221, -0.014341252855956554, 0.08343593776226044, 0.08963636308908463, -0.07212043553590775, 0.057397861033678055, 0.06190722435712814, 0.005636878311634064, 0.07192451506853104, 0.10538295656442642, 0.06844206899404526, -0.21183721721172333, 0.016798067837953568, -0.022322235628962517, -0.0354076512157917, 0.15009941160678864, 0.12496603280305862, -0.07193314284086227, 0.034942615777254105, -0.0027748297434300184, -0.13017675280570984, 0.017161959782242775, -0.06397230923175812, -0.05380450189113617, 0.07242689281702042, 0.04153704270720482, 0.09926774352788925, 0.020852310582995415, 0.07196789234876633, -0.18589824438095093, 0.000421824399381876, 0.08798857033252716, 0.024056531488895416, 0.04536271467804909, 0.08528603613376617, -0.03473081439733505, 0.16434219479560852, -0.023698149248957634, 0.06341256201267242, 0.04035567119717598, -0.1154877319931984, -0.12047144025564194, -0.06574904173612595, 0.07566884905099869, 0.12775476276874542, 0.1336175948381424, -0.05012354999780655, 0.06503788381814957, -0.11309501528739929, 0.053166937083005905, 0.055827781558036804, -0.2571165859699249, -0.06944222003221512, 0.015125052072107792, 0.03770551458001137, 0.05780104175209999, -0.057178400456905365, -0.03850168734788895, 0.034654684364795685, 0.015551365911960602, -0.002238987712189555, -0.021999796852469444, 0.024276824668049812, -0.0068066054955124855, -0.16336587071418762, -0.10473913699388504, 0.17346729338169098, 0.021350858733057976, -0.0964241474866867, -0.06894025951623917, -0.015943801030516624, -0.1598452478647232, 0.014762546867132187, -0.06279938668012619, 0.05344320088624954, -0.007963634096086025, 0.022032849490642548, 0.00486464099958539, -0.11236339062452316, -0.11454286426305771, 0.03558656945824623, 0.08743996173143387, 0.0794861763715744, -0.013154642656445503, 0.019088568165898323, 0.15150699019432068, -0.02976229600608349, -0.0584261529147625, -0.0314873605966568, 0.0011989842168986797, -0.1435266137123108, -0.041309040039777756, -0.032626520842313766, -0.10948020964860916, -0.04474551975727081, 0.12982676923274994, -0.03961443901062012, 0.05526304617524147, 0.047858163714408875, 0.047879140824079514, 0.00898760836571455, 0.1351819783449173, -0.06702470779418945, -0.03439054265618324, -0.06759773939847946, 0.0487452931702137, -0.09736257791519165, 0.026129964739084244, -0.002718940144404769, -0.000025979059500969015, 0.04894642531871796, 0.04723929613828659, -0.08313117176294327, -0.0160808227956295, -0.0488472618162632, -0.03018377721309662, 0.016361355781555176, -0.11491421610116959, -0.029090115800499916, -0.013436260633170605, -0.09652107208967209, -0.05827922001481056, 0.09532321989536285, -0.008782114833593369, -0.12050651758909225, 0.06690233200788498, -0.03224252909421921, -0.011308574117720127, -0.11726688593626022, -0.08142653852701187, -0.02614706940948963, -0.04997487738728523, -0.06195996329188347, -0.07097072899341583, -0.13739870488643646, -0.10959360003471375, 0.10696668922901154, -0.05652996897697449, -0.03595462813973427, -0.07083290070295334, -0.013879561796784401, 0.006166781298816204, -0.00537445954978466, 0.0893639475107193, -0.019448043778538704, 0.06713519990444183, -0.0022792494855821133, 0.0453176349401474, 0.1168946623802185, 0.06421543657779694, -0.08084015548229218, 0.03973397612571716, -0.09492573887109756, 0.1489066779613495, -0.017516469582915306, -0.010235876776278019, -0.17144367098808289, -0.08134739100933075, -0.08400153368711472, 
0.04240746423602104, 0.10473443567752838, 0.117972232401371, -0.14909005165100098, -0.01369173638522625, 0.194544717669487, -0.10826987773180008, -0.0496254526078701, 0.13397689163684845, -0.042716700583696365, 0.14136867225170135, 0.05494672805070877, 0.1616254448890686, 0.07745806127786636, -0.05978434532880783, -0.00577138178050518, -0.022439194843173027, 0.009810050949454308, 0.018565811216831207, 0.10932579636573792, -0.04636382311582565, -0.12075664848089218, -0.03969048708677292, -0.07764962315559387, 0.011197265237569809, -0.07740695029497147, -0.06975101679563522, 0.016348207369446754, -0.04846053943037987, -0.021807124838232994, 0.05318264290690422, 0.049639731645584106, -0.031236842274665833, -0.12775224447250366, -0.023494867607951164, 0.08700324594974518, -0.05951237305998802, 0.016383010894060135, -0.06701871007680893, -0.051576871424913406, -0.081138014793396, -0.016045426949858665, -0.18512886762619019, 0.07276342064142227, 0.03203321620821953, -0.05904260650277138, 0.060527242720127106, 0.02792811021208763, 0.030511070042848587, 0.07118942588567734, -0.0021661976352334023, -0.04660559445619583, -0.06719092279672623, -0.03102351911365986, -0.10782764106988907, -0.13552629947662354, -0.03292320668697357, -0.023631606251001358, 0.1368209421634674, -0.16872039437294006, 0.038142528384923935, -0.054567284882068634, 0.033038511872291565, -0.016741620376706123, -0.05640600994229317, -0.0005740489577874541, 0.030169185250997543, 0.011601623147726059, -0.051221560686826706, 0.051446668803691864, 0.03091314435005188, 0.021512258797883987, 0.0425867922604084, -0.07935750484466553, -0.12622886896133423, 0.07737316191196442, 0.04788803681731224, -0.18413883447647095, 0.002477830508723855, -0.060977622866630554, -0.05096998065710068, -0.04370323568582535, 0.013655876740813255, 0.24132040143013, 0.008646541275084019, 0.1561644971370697, -0.09551876038312912, -0.06883227080106735, 0.013307628221809864, -0.0189495999366045, 0.017862368375062943, 0.13270726799964905, 0.08760656416416168, -0.10232111066579819, 0.04146938398480415, 0.038472119718790054, -0.031098660081624985, 0.13466916978359222, 0.011920226737856865, -0.10090150684118271, -0.008139402605593204, 0.0913984552025795, -0.005486510694026947, 0.08439271152019501, -0.18044884502887726, -0.0047989157028496265, 0.0026150578632950783, 0.03635098785161972, 0.045461174100637436, -0.16747306287288666, 0.033881668001413345, 0.06366854161024094, -0.03707156702876091, 0.00835467129945755, -0.01790427789092064, -0.07040350884199142, 0.08730582147836685, 0.04271678999066353, -0.04027845337986946, -0.009131589904427528, -0.0408492274582386, -0.1423138529062271, 0.22483594715595245, -0.05121811851859093, -0.14930443465709686, -0.09398777782917023, 0.09425614774227142, 0.03430628031492233, -0.0033352752216160297, 0.04312571883201599, -0.09234921634197235, -0.039893828332424164, -0.10786684602499008, 0.06313088536262512, -0.06074192747473717, -0.0036944299936294556, -0.06828567385673523, 0.0016274347435683012, 0.02000747062265873, -0.12865404784679413, 0.025107722729444504, -0.03012976609170437, -0.09782096743583679, -0.014105062931776047, -0.045637089759111404, 0.07322612404823303, 0.15604785084724426, -0.041990015655756, 0.04137284681200981, 0.02158907987177372, 0.19737417995929718, -0.11217445880174637, 0.006583106704056263, 0.06812648475170135, 0.03938642516732216, 0.013443224132061005, 0.09749629348516464, 0.004345187917351723, -0.07441004365682602, 0.06494738161563873, 0.06319355219602585, -0.04135744273662567, -0.26065561175346375, 
-0.03022090345621109, -0.01990051381289959, -0.03660103306174278, 0.1222151666879654, 0.059932973235845566, 0.026536069810390472, 0.05234243720769882, -0.01232120767235756, 0.0028078260365873575, 0.03063509427011013, 0.07085113227367401, -0.04552845656871796, 0.011424010619521141, 0.06258024275302887, -0.05638641119003296, -0.03502875566482544, 0.0451153926551342, -0.004680574871599674, 0.23569954931735992, -0.03139866143465042, 0.09811144322156906, 0.1157350167632103, 0.086224764585495, 0.006791024003177881, 0.08510870486497879, -0.032794393599033356, 0.017010565847158432, 0.0016617432702332735, -0.03479401767253876, -0.06002306193113327, 0.050260405987501144, -0.034007728099823, 0.028372175991535187, -0.1326756328344345, -0.019402138888835907, 0.025531474500894547, 0.32193151116371155, 0.04850473254919052, -0.24859988689422607, -0.07157991826534271, -0.02058487758040428, -0.07747837156057358, -0.0997152179479599, 0.057611510157585144, 0.09280446171760559, -0.1305333375930786, -0.01458291057497263, -0.03678951784968376, 0.0936712995171547, -0.07014711201190948, -0.04248732700943947, 0.11550039798021317, 0.03014811873435974, -0.010333542712032795, 0.09029585868120193, -0.2545737028121948, 0.1852104365825653, -0.01947902888059616, 0.09983527660369873, -0.006080463994294405, 0.031140027567744255, -0.05117541551589966, 0.044836197048425674, 0.18459342420101166, 0.016962286084890366, 0.061411451548337936, -0.06843406707048416, -0.06947453320026398, 0.00560770183801651, 0.061828624457120895, -0.09698780626058578, 0.08392885327339172, 0.013741950504481792, -0.0015762412222102284, -0.007896251045167446, -0.06155639514327049, -0.15202264487743378, -0.12255968153476715, 0.004106190986931324, -0.07995906472206116, 0.06969258934259415, -0.05167708173394203, -0.04339202493429184, 0.0012723662657663226, 0.19827398657798767, -0.1408853977918625, -0.04943840205669403, -0.08307061344385147, 0.028182754293084145, 0.08101563155651093, -0.025126611813902855, 0.010420136153697968, 0.0022194907069206238, 0.030323928222060204, -0.011529247276484966, 0.009452646598219872, 0.0862242728471756, -0.08270347863435745, -0.1200236827135086, -0.060103777796030045, 0.12249884009361267, 0.1302291750907898, 0.05326502397656441, -0.015966027975082397, 0.010052448138594627, -0.029555203393101692, -0.10407178103923798, -0.0016899933107197285, 0.0019190734019502997, 0.03821708261966705, 0.020993970334529877, -0.05127031356096268, -0.0011145522585138679, -0.07816208153963089, -0.027999594807624817, 0.11012709885835648, 0.14132554829120636, -0.060213979333639145, 0.07351993769407272, 0.143485888838768, -0.113017238676548, -0.16426509618759155, 0.04635021090507507, 0.12064521759748459, 0.050339225679636, -0.08523213863372803, -0.19645830988883972, 0.05933406576514244, 0.07349592447280884, 0.013085702434182167, -0.0569070428609848, -0.38068509101867676, -0.13331367075443268, 0.06917754560709, 0.08387138694524765, -0.02822229452431202, -0.08996577560901642, -0.007896422408521175, 0.01986844465136528, -0.015394539572298527, 0.120118647813797, -0.030785521492362022, 0.07518550008535385, 0.018864791840314865, -0.0895005613565445, 0.032924216240644455, -0.05205088108778, 0.12183549255132675, 0.10524555295705795, 0.06485272943973541, -0.02204548381268978, 0.041813887655735016, 0.02687768079340458, -0.02252833917737007, 0.1350780576467514, 0.05381754785776138, 0.03698692098259926, -0.2119518369436264, -0.05871410667896271, -0.082969069480896, 0.004827822558581829, -0.07669883221387863, -0.04166346788406372, -0.024798866361379623, 
0.09909990429878235, 0.07177019119262695, 0.0020247558131814003, -0.02701966091990471, -0.06704862415790558, 0.022776897996664047, 0.11865521967411041, 0.10987196117639542, 0.07417622953653336, -0.09722698479890823, 0.06299528479576111, 0.04864231124520302, 0.0878661572933197, -0.172941654920578, -0.0038489210419356823, 0.12691661715507507, 0.0003368963079992682, 0.15040844678878784, 0.018781444057822227, -0.1298612803220749, -0.02605912834405899, 0.054262977093458176, -0.11350823938846588, -0.10425615310668945, -0.030274467542767525, 0.003223380306735635, -0.052704691886901855, -0.04298488795757294, 0.04648476839065552, -0.12113232910633087, -0.014920615591108799, -0.012181313708424568, 0.01984500326216221, -0.10197294503450394, 0.19206109642982483, 0.0339835025370121, 0.07638363540172577, -0.055192023515701294, 0.0994265004992485, 0.10561308264732361, -0.1531180590391159, 0.004712474532425404, 0.1622466742992401, -0.08878585696220398, -0.0559292696416378, -0.002116812625899911, 0.12665343284606934, -0.025723472237586975, -0.07498122751712799, -0.06569647789001465, -0.04761995002627373, 0.06656476855278015, 0.02116912603378296, 0.050664182752370834, 0.026052361354231834, -0.013870911672711372, -0.016763271763920784, -0.1450892984867096, 0.08119691908359528, 0.09446001052856445, 0.006017347332090139, -0.01502386573702097, 0.16327586770057678, 0.04036663845181465, 0.06336196511983871, -0.0014130582567304373, -0.04825759306550026, -0.04604768007993698, 0.05170619860291481, -0.001672901795245707, -0.010444150306284428, -0.052080970257520676, 0.0018157566664740443, -0.009809552691876888, 0.003916863352060318, 0.00922448467463255, 0.021743565797805786, -0.06843827664852142, -0.03163191303610802, -0.0179225392639637, 0.04801812022924423, -0.0552205964922905, 0.009755277074873447, 0.0038023304659873247, -0.07788634300231934, 0.07850831747055054, 0.0024237013421952724, -0.0004441670316737145, -0.02763761393725872, -0.04620197042822838, 0.07940519601106644, -0.050925448536872864, -0.009297491051256657, -0.027056019753217697, -0.1198020651936531, 0.057548124343156815, -0.01964189112186432, -0.03715874254703522, -0.012388370931148529, 0.044491320848464966, -0.13346989452838898, 0.055665094405412674, -0.031815387308597565, -0.008863485418260098, -0.08875326067209244, 0.11046698689460754, 0.016012152656912804, 0.10233867913484573, 0.09088229387998581, -0.06445492804050446, 0.05541634187102318, -0.1358637660741806, -0.048155151307582855, 0.03270226716995239, 0.04744848236441612, -0.056319791823625565, -0.07063101977109909, 0.04336908087134361, -0.03835439682006836, 0.07457435876131058, 0.07879317551851273, 0.06585736572742462, 0.04126083850860596, -0.14395669102668762, -0.015869708731770515, 0.04248008131980896, 0.032616011798381805, -0.019028406590223312, 0.0077329836785793304, 0.002536230720579624, 0.03393418341875076, -0.013613497838377953, 0.09024401754140854, 0.13149283826351166, 0.20306546986103058, 0.04612468555569649, 0.0741557776927948, -0.05936112254858017, -0.09662073105573654, -0.07491578161716461, 0.1628107726573944, 0.01767394132912159, 0.0387670136988163, -0.053313959389925, 0.08376201242208481, 0.07522653788328171, -0.16082055866718292, 0.0762353464961052, 0.011604972183704376, -0.09425006806850433, -0.09170276671648026, -0.07959558814764023, -0.025606226176023483, -0.0715760663151741, -0.012880009599030018, -0.08494509756565094, 0.04382847249507904, 0.05689133331179619, 0.07798400521278381, -0.008229746483266354, 0.135412335395813, -0.039359983056783676, -0.046512000262737274, 
0.10355773568153381, -0.020232895389199257, 0.09563075751066208, -0.0837685614824295, 0.024017157033085823, 0.02780733071267605, -0.032396841794252396, 0.05127885192632675, 0.01847822405397892, -0.00991805549710989, 0.030739635229110718, 0.04627326875925064, -0.05184224247932434, 0.006836032960563898, 0.02513512596487999, 0.12177972495555878, 0.08613158762454987, 0.06133527308702469, -0.06369492411613464, -0.011880620382726192, 0.20103061199188232, -0.04647120088338852, -0.07229316234588623, -0.16815270483493805, 0.20264358818531036, 0.08802427351474762, 0.01902305893599987, 0.03200283274054527, -0.0970352441072464, 0.01920575648546219, 0.17310985922813416, 0.12064052373170853, -0.002356037963181734, -0.030568096786737442, 0.0364610031247139, -0.0065409294329583645, 0.05027734860777855, 0.08274290710687637, 0.06244037672877312, 0.18987523019313812, -0.11945383995771408, 0.053611885756254196, -0.0757984146475792, -0.05236446484923363, 0.029171627014875412, 0.16110968589782715, 0.025610923767089844, -0.004546556621789932, -0.05849070847034454, 0.11355943977832794, -0.010460306890308857, -0.21922162175178528, 0.12137492001056671, -0.07284164428710938, -0.12403629720211029, -0.0218944288790226, -0.018558964133262634, -0.004367817658931017, 0.054433178156614304, -0.018320389091968536, -0.030172472819685936, 0.09892206639051437, 0.03926074877381325, -0.05825318396091461, -0.14675994217395782, 0.0764627456665039, -0.003358605084940791, 0.17392997443675995, -0.0014650803059339523, 0.09791982918977737, 0.08164699375629425, 0.004521425813436508, -0.1099548190832138, 0.05230190232396126, 0.038304027169942856, 0.02635381929576397, 0.03914188966155052, 0.14182238280773163, -0.019347917288541794, 0.11767110973596573, 0.0011511743068695068, -0.11757013201713562, 0.05383296310901642, -0.15377430617809296, -0.05755849927663803, -0.1835939884185791, 0.04909839108586311, -0.04881677404046059, 0.14564253389835358, 0.21813222765922546, -0.04169526323676109, 0.02593075856566429, -0.08594494313001633, 0.042367931455373764, -0.01260639913380146, 0.14129669964313507, 0.014555981382727623, -0.17693409323692322, 0.04318816959857941, -0.09434794634580612, 0.044029977172613144, -0.23266154527664185, -0.03071449138224125, 0.020778322592377663, -0.06857316941022873, -0.01199012715369463, 0.11345970630645752, 0.05357784032821655, 0.07646559178829193, -0.05519937723875046, -0.03254414722323418, -0.019745584577322006, 0.14735625684261322, -0.09962520748376846, -0.09778055548667908 ]
null
null
transformers
# legal_t5_small_summ_it model Model for Summarization of legal text written in Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis. ## Model description legal_t5_small_summ_it is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for summarization of legal texts written in Italian. ### How to use Here is how to use this model to summarize legal text written in Italian in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_it"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_summ_it", do_lower_case=False, skip_special_tokens=True), device=0 ) it_text = "LA COMMISSIONE DELLE COMUNITÀ EUROPEE, visto il trattato che istituisce la Comunità europea, visto il regolamento (CEE) n. 2082/92 del Consiglio, del 14 luglio 1992, relativo alle attestazioni di specificità dei prodotti agricoli ed alimentari(1), in particolare l'articolo 9, paragrafo 1, considerando quanto segue: (1) A norma dell'articolo 7 del regolamento (CEE) n. 2082/92, la Finlandia ha trasmesso alla Commissione una domanda di registrazione della denominazione %quot%Kalakukko%quot% quale attestazione di specificità. (2) La dicitura %quot%specialità tradizionale garantita%quot% può applicarsi soltanto a denominazioni figuranti nel summenzionato albo. (3) Nessuna dichiarazione di opposizione, ai sensi dell'articolo 8 del summenzionato regolamento, è stata trasmessa alla Commissione a seguito della pubblicazione nella Gazzetta ufficiale delle Comunità europee(2) della denominazione figurante nell'allegato del presente regolamento. (4) Di conseguenza, la denominazione di cui all'allegato può essere iscritta nell'albo delle attestazioni di specificità e beneficiare pertanto della protezione a livello comunitario quale specialità tradizionale garantita nella Comunità in virtù dell'articolo 13, paragrafo 2, del regolamento (CEE) n. 2082/92. (5) L'allegato del presente regolamento completa l'allegato del regolamento (CE) n. 2301/97 della Commissione(3), modificato da ultimo dal regolamento (CE) n. 688/2002(4), HA ADOTTATO IL PRESENTE REGOLAMENTO: Articolo 1 La denominazione di cui all'allegato del presente regolamento è aggiunta all'allegato del regolamento (CE) n. 2301/97 e iscritta nell'albo delle attestazioni di specificità, conformemente all'articolo 9, paragrafo 1, del regolamento (CEE) n. 2082/92. Tale denominazione è protetta ai sensi dell'articolo 13, paragrafo 2, del summenzionato regolamento. Articolo 2 Il presente regolamento entra in vigore il ventesimo giorno successivo alla pubblicazione nella Gazzetta ufficiale delle Comunità europee. Il presente regolamento è obbligatorio in tutti i suoi elementi e direttamente applicabile in ciascuno degli Stati membri. Fatto a Bruxelles, il 15 luglio 2002. Per la Commissione Franz Fischler Membro della Commissione (1) GU L 208 del 24.7.1992, pag. 9. (2) GU C 235 del 21.8.2001, pag. 12. (3) GU L 319 del 21.11.1997, pag. 8. (4) GU L 106 del 23.4.2002, pag. 7. 
ALLEGATO Prodotti della panetteria, della pasticceria, della confetteria o della biscotteria - Kalakukko " pipeline([it_text], max_length=512) ``` ## Training data The legal_t5_small_summ_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 22 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : | Model | Rouge1 | Rouge2 | Rouge Lsum | |:-----:|:-----:|:-----:|:-----:| | legal_t5_small_summ_it | 75.07 | 65.53 | 73.85 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
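As a hedged illustration of how ROUGE scores like those in the table above can be computed, the sketch below uses the rouge-score package; the reference and candidate strings are invented for the example and are not taken from the JRC-ACQUIS evaluation set.

```python
# Hedged sketch: score one candidate summary against a reference with ROUGE.
# The two example strings below are invented for illustration only.
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(
    ["rouge1", "rouge2", "rougeLsum"],  # the metrics reported in the card
    use_stemmer=False,                  # the built-in stemmer targets English
)
reference = "La denominazione Kalakukko è iscritta nell'albo delle attestazioni di specificità."
candidate = "Registrazione della denominazione Kalakukko quale attestazione di specificità."
scores = scorer.score(reference, candidate)
for name, result in scores.items():
    print(name, round(result.fmeasure, 4))
```

In practice such scores would be averaged over every reference/summary pair in the test split to reproduce a table like the one above.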
{"language": "Italian", "tags": ["summarization Italian model"], "datasets": ["jrc-acquis"], "widget": [{"text": "LA COMMISSIONE DELLE COMUNIT\u00c0 EUROPEE, visto il trattato che istituisce la Comunit\u00e0 europea, visto il regolamento (CEE) n. 2082/92 del Consiglio, del 14 luglio 1992, relativo alle attestazioni di specificit\u00e0 dei prodotti agricoli ed alimentari(1), in particolare l'articolo 9, paragrafo 1, considerando quanto segue: (1) A norma dell'articolo 7 del regolamento (CEE) n. 2082/92, la Finlandia ha trasmesso alla Commissione una domanda di registrazione della denominazione %quot%Kalakukko%quot% quale attestazione di specificit\u00e0. (2) La dicitura %quot%specialit\u00e0 tradizionale garantita%quot% pu\u00f2 applicarsi soltanto a denominazioni figuranti nel summenzionato albo. (3) Nessuna dichiarazione di opposizione, ai sensi dell'articolo 8 del summenzionato regolamento, \u00e8 stata trasmessa alla Commissione a seguito della pubblicazione nella Gazzetta ufficiale delle Comunit\u00e0 europee(2) della denominazione figurante nell'allegato del presente regolamento. (4) Di conseguenza, la denominazione di cui all'allegato pu\u00f2 essere iscritta nell'albo delle attestazioni di specificit\u00e0 e beneficiare pertanto della protezione a livello comunitario quale specialit\u00e0 tradizionale garantita nella Comunit\u00e0 in virt\u00f9 dell'articolo 13, paragrafo 2, del regolamento (CEE) n. 2082/92. (5) L'allegato del presente regolamento completa l'allegato del regolamento (CE) n. 2301/97 della Commissione(3), modificato da ultimo dal regolamento (CE) n. 688/2002(4), HA ADOTTATO IL PRESENTE REGOLAMENTO: Articolo 1 La denominazione di cui all'allegato del presente regolamento \u00e8 aggiunta all'allegato del regolamento (CE) n. 2301/97 e iscritta nell'albo delle attestazioni di specificit\u00e0, conformemente all'articolo 9, paragrafo 1, del regolamento (CEE) n. 2082/92. Tale denominazione \u00e8 protetta ai sensi dell'articolo 13, paragrafo 2, del summenzionato regolamento. Articolo 2 Il presente regolamento entra in vigore il ventesimo giorno successivo alla pubblicazione nella Gazzetta ufficiale delle Comunit\u00e0 europee. Il presente regolamento \u00e8 obbligatorio in tutti i suoi elementi e direttamente applicabile in ciascuno degli Stati membri. Fatto a Bruxelles, il 15 luglio 2002. Per la Commissione Franz Fischler Membro della Commissione (1) GU L 208 del 24.7.1992, pag. 9. (2) GU C 235 del 21.8.2001, pag. 12. (3) GU L 319 del 21.11.1997, pag. 8. (4) GU L 106 del 23.4.2002, pag. 7. ALLEGATO Prodotti della panetteria, della pasticceria, della confetteria o della biscotteria - Kalakukko "}]}
text2text-generation
SEBIS/legal_t5_small_summ_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization Italian model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_it model ================================ Model for Summarization of legal text written in Italian. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis. Model description ----------------- legal\_t5\_small\_summ\_it is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for summarization of legal texts written in Italian. ### How to use Here is how to use this model to summarize legal text written in Italian in PyTorch: Training data ------------- The legal\_t5\_small\_summ\_it model was trained on the JRC-ACQUIS dataset, consisting of 22 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is evaluated on the summarization test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_it model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_it model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 66, 153, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Italian model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_it model was trained on JRC-ACQUIS dataset consisting of 22 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.11596604436635971, 0.12502790987491608, -0.002073264215141535, 0.07462747395038605, 0.10953114181756973, 0.010294539853930473, 0.089795783162117, 0.12335541099309921, -0.05212279409170151, 0.07763063162565231, 0.03760958090424538, 0.02129063382744789, 0.09496624022722244, 0.11445613205432892, 0.037107277661561966, -0.2283792495727539, 0.021669968962669373, -0.023493612185120583, -0.01782968081533909, 0.1532450169324875, 0.12401623278856277, -0.0914432480931282, 0.028157126158475876, -0.029810503125190735, -0.15588420629501343, 0.00893369223922491, -0.06376941502094269, -0.049718547612428665, 0.06653736531734467, 0.042649731040000916, 0.1026168093085289, 0.01786232925951481, 0.08537626266479492, -0.16900868713855743, 0.0015075831906870008, 0.1011519730091095, 0.03827179968357086, 0.03687559813261032, 0.0936313346028328, -0.018209313973784447, 0.17018595337867737, 0.0017350137932226062, 0.05137083679437637, 0.03528769686818123, -0.11559094488620758, -0.10345296561717987, -0.05722172558307648, 0.07830486446619034, 0.13604949414730072, 0.13520300388336182, -0.057506367564201355, 0.08911266177892685, -0.10756378620862961, 0.05394626781344414, 0.04972013086080551, -0.24304687976837158, -0.07813381403684616, 0.009272728115320206, 0.019811255857348442, 0.07590802758932114, -0.03980260714888573, -0.03643270209431648, 0.03587278351187706, 0.011769898235797882, -0.03526797518134117, 0.004668613895773888, 0.03470223769545555, -0.016752544790506363, -0.16649997234344482, -0.09974320977926254, 0.18347346782684326, 0.012491734698414803, -0.09103644639253616, -0.06513653695583344, -0.013909975066781044, -0.14783984422683716, 0.03341275453567505, -0.058654751628637314, 0.02600041963160038, 0.0018952074460685253, 0.03905530273914337, 0.025078317150473595, -0.12293374538421631, -0.12023235112428665, 0.04931919276714325, 0.13871878385543823, 0.09626150131225586, -0.015517357736825943, 0.016412971541285515, 0.1534113585948944, -0.006238480564206839, -0.07177336513996124, -0.024409011006355286, 0.013659651391208172, -0.13346850872039795, -0.049056682735681534, -0.04133422300219536, -0.14097870886325836, -0.04654860496520996, 0.11038591712713242, 0.014043559320271015, 0.042114365845918655, 0.0539843924343586, 0.05723161622881889, 0.008593276143074036, 0.12404002249240875, -0.08226950466632843, -0.02894343063235283, -0.05829545110464096, 0.043543752282857895, -0.0750337466597557, 0.011780314147472382, -0.01899082213640213, -0.003640451468527317, 0.044602423906326294, 0.059649668633937836, -0.06574222445487976, 0.001316204434260726, -0.06236748397350311, -0.046256039291620255, -0.01103446539491415, -0.12999221682548523, -0.028931112959980965, -0.011000833474099636, -0.1025705635547638, -0.0448773019015789, 0.07404736429452896, -0.020496491342782974, -0.1274617612361908, 0.08520583063364029, -0.03650641813874245, -0.014597847126424313, -0.12674300372600555, -0.07954141497612, -0.007897344417870045, -0.08608106523752213, -0.052155110985040665, -0.07894909381866455, -0.14143989980220795, -0.10412769019603729, 0.08723678439855576, -0.06618019193410873, -0.024175139144062996, -0.054123833775520325, 0.00776580860838294, -0.013548457063734531, -0.01841050758957863, 0.08805414289236069, -0.013220541179180145, 0.09013597667217255, 0.008953283540904522, 0.048015765845775604, 0.11112989485263824, 0.06624902784824371, -0.09884385764598846, 0.028666844591498375, -0.07714307308197021, 0.125911146402359, -0.012169594876468182, 0.012742835097014904, -0.15951663255691528, -0.06289711594581604, -0.08019609749317169, 
0.04227491095662117, 0.11207903921604156, 0.13190732896327972, -0.16136427223682404, 0.000494313717354089, 0.17522823810577393, -0.12510564923286438, -0.056622449308633804, 0.12077642232179642, -0.02829766646027565, 0.1774696558713913, 0.06673061847686768, 0.16629017889499664, 0.06883049011230469, -0.06252814829349518, -0.015465854667127132, -0.05165073648095131, -0.003070270176976919, 0.009098505601286888, 0.09368273615837097, -0.027615761384367943, -0.07175654917955399, -0.02462598867714405, -0.10007233172655106, 0.007416699547320604, -0.07951299846172333, -0.07456295937299728, 0.008151340298354626, -0.07197809964418411, -0.050634317100048065, 0.06055016443133354, 0.05540475621819496, -0.03147019073367119, -0.13260667026042938, -0.04967612028121948, 0.10008075088262558, -0.07249864190816879, 0.024330539628863335, -0.0894235298037529, -0.011398623697459698, -0.08375197649002075, -0.012531186453998089, -0.18256308138370514, 0.07355484366416931, 0.04696046561002731, -0.04267193004488945, 0.050248801708221436, 0.015294002369046211, 0.028412487357854843, 0.05977476015686989, 0.0004095065814908594, -0.04662694036960602, -0.04846057668328285, -0.027775486931204796, -0.12018482387065887, -0.13869553804397583, -0.05881766229867935, -0.016732770949602127, 0.13433696329593658, -0.16691365838050842, 0.046473171561956406, -0.03637952730059624, 0.05341511592268944, -0.026872845366597176, -0.06449691206216812, -0.0054531460627913475, 0.024777229875326157, -0.001250529196113348, -0.06518526375293732, 0.04872540384531021, 0.04128018021583557, 0.002927101217210293, 0.023845216259360313, -0.09756555408239365, -0.11968950927257538, 0.07058220356702805, 0.039584800601005554, -0.17209213972091675, -0.005973442457616329, -0.051234662532806396, -0.061292339116334915, -0.048288144171237946, 0.0031852032989263535, 0.2100369781255722, 0.002080753445625305, 0.12284951657056808, -0.09487088024616241, -0.039274558424949646, 0.020861024037003517, -0.020410142838954926, 0.008471719920635223, 0.11130759865045547, 0.10132179409265518, -0.12381578981876373, 0.047052692621946335, 0.03360697254538536, -0.0450487919151783, 0.14037755131721497, 0.02404414303600788, -0.11764021217823029, 0.004544543102383614, 0.09445386379957199, 0.002799778478220105, 0.0771200880408287, -0.1927802562713623, 0.013030566275119781, 0.000029939377782284282, 0.03761593997478485, 0.05572177842259407, -0.16351883113384247, 0.02013586461544037, 0.06031370535492897, -0.021800193935632706, 0.001706642098724842, -0.03877827525138855, -0.0857124775648117, 0.09390445798635483, 0.04188350960612297, -0.03284550458192825, 0.0018430155469104648, -0.037242405116558075, -0.1339358687400818, 0.20351560413837433, -0.039296891540288925, -0.15008771419525146, -0.09972038120031357, 0.09232132136821747, 0.055258169770240784, 0.005752993747591972, 0.03231719136238098, -0.08470045030117035, -0.04209849238395691, -0.08721774071455002, 0.07192093133926392, -0.05447493866086006, -0.007597833406180143, -0.070345439016819, 0.009266616776585579, 0.011571848765015602, -0.10454566031694412, 0.029591137543320656, -0.02828516997396946, -0.10602093487977982, -0.00995289534330368, -0.07045567780733109, 0.09639919549226761, 0.17092008888721466, -0.03211890906095505, 0.057515475898981094, 0.022417008876800537, 0.15723735094070435, -0.12959976494312286, 0.018388301134109497, 0.09459101408720016, 0.009095268324017525, -0.0012241319054737687, 0.09983979910612106, -0.0026377057656645775, -0.08920768648386002, 0.055770017206668854, 0.05099558085203171, -0.04160076379776001, 
-0.26219868659973145, -0.023524902760982513, -0.022665074095129967, -0.03466542065143585, 0.11077334731817245, 0.056203678250312805, 0.011488888412714005, 0.06635905802249908, -0.02007952332496643, -0.034057144075632095, 0.029962725937366486, 0.07101140916347504, -0.07536297291517258, 0.007743263151496649, 0.06420303881168365, -0.05129680037498474, -0.020365804433822632, 0.04964723810553551, 0.021768471226096153, 0.2606552839279175, -0.0297103151679039, 0.11515934020280838, 0.10303567349910736, 0.10322771966457367, 0.012021496891975403, 0.08312119543552399, -0.01904424838721752, 0.004017466679215431, 0.011574994772672653, -0.03867428004741669, -0.07096579670906067, 0.06005540117621422, -0.030528942123055458, 0.01885341666638851, -0.11712505668401718, -0.01133587583899498, 0.02494952268898487, 0.3480283319950104, 0.04432133957743645, -0.25822943449020386, -0.07022711634635925, -0.015656866133213043, -0.05901867896318436, -0.0955151915550232, 0.038212038576602936, 0.09175918251276016, -0.11340957880020142, -0.01956235058605671, -0.03873837739229202, 0.09541667997837067, -0.10027144104242325, -0.0511590875685215, 0.10534609854221344, 0.03124259039759636, -0.015099972486495972, 0.108287014067173, -0.25321897864341736, 0.18712414801120758, -0.011641023680567741, 0.13617826998233795, -0.03425217792391777, 0.024212168529629707, -0.05774884670972824, 0.04847368970513344, 0.18119896948337555, 0.004118805751204491, 0.06369498372077942, -0.10627031326293945, -0.09186544269323349, 0.004271750338375568, 0.08330516517162323, -0.08713900297880173, 0.08971461653709412, 0.006097020581364632, 0.017952607944607735, 0.001187513000331819, -0.038843996822834015, -0.12353892624378204, -0.1140613928437233, 0.018853073939681053, -0.061159927397966385, 0.05966958776116371, -0.05090833827853203, -0.058051176369190216, 0.024547480046749115, 0.20071518421173096, -0.14012838900089264, -0.048015058040618896, -0.0973065048456192, 0.05899599939584732, 0.06760447472333908, -0.026301072910428047, -0.0153666315600276, 0.008558878675103188, 0.034848593175411224, -0.0045227305963635445, 0.016335338354110718, 0.10035062581300735, -0.07013160735368729, -0.13640891015529633, -0.06505087018013, 0.10403529554605484, 0.13528689742088318, 0.04725265875458717, -0.01168469712138176, 0.0037879275623708963, -0.011658038944005966, -0.0797073021531105, 0.015973880887031555, 0.01490304246544838, 0.039237335324287415, 0.0308768842369318, -0.0647672489285469, 0.011661277152597904, -0.08755536377429962, -0.04973466321825981, 0.09194869548082352, 0.13580937683582306, -0.054811716079711914, 0.033389098942279816, 0.13223442435264587, -0.12317081540822983, -0.1652974784374237, 0.0698140487074852, 0.10402357578277588, 0.06602196395397186, -0.07765272259712219, -0.1906641125679016, 0.05540166422724724, 0.10471459478139877, 0.002241463167592883, -0.03783760964870453, -0.37976911664009094, -0.11978252977132797, 0.060106683522462845, 0.10361228138208389, -0.0350281186401844, -0.09628864377737045, -0.006857838947325945, 0.025736235082149506, -0.01025576051324606, 0.08141899853944778, -0.009756999090313911, 0.08405719697475433, 0.021632082760334015, -0.097341388463974, 0.042210035026073456, -0.0604434572160244, 0.11187651753425598, 0.09471090137958527, 0.05697496607899666, -0.039443593472242355, 0.03125351667404175, 0.022862516343593597, -0.013299733400344849, 0.1324613243341446, 0.04936370253562927, 0.017888261005282402, -0.18994887173175812, -0.07527118176221848, -0.07885053753852844, 0.00829668715596199, -0.07331323623657227, -0.050862688571214676, 
-0.01275939866900444, 0.10611309111118317, 0.053349610418081284, 0.010059354826807976, -0.0058711813762784, -0.09493113309144974, 0.02785961888730526, 0.11655990034341812, 0.11106477677822113, 0.06590438634157181, -0.07734321802854538, 0.03428281098604202, 0.04453814774751663, 0.08140058815479279, -0.1398666650056839, -0.009556521661579609, 0.12960530817508698, -0.002897449769079685, 0.14458267390727997, 0.019781438633799553, -0.14025549590587616, -0.013590523973107338, 0.066701240837574, -0.10352837294340134, -0.09979469329118729, -0.023349303752183914, 0.009042596444487572, -0.06256710737943649, -0.008266681805253029, 0.04962439090013504, -0.12135625630617142, -0.009932198561728, -0.01731027290225029, 0.03377955034375191, -0.09773517400026321, 0.21816344559192657, 0.039823468774557114, 0.05757378786802292, -0.05445864051580429, 0.13735431432724, 0.105103500187397, -0.14416715502738953, 0.004677172284573317, 0.15726429224014282, -0.09025249630212784, -0.05756315961480141, -0.002070661401376128, 0.10270052403211594, -0.022887760773301125, -0.08798318356275558, -0.0852445736527443, -0.045377638190984726, 0.05644089728593826, -0.009048067964613438, 0.054582029581069946, 0.0339343287050724, -0.039161842316389084, -0.007557831238955259, -0.16304783523082733, 0.07106344401836395, 0.10540447384119034, 0.011422302573919296, -0.04000493511557579, 0.19025619328022003, 0.049603432416915894, 0.024744359776377678, -0.0009934415575116873, -0.05611579492688179, -0.038221415132284164, 0.06671945750713348, 0.00926294643431902, -0.02695802040398121, -0.05373281612992287, -0.011608147993683815, -0.017242221161723137, -0.001999040599912405, -0.004059286322444677, 0.022371774539351463, -0.07625067979097366, -0.02883518487215042, -0.03144717961549759, 0.03025268204510212, -0.045265454798936844, 0.011647284962236881, -0.009619898162782192, -0.07404466718435287, 0.08456134051084518, 0.005532098468393087, -0.005964198615401983, -0.028034113347530365, -0.04629683867096901, 0.1027756854891777, -0.03918526694178581, -0.013655101880431175, -0.024537639692425728, -0.0985880047082901, 0.05675568804144859, -0.002640751888975501, -0.05493824928998947, -0.00992511585354805, 0.05108574032783508, -0.13172385096549988, 0.07450795918703079, -0.019252143800258636, -0.0048263221979141235, -0.08127548545598984, 0.1148916706442833, 0.030546819791197777, 0.09767863899469376, 0.0736418142914772, -0.051845114678144455, 0.06729798018932343, -0.13870345056056976, -0.03947307914495468, 0.04149220511317253, 0.03652859479188919, -0.0564023032784462, -0.05715581029653549, 0.0568821094930172, -0.04008495435118675, 0.04448903352022171, 0.06944514811038971, 0.04932780936360359, 0.037085507065057755, -0.11495425552129745, 0.0061754039488732815, 0.03873666748404503, 0.05865830183029175, -0.021754641085863113, 0.012170207686722279, 0.015195943415164948, 0.033184222877025604, -0.03359740972518921, 0.09347182512283325, 0.1243046298623085, 0.21285641193389893, 0.046863675117492676, 0.0799877718091011, -0.026772959157824516, -0.09866289794445038, -0.09528806805610657, 0.15730908513069153, -0.007106523960828781, 0.04232107847929001, -0.04090781509876251, 0.09006435424089432, 0.08570785820484161, -0.16075575351715088, 0.0470428504049778, -0.007145056501030922, -0.09982284158468246, -0.08140105754137039, -0.1295406073331833, -0.02388245239853859, -0.06769423186779022, -0.004936808254569769, -0.0931287333369255, 0.03946656361222267, 0.061019167304039, 0.05791757255792618, -0.0386052206158638, 0.14749504625797272, -0.00820154044777155, 
-0.04200693964958191, 0.10848166048526764, -0.022604161873459816, 0.10059590637683868, -0.08692172169685364, -0.011572644114494324, 0.017473630607128143, -0.027478529140353203, 0.06022069603204727, 0.008875997737050056, -0.003929926082491875, 0.020152725279331207, 0.043979402631521225, -0.0547352060675621, 0.0051764692179858685, 0.01679043471813202, 0.1182142049074173, 0.07880085706710815, 0.040204960852861404, -0.04261822625994682, -0.019104434177279472, 0.19861139357089996, -0.03909862041473389, -0.04985474795103073, -0.16090676188468933, 0.19415557384490967, 0.08539917320013046, 0.014659876935184002, 0.03342827036976814, -0.1096678301692009, 0.025513365864753723, 0.1821414828300476, 0.10174290835857391, -0.020718613639473915, -0.037425342947244644, 0.035074084997177124, -0.003910217899829149, 0.028691839426755905, 0.10560470819473267, 0.044781994074583054, 0.16757124662399292, -0.11183550208806992, 0.08050548285245895, -0.06926281005144119, -0.048291485756635666, 0.022384455427527428, 0.159753680229187, 0.02565068192780018, 0.004429720342159271, -0.07833215594291687, 0.12515515089035034, 0.008926769718527794, -0.2025270015001297, 0.11154836416244507, -0.08479771763086319, -0.14298397302627563, -0.022989440709352493, -0.006059109698981047, -0.0202520489692688, 0.07797089219093323, -0.032386478036642075, -0.019394168630242348, 0.08648721873760223, 0.0444079153239727, -0.05685065686702728, -0.17110992968082428, 0.07935190200805664, -0.023290837183594704, 0.17136536538600922, -0.010019924491643906, 0.08692001551389694, 0.0831199660897255, 0.02161194384098053, -0.12209426611661911, 0.0690094605088234, 0.028773456811904907, 0.010159799829125404, 0.041748885065317154, 0.1376553624868393, -0.03981965407729149, 0.1278032809495926, 0.02487044781446457, -0.13815529644489288, 0.054692480713129044, -0.16176901757717133, -0.04795999452471733, -0.18483798205852509, 0.03502734377980232, -0.049701087176799774, 0.15626060962677002, 0.2172912061214447, -0.04808598384261131, 0.0037088687531650066, -0.08055834472179413, 0.056116800755262375, 0.005408409051597118, 0.16604691743850708, 0.015281682834029198, -0.21258312463760376, 0.015888705849647522, -0.09302153438329697, 0.04552609473466873, -0.199088916182518, -0.03511388227343559, 0.016864590346813202, -0.08254235237836838, -0.026307882741093636, 0.10795404016971588, 0.0650864839553833, 0.07960828393697739, -0.054225243628025055, -0.061050038784742355, -0.007755602244287729, 0.14760054647922516, -0.1118665337562561, -0.10181085765361786 ]
null
null
transformers
# legal_t5_small_summ_sv model Model for Summarization of legal text written in Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on a parallel corpus from jrc-acquis. ## Model description legal_t5_small_summ_sv is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for summarization of legal texts written in Swedish. ### How to use Here is how to use this model to summarize legal text written in Swedish in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_summ_sv"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_summ_sv", do_lower_case=False, skip_special_tokens=True), device=0 ) sv_text = "EUROPEISKA GEMENSKAPERNAS RÅD HAR ANTAGIT DENNA FÖRORDNING med beaktande av Fördraget om upprättandet av Europeiska ekonomiska gemenskapen, särskilt artiklarna 43 och 100a i detta, med beaktande av kommissionens förslag(1), i samarbete med Europaparlamentet(2), med beaktande av Ekonomiska och sociala kommitténs yttrande(3), och med beaktande av följande: Det bör införas förbud mot användning av blybaserade kapsyler eller blybaserad folie i förslutningar på förpackningar som används då aromatiserade viner, aromatiserade vinbaserade drycker och aromatiserade drinkar baserade på vinprodukter släpps ut på marknaden i syfte att undvika risken för kontaminering, särskilt vid oavsiktlig kontakt med sådana produkter, samt risken för miljöförorening på grund av avfall som innehåller bly från kapsyler och folie av detta slag. Tillverkarna och användarna av kapsylerna och folien i fråga bör dock ges tid att anpassa sig genom att förbudet inte tillämpas förrän från och med den 1 januari 1993. Det är även nödvändigt att tillåta att produkter som före detta datum tappats på buteljer med blybaserade kapsyler eller blybaserad folie får säljas till dess att lagren är uttömda. Vissa definitioner av aromatiserade vinbaserade drycker bör anpassas så att större hänsyn tas till traditionella framställningsmetoder. Förordning (EEG) nr 1601/91(4) bör därför ändras. HÄRIGENOM FÖRESKRIVS FÖLJANDE. Artikel 1 Förordning (EEG) nr 1601/91 ändras på följande sätt: 1. Artikel 2.3 a första stycket skall ersättas med följande: %quot%a) Sangria: en dryck som framställs av vin - som smaksatts genom tillsats av naturliga extrakt eller essenser av citrusfrukt, - med eller utan saft av sådan frukt, - eventuellt: - med tillsats av kryddor, - sötat, - med tillsats av CO2, och med en slutlig alkoholstyrka på under 12 volymprocent.%quot% 2. Artikel 2.3 e skall ersättas med följande: %quot%e) Kalte Ente: Smaksatt vinbaserad dryck som framställs genom att vin, pärlande vin eller pärlande vin med tillsatt CO2 blandas med mousserande vin eller mousserande vin med tillsatt CO2 och tillsätts naturlig citronsubstans eller extrakt av detta som måste ge en tydligt framträdande smak. Slutprodukten måste innehålla minst 25 volymprocent mousserande vin eller mousserande vin med tillsatt CO2.%quot% 3. 
Följande punkt skall införas i artikel 8: %quot%4.a Från och med den 1 januari 1993 får buteljerade produkter som omfattas av denna förordning inte saluhållas eller släppas ut på marknaden i förpackningar med förslutningar som täckts med blybaserade kapsyler eller blybaserad folie. Dock får produkter som före detta datum tappats på flaskor med detta slag av kapsyler eller folie avyttras till dess att lagren tömts.%quot% Artikel 2 Denna förordning träder i kraft den tredje dagen efter det att den har offentliggjorts i Europeiska gemenskapernas officiella tidning. Denna förordning är till alla delar bindande och direkt tillämplig i alla medlemsstater. Utfärdad i Bryssel den 9 november 1992. På rådets vägnar D. HURD Ordförande (1) EGT nr C 69, 18.3.1992, s. 11. (2) EGT nr C 241, 21.9.1992, s. 97 och beslut av den 28 oktober 1992. (3) EGT nr C 169, 6.7.1992, s. 1. (4) EGT nr L 149, 14.6.1991, s. 1. " pipeline([sv_text], max_length=512) ``` ## Training data The legal_t5_small_summ_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html) dataset, consisting of 19 thousand texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) is used to build the vocabulary (with byte pair encoding) for this model. ### Pretraining ## Evaluation results When the model is used on the summarization test dataset, it achieves the following results: Test results: | Model | Rouge1 | Rouge2 | Rouge Lsum | |:-----:|:-----:|:-----:|:-----:| | legal_t5_small_summ_sv | 78.84 | 69.97 | 77.59 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
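The "Preprocessing" section above says a unigram model over 88M parallel-corpus lines supplies the vocabulary. A minimal sketch of such a vocabulary-building step with the sentencepiece library; the corpus file name, output prefix, and vocabulary size are assumptions, not values stated by the card:

```python
import sentencepiece as spm

# Train a unigram subword model, as the Preprocessing section describes.
# File name, prefix, and vocab_size are hypothetical (the card does not state them).
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",    # assumed dump of the parallel-corpus lines
    model_prefix="legal_t5_small",  # assumed output prefix
    vocab_size=32000,               # assumed vocabulary size
    model_type="unigram",
)

# Load the resulting model and segment a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_small.model")
print(sp.encode("Denna förordning träder i kraft den tredje dagen.", out_type=str))
```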
{"language": "Swedish", "tags": ["summarization Swedish model"], "datasets": ["jrc-acquis"], "widget": [{"text": "EUROPEISKA GEMENSKAPERNAS R\u00c5D HAR ANTAGIT DENNA F\u00d6RORDNING med beaktande av F\u00f6rdraget om uppr\u00e4ttandet av Europeiska ekonomiska gemenskapen, s\u00e4rskilt artiklarna 43 och 100a i detta, med beaktande av kommissionens f\u00f6rslag(1), i samarbete med Europaparlamentet(2), med beaktande av Ekonomiska och sociala kommitt\u00e9ns yttrande(3), och med beaktande av f\u00f6ljande: Det b\u00f6r inf\u00f6ras f\u00f6rbud mot anv\u00e4ndning av blybaserade kapsyler eller blybaserad folie i f\u00f6rslutningar p\u00e5 f\u00f6rpackningar som anv\u00e4nds d\u00e5 aromatiserade viner, aromatiserade vinbaserade drycker och aromatiserade drinkar baserade p\u00e5 vinprodukter sl\u00e4pps ut p\u00e5 marknaden i syfte att undvika risken f\u00f6r kontaminering, s\u00e4rskilt vid oavsiktlig kontakt med s\u00e5dana produkter, samt risken f\u00f6r milj\u00f6f\u00f6rorening p\u00e5 grund av avfall som inneh\u00e5ller bly fr\u00e5n kapsyler och folie av detta slag. Tillverkarna och anv\u00e4ndarna av kapsylerna och folien i fr\u00e5ga b\u00f6r dock ges tid att anpassa sig genom att f\u00f6rbudet inte till\u00e4mpas f\u00f6rr\u00e4n fr\u00e5n och med den 1 januari 1993. Det \u00e4r \u00e4ven n\u00f6dv\u00e4ndigt att till\u00e5ta att produkter som f\u00f6re detta datum tappats p\u00e5 buteljer med blybaserade kapsyler eller blybaserad folie f\u00e5r s\u00e4ljas till dess att lagren \u00e4r utt\u00f6mda. Vissa definitioner av aromatiserade vinbaserade drycker b\u00f6r anpassas s\u00e5 att st\u00f6rre h\u00e4nsyn tas till traditionella framst\u00e4llningsmetoder. F\u00f6rordning (EEG) nr 1601/91(4) b\u00f6r d\u00e4rf\u00f6r \u00e4ndras. H\u00c4RIGENOM F\u00d6RESKRIVS F\u00d6LJANDE. Artikel 1 F\u00f6rordning (EEG) nr 1601/91 \u00e4ndras p\u00e5 f\u00f6ljande s\u00e4tt: 1. Artikel 2.3 a f\u00f6rsta stycket skall ers\u00e4ttas med f\u00f6ljande: %quot%a) Sangria: en dryck som framst\u00e4lls av vin - som smaksatts genom tillsats av naturliga extrakt eller essenser av citrusfrukt, - med eller utan saft av s\u00e5dan frukt, - eventuellt: - med tillsats av kryddor, - s\u00f6tat, - med tillsats av CO2, och med en slutlig alkoholstyrka p\u00e5 under 12 volymprocent.%quot% 2. Artikel 2.3 e skall ers\u00e4ttas med f\u00f6ljande: %quot%e) Kalte Ente: Smaksatt vinbaserad dryck som framst\u00e4lls genom att vin, p\u00e4rlande vin eller p\u00e4rlande vin med tillsatt CO2 blandas med mousserande vin eller mousserande vin med tillsatt CO2 och tills\u00e4tts naturlig citronsubstans eller extrakt av detta som m\u00e5ste ge en tydligt framtr\u00e4dande smak. Slutprodukten m\u00e5ste inneh\u00e5lla minst 25 volymprocent mousserande vin eller mousserande vin med tillsatt CO2.%quot% 3. F\u00f6ljande punkt skall inf\u00f6ras i artikel 8: %quot%4.a Fr\u00e5n och med den 1 januari 1993 f\u00e5r buteljerade produkter som omfattas av denna f\u00f6rordning inte saluh\u00e5llas eller sl\u00e4ppas ut p\u00e5 marknaden i f\u00f6rpackningar med f\u00f6rslutningar som t\u00e4ckts med blybaserade kapsyler eller blybaserad folie. Dock f\u00e5r produkter som f\u00f6re detta datum tappats p\u00e5 flaskor med detta slag av kapsyler eller folie avyttras till dess att lagren t\u00f6mts.%quot% Artikel 2 Denna f\u00f6rordning tr\u00e4der i kraft den tredje dagen efter det att den har offentliggjorts i Europeiska gemenskapernas officiella tidning. 
Denna f\u00f6rordning \u00e4r till alla delar bindande och direkt till\u00e4mplig i alla medlemsstater. Utf\u00e4rdad i Bryssel den 9 november 1992. P\u00e5 r\u00e5dets v\u00e4gnar D. HURD Ordf\u00f6rande (1) EGT nr C 69, 18.3.1992, s. 11. (2) EGT nr C 241, 21.9.1992, s. 97 och beslut av den 28 oktober 1992. (3) EGT nr C 169, 6.7.1992, s. 1. (4) EGT nr L 149, 14.6.1991, s. 1. "}]}
text2text-generation
SEBIS/legal_t5_small_summ_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "summarization Swedish model", "dataset:jrc-acquis", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #summarization Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_summ\_sv model ================================ Model for Summarization of legal text written in Swedish. It was first released in this repository. This model is trained on a parallel corpus from jrc-acquis. Model description ----------------- legal\_t5\_small\_summ\_sv is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for summarization of legal texts written in Swedish. ### How to use Here is how to use this model to summarize legal text written in Swedish in PyTorch: Training data ------------- The legal\_t5\_small\_summ\_sv model was trained on the JRC-ACQUIS dataset, consisting of 19 thousand texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) is used to build the vocabulary (with byte pair encoding) for this model. ### Pretraining Evaluation results ------------------ When the model is used on the summarization test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
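The test results for this summarization model are reported as Rouge1/Rouge2/Rouge Lsum (see the table in the full card above). A minimal sketch of computing those metrics with the rouge-score package; the reference and prediction strings are hypothetical placeholders, not items from the actual test set:

```python
from rouge_score import rouge_scorer

# Score a generated summary against a reference with the metrics the card reports.
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeLsum"], use_stemmer=True)
reference = "Förordningen förbjuder blybaserade kapsyler från den 1 januari 1993."
prediction = "Blybaserade kapsyler förbjuds från och med den 1 januari 1993."

scores = scorer.score(reference, prediction)
for name, score in scores.items():
    print(name, round(score.fmeasure, 4))
```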
[ "### How to use\n\n\nHere is how to use this model to summarize legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_sv model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to summarize legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_sv model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 66, 153, 50, 30, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #summarization Swedish model #dataset-jrc-acquis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to summarize legal text written in Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_summ\\_sv model was trained on JRC-ACQUIS dataset consisting of 19 Thousand texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 64). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for classification test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09296100586652756, 0.0764964297413826, -0.00192154454998672, 0.07471626996994019, 0.0959850549697876, -0.024514202028512955, 0.06674551963806152, 0.09688501060009003, -0.07304120808839798, 0.06428538262844086, 0.06753676384687424, 0.02451162040233612, 0.0980503261089325, 0.09913748502731323, 0.04985567182302475, -0.2533610463142395, 0.020137831568717957, -0.036562494933605194, -0.021065808832645416, 0.14353437721729279, 0.12814190983772278, -0.07796747982501984, 0.015170426107943058, -0.024540243670344353, -0.10264863818883896, -0.0038296848069876432, -0.06451539695262909, -0.042793985456228256, 0.0609239861369133, 0.036166854202747345, 0.09524544328451157, 0.0281114112585783, 0.08713868260383606, -0.17186295986175537, -0.002388021908700466, 0.05735348165035248, 0.04105706885457039, 0.029733669012784958, 0.07582619786262512, 0.013086758553981781, 0.1858542263507843, -0.023259362205863, 0.05846616253256798, 0.02333797886967659, -0.08800224214792252, -0.14113938808441162, -0.05927182361483574, 0.07132089138031006, 0.14592929184436798, 0.12676413357257843, -0.0699043869972229, 0.06753771007061005, -0.09897694736719131, 0.07358182221651077, 0.043529488146305084, -0.2502806484699249, -0.07372678071260452, 0.06999939680099487, 0.047427188605070114, 0.08192642778158188, -0.049591440707445145, -0.02081424370408058, 0.04389333352446556, 0.03760151192545891, 0.013634445145726204, -0.01787097379565239, -0.004216346424072981, -0.007165838032960892, -0.1743977963924408, -0.08042915910482407, 0.19054588675498962, 0.008964492939412594, -0.08248284459114075, -0.08129920810461044, 0.0027434485964477062, -0.1401149332523346, 0.044239118695259094, -0.061998166143894196, 0.038257896900177, -0.019600413739681244, 0.025041639804840088, -0.02145475707948208, -0.13916440308094025, -0.09741932898759842, 0.06201454624533653, 0.0732019916176796, 0.06079758703708649, -0.007832084782421589, 0.032238759100437164, 0.12680743634700775, -0.028049636632204056, -0.08162211626768112, -0.04509458690881729, -0.0074561964720487595, -0.14344094693660736, -0.05076467618346214, -0.04673803597688675, -0.20124241709709167, -0.04425753653049469, 0.10634921491146088, 0.003767287591472268, 0.04574443772435188, 0.06304923444986343, 0.052577510476112366, 0.015520583838224411, 0.14752784371376038, -0.06940654665231705, -0.03869309276342392, -0.06545613706111908, 0.018967557698488235, -0.05987583100795746, 0.0005429679877124727, -0.016084708273410797, 0.0014639878645539284, 0.07569581270217896, 0.037660956382751465, -0.09059348702430725, 0.009548332542181015, -0.038714684545993805, -0.024230336770415306, -0.03306186944246292, -0.12219084054231644, -0.04992877319455147, -0.028447477146983147, -0.08207836002111435, -0.08073680102825165, 0.10160672664642334, -0.0007391074323095381, -0.12063918262720108, 0.10532177984714508, -0.01155893411487341, -0.0005806618719361722, -0.09286532551050186, -0.09828589111566544, -0.002437253249809146, -0.09333749115467072, -0.04183809086680412, -0.07180513441562653, -0.13947951793670654, -0.12292971462011337, 0.10000400245189667, -0.04952263459563255, -0.04924309253692627, -0.07446887344121933, -0.02117004431784153, 0.0005299613112583756, -0.04451138898730278, 0.11304417997598648, -0.019103769212961197, 0.06729856878519058, -0.03202846273779869, 0.05453061684966087, 0.1191842183470726, 0.056026723235845566, -0.1070159375667572, 0.02373577281832695, -0.10938528180122375, 0.14113624393939972, -0.06308016180992126, 0.018158987164497375, -0.14838817715644836, -0.07350847125053406, -0.05144878849387169, 
0.056138888001441956, 0.09835393726825714, 0.10581858456134796, -0.1684214025735855, -0.0026457554195076227, 0.1663350760936737, -0.13165588676929474, -0.04471408948302269, 0.10019664466381073, -0.0202346108853817, 0.15981841087341309, 0.07911397516727448, 0.19873380661010742, 0.06752537190914154, -0.05555679276585579, -0.014080393128097057, -0.013254251331090927, -0.013020877726376057, 0.0009875918040052056, 0.10004401952028275, -0.048378024250268936, -0.0842733085155487, -0.00927627831697464, -0.08468237519264221, -0.006810717284679413, -0.051169708371162415, -0.071731336414814, 0.018853306770324707, -0.05475408583879471, -0.07164373993873596, 0.06004449352622032, 0.04600633308291435, -0.04562947154045105, -0.10461931675672531, 0.005989361088722944, 0.06495945155620575, -0.06382927298545837, 0.03759884089231491, -0.04288996011018753, -0.0555693693459034, -0.10804641991853714, -0.015169248916208744, -0.18033766746520996, 0.051342837512493134, 0.015348057262599468, -0.0368168018758297, 0.05543689802289009, 0.0620187446475029, 0.03842373192310333, 0.06226411089301109, 0.0003611265274230391, -0.037283044308423996, -0.039637643843889236, -0.02740679495036602, -0.12270698696374893, -0.13840129971504211, -0.04310091212391853, -0.029629338532686234, 0.11814232915639877, -0.1594144105911255, 0.024266410619020462, -0.04703982174396515, 0.03791959211230278, -0.013982009142637253, -0.06327984482049942, 0.04636717587709427, 0.03574319928884506, 0.018477879464626312, -0.062013719230890274, 0.055687665939331055, 0.03320403769612312, -0.0425204299390316, 0.06797458976507187, -0.06596724689006805, -0.15319159626960754, 0.062047071754932404, 0.04862266778945923, -0.17259730398654938, 0.00483891973271966, -0.04912623018026352, -0.05845275893807411, -0.0565776601433754, 0.015184761956334114, 0.21403768658638, 0.015438060276210308, 0.12250884622335434, -0.09479860961437225, -0.05394553393125534, 0.012182367034256458, -0.05504661798477173, 0.016457824036478996, 0.14453105628490448, 0.06878267973661423, -0.14240993559360504, 0.042796771973371506, -0.015008179470896721, -0.04031069949269295, 0.19824767112731934, 0.017903752624988556, -0.1029970645904541, 0.009153047576546669, 0.07423076778650284, -0.009784991852939129, 0.10214313119649887, -0.15323001146316528, 0.013958006165921688, 0.013337695971131325, 0.038077518343925476, 0.043547648936510086, -0.14976748824119568, 0.01091409008949995, 0.06062660738825798, -0.039205629378557205, 0.019488699734210968, -0.016308201476931572, -0.08447122573852539, 0.05964748561382294, 0.02885502204298973, -0.018760569393634796, -0.001948305987752974, -0.04241814836859703, -0.15228776633739471, 0.21754126250743866, -0.04971090331673622, -0.16296884417533875, -0.11986102908849716, 0.11370133608579636, 0.034414663910865784, 0.0030397269874811172, 0.06016998365521431, -0.09874624758958817, -0.06022521108388901, -0.11286695301532745, 0.09901062399148941, -0.01823764108121395, -0.017786942422389984, -0.08151960372924805, -0.0029274546541273594, 0.03364120051264763, -0.11244713515043259, 0.027710862457752228, -0.04508792236447334, -0.07333685457706451, 0.005051475018262863, -0.0360250286757946, 0.07735186070203781, 0.12794829905033112, -0.01529636513441801, 0.04449756443500519, 0.024198565632104874, 0.1849595010280609, -0.09991443902254105, 0.022753670811653137, 0.06003193184733391, -0.010913067497313023, 0.004381468053907156, 0.1193566545844078, -0.002030846895650029, -0.07176429778337479, 0.05413147807121277, 0.06294546276330948, -0.06210191175341606, -0.2705285847187042, 
-0.018748318776488304, -0.012358062900602818, -0.014571385458111763, 0.10788556933403015, 0.06284825503826141, -0.023392416536808014, 0.074661023914814, -0.006273436825722456, -0.04375404492020607, 0.03248579055070877, 0.0684097409248352, -0.08291366696357727, -0.0013665714068338275, 0.07288625836372375, -0.04823097586631775, 0.003081977367401123, 0.03952298313379288, -0.01351726520806551, 0.2322940081357956, -0.038748014718294144, 0.06733600050210953, 0.10870755463838577, 0.0830911248922348, 0.004935717675834894, 0.08513650298118591, -0.03647364675998688, 0.004241155926138163, 0.027760151773691177, -0.04186902195215225, -0.09288190305233002, 0.06140398234128952, -0.03484081104397774, 0.02753312885761261, -0.09600159525871277, 0.006554919295012951, 0.017336515709757805, 0.33136311173439026, 0.05902142450213432, -0.21415838599205017, -0.10540056228637695, 0.006727499887347221, -0.0759553387761116, -0.08933838456869125, 0.03572172299027443, 0.10323995351791382, -0.1332361102104187, 0.022737596184015274, -0.05790303274989128, 0.0997575968503952, -0.06945894658565521, -0.043283529579639435, 0.09430613368749619, 0.025752421468496323, -0.008139843121170998, 0.10257506370544434, -0.24664661288261414, 0.18226030468940735, -0.017457377165555954, 0.12363845854997635, -0.02625647932291031, 0.02802932821214199, -0.0653514489531517, 0.035671770572662354, 0.19764170050621033, 0.028926266357302666, 0.001470093266107142, -0.08438879996538162, -0.07755035907030106, 0.01390428002923727, 0.04626316577196121, -0.09767881780862808, 0.09323747456073761, 0.023681240156292915, 0.03178488835692406, -0.026336301118135452, -0.06753268092870712, -0.12000381201505661, -0.0919610932469368, -0.0015169642865657806, -0.0944908857345581, 0.07017173618078232, -0.05027059465646744, -0.05160248279571533, -0.02857711724936962, 0.19941599667072296, -0.15961338579654694, -0.10261913388967514, -0.09935320913791656, 0.05951467901468277, 0.0619175098836422, -0.027101431041955948, -0.02259744703769684, 0.022678524255752563, 0.033041972666978836, -0.030486054718494415, 0.020135870203375816, 0.07812633365392685, -0.062293052673339844, -0.14008216559886932, -0.02316475473344326, 0.12135659903287888, 0.13869962096214294, 0.0452464260160923, -0.014423164539039135, 0.030687905848026276, -0.011888545006513596, -0.09953072667121887, 0.020653607323765755, 0.036089278757572174, 0.048396412283182144, 0.0071384599432349205, -0.026777001097798347, 0.017635179683566093, -0.07134760916233063, -0.05564087629318237, 0.11573624610900879, 0.13991431891918182, -0.052171871066093445, 0.09671466797590256, 0.14476075768470764, -0.11580261588096619, -0.19356726109981537, 0.03410852327942848, 0.12576471269130707, 0.08010287582874298, -0.03896433860063553, -0.16984565556049347, 0.08289548754692078, 0.07969830930233002, 0.005476200021803379, -0.08316049724817276, -0.33016204833984375, -0.14251914620399475, 0.06502656638622284, 0.09324797987937927, -0.03949186950922012, -0.06756670773029327, -0.021985359489917755, 0.02457510121166706, -0.005951122380793095, 0.08712870627641678, -0.02614976465702057, 0.09934105724096298, 0.028113173320889473, -0.057708729058504105, 0.05070670321583748, -0.06615995615720749, 0.12359687685966492, 0.07627199590206146, 0.04416665807366371, -0.07197057455778122, 0.07613576948642731, 0.020355315878987312, -0.032293274998664856, 0.17315533757209778, 0.022654909640550613, 0.019632983952760696, -0.19917617738246918, -0.07301006466150284, -0.08673842996358871, 0.0270228274166584, -0.07199173420667648, -0.06814273446798325, 
-0.03277641162276268, 0.10821320861577988, 0.07688998430967331, -0.014902782626450062, 0.050393909215927124, -0.12156999111175537, 0.028612244874238968, 0.11603417992591858, 0.1329967975616455, 0.06642373651266098, -0.07882411777973175, 0.04402899742126465, 0.03541296720504761, 0.0863480418920517, -0.1613101214170456, -0.01569175347685814, 0.13378411531448364, -0.0056109558790922165, 0.13936324417591095, -0.013859818689525127, -0.13750424981117249, -0.00763206509873271, 0.04969487711787224, -0.12612363696098328, -0.09534285217523575, -0.023611633107066154, -0.05203881487250328, -0.05066516622900963, -0.03135116770863533, 0.07295821607112885, -0.14170491695404053, 0.0034013644326478243, -0.01162363588809967, 0.04664921388030052, -0.09714679419994354, 0.2083314061164856, 0.05065391957759857, 0.0792529433965683, -0.06373747438192368, 0.1359124332666397, 0.07801637053489685, -0.12212720513343811, 0.035169318318367004, 0.15148934721946716, -0.09955389052629471, -0.0540037602186203, 0.004438965581357479, 0.13924214243888855, -0.02831903286278248, -0.07846881449222565, -0.059952471405267715, -0.0674285739660263, 0.052369751036167145, -0.001281738979741931, 0.05997737869620323, 0.019947422668337822, -0.02233319915831089, -0.011587209068238735, -0.130888432264328, 0.08807390183210373, 0.08000127226114273, 0.0021098710130900145, -0.04709869623184204, 0.18389524519443512, 0.02596280351281166, 0.0513966828584671, -0.012395446188747883, -0.023743268102407455, -0.03307400643825531, 0.04951930418610573, -0.0060633085668087006, -0.015376009978353977, -0.06523904204368591, 0.003892238950356841, -0.014452653005719185, -0.0007197597296908498, 0.006377059500664473, 0.028727570548653603, -0.06510693579912186, -0.02364732325077057, -0.04956085979938507, 0.02491031214594841, -0.048594068735837936, -0.012303960509598255, -0.024087965488433838, -0.04699007049202919, 0.07145021855831146, -0.004114069044589996, -0.009587104432284832, 0.0054713706485927105, -0.026052352041006088, 0.11464333534240723, -0.01762876845896244, -0.025341859087347984, -0.015840472653508186, -0.08101014792919159, 0.007641930598765612, -0.018781602382659912, -0.05280323326587677, -0.003130526514723897, 0.04587431997060776, -0.13541163504123688, 0.04101903364062309, -0.01354018785059452, 0.0061110034584999084, -0.08633212745189667, 0.1168147549033165, 0.018054913729429245, 0.10937383770942688, 0.12386283278465271, -0.07736343890428543, 0.07430438697338104, -0.14495481550693512, -0.040195103734731674, 0.04521651938557625, 0.026718076318502426, -0.061209119856357574, -0.05693647637963295, 0.04561805725097656, -0.07106229662895203, 0.05749039351940155, 0.07328823208808899, 0.04714455455541611, 0.04328596591949463, -0.11653193086385727, 0.017642252147197723, 0.040706101804971695, 0.061276134103536606, -0.03386890888214111, 0.014858896844089031, 0.006997866556048393, 0.03236996382474899, -0.027532480657100677, 0.05975726246833801, 0.14805561304092407, 0.1757371425628662, 0.0491977222263813, 0.08851612359285355, -0.03249480947852135, -0.06423883885145187, -0.09759131073951721, 0.1686721295118332, 0.021912558004260063, 0.04923490434885025, -0.015442071482539177, 0.06649642437696457, 0.10054202377796173, -0.17355531454086304, 0.06518015265464783, 0.0013576662167906761, -0.10192897170782089, -0.10060470551252365, -0.15786150097846985, -0.049922555685043335, -0.041494131088256836, 0.0036833572667092085, -0.105879046022892, 0.024801336228847504, 0.06720447540283203, 0.06994787603616714, -0.011972312815487385, 0.14458422362804413, -0.0211417768150568, 
-0.04834343120455742, 0.0960356667637825, -0.024047739803791046, 0.07548737525939941, -0.07122865319252014, 0.0072564249858260155, 0.04327869787812233, -0.0028955701272934675, 0.029622141271829605, 0.002363478299230337, -0.0017297675367444754, 0.0071402802132070065, 0.042325589805841446, -0.06671779602766037, 0.00792568176984787, 0.04148868843913078, 0.1331028938293457, 0.039235714823007584, 0.06370509415864944, -0.05147019028663635, -0.03227416053414345, 0.21314243972301483, -0.041779134422540665, -0.07819683849811554, -0.17022258043289185, 0.1888759881258011, 0.06323128193616867, 0.03633209690451622, 0.03416115790605545, -0.10699072480201721, 0.03937219828367233, 0.16721412539482117, 0.11864492297172546, 0.015386401675641537, -0.025482645258307457, -0.00403959397226572, -0.01065665390342474, 0.02184794470667839, 0.10375550389289856, 0.021274680271744728, 0.14743568003177643, -0.10372613370418549, 0.07266045361757278, -0.079790398478508, -0.06157755106687546, 0.015812549740076065, 0.14991134405136108, 0.033807285130023956, -0.0004646581946872175, -0.06202191859483719, 0.1340077668428421, -0.028988102450966835, -0.21331696212291718, 0.10618074983358383, -0.050665393471717834, -0.15482161939144135, -0.02582927606999874, 0.034325238317251205, -0.0037949515972286463, 0.05028824135661125, -0.011229708790779114, 0.006788001861423254, 0.06135491281747818, 0.04606527462601662, -0.05065906047821045, -0.12982876598834991, 0.06832519918680191, 0.010916540399193764, 0.1328684538602829, 0.014427335932850838, 0.07153284549713135, 0.07487162947654724, 0.01260716188699007, -0.10596692562103271, 0.08200465142726898, 0.04032103717327118, -0.002843783935531974, 0.050824206322431564, 0.1768626868724823, -0.025174735113978386, 0.14084866642951965, 0.022528160363435745, -0.16356690227985382, 0.02520476095378399, -0.11469221115112305, -0.05629182606935501, -0.1486852765083313, 0.056543979793787, -0.05618942156434059, 0.15080273151397705, 0.2286955565214157, -0.03751460835337639, 0.0006227220292203128, -0.09960582107305527, 0.06119320169091225, -0.023860543966293335, 0.13517238199710846, 0.028439830988645554, -0.18517139554023743, 0.007358243223279715, -0.07901765406131744, 0.025736123323440552, -0.1879829615354538, -0.03282524272799492, 0.01399439387023449, -0.074173703789711, -0.003976528998464346, 0.11304653435945511, 0.05002090707421303, 0.06381692737340927, -0.04254438728094101, -0.005369720980525017, 0.017620891332626343, 0.14600661396980286, -0.12432662397623062, -0.10697583854198456 ]
null
null
transformers
# legal_t5_small_trans_cs_de model Model for translating legal text from Czech to German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora: jrc-acquis, europarl and dcep. ## Model description legal_t5_small_trans_cs_de is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from Czech to German. ### How to use Here is how to use this model to translate legal text from Czech to German in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_de"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_de", do_lower_case=False, skip_special_tokens=True), device=0 ) cs_text = "Konečná zpráva bude Parlamentu předložena na konci nového funkčního období." pipeline([cs_text], max_length=512) ``` ## Training data The legal_t5_small_trans_cs_de model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) is used to build the vocabulary (with byte pair encoding) for this model. ### Pretraining ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results : | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_cs_de | 44.69 | ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
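The training-procedure paragraph above names AdaFactor with an inverse square root learning rate schedule. A minimal sketch of that optimizer setup using the transformers library; everything beyond the optimizer choice itself (the model checkpoint and the specific flag values) is an assumption for illustration:

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-small")

# With relative_step=True, Adafactor applies its built-in inverse-square-root
# style step size, matching the schedule the card describes for pre-training.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
)
lr_scheduler = AdafactorSchedule(optimizer)
```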
{"language": "Cszech Deustch", "tags": ["translation Cszech Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Kone\u010dn\u00e1 zpr\u00e1va bude Parlamentu p\u0159edlo\u017eena na konci nov\u00e9ho funk\u010dn\u00edho obdob\u00ed."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_de model ===================================== Model for translating legal text from Czech to German. It was first released in this repository. This model is trained on three parallel corpora: jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_de is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Czech to German. ### How to use Here is how to use this model to translate legal text from Czech to German in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_de model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) is used to build the vocabulary (with byte pair encoding) for this model. ### Pretraining Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
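The evaluation above reports a corpus-level BLEU score on the translation test set. A minimal sketch of computing such a score with sacrebleu; the hypothesis and reference sentences are hypothetical placeholders, not items from the actual test set:

```python
import sacrebleu

# One hypothetical model output and one reference translation per segment.
hypotheses = ["Der Abschlussbericht wird dem Parlament am Ende der neuen Wahlperiode vorgelegt."]
references = [["Der endgültige Bericht wird dem Parlament am Ende der neuen Amtszeit vorgelegt."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(round(bleu.score, 2))
```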
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 61, 166, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.09968084841966629, 0.0374920628964901, -0.0035130621399730444, 0.08039889484643936, 0.08038587123155594, 0.04038994014263153, 0.05667688697576523, 0.09572532773017883, -0.05285099893808365, 0.08069153875112534, 0.05728854238986969, 0.02146805077791214, 0.10115354508161545, 0.04547295719385147, 0.024774031713604927, -0.2062104046344757, 0.012712559662759304, -0.01574208401143551, -0.03415670618414879, 0.13247179985046387, 0.11276122182607651, -0.07625489681959152, 0.04947342351078987, -0.010669494047760963, -0.09942459315061569, -0.01720293238759041, -0.04788144305348396, -0.05875977873802185, 0.06678659468889236, 0.05940224230289459, 0.14346863329410553, 0.010653536766767502, 0.10760485380887985, -0.16006293892860413, -0.0012041316367685795, 0.0807914063334465, 0.035876695066690445, 0.04628446698188782, 0.1012532040476799, 0.027018647640943527, 0.1494116187095642, -0.03845606744289398, 0.06091286614537239, 0.03592468798160553, -0.11543635278940201, -0.11732811480760574, -0.056900132447481155, 0.007615180220454931, 0.14521026611328125, 0.14045219123363495, -0.042524345219135284, 0.0823764055967331, -0.11653957515954971, 0.06703145056962967, 0.03629840910434723, -0.25615233182907104, -0.06782875210046768, 0.06038880720734596, 0.06698853522539139, 0.09311489015817642, -0.03428356349468231, -0.040011581033468246, 0.04060809314250946, 0.024179553613066673, 0.03025689721107483, -0.051622360944747925, 0.04634092375636101, -0.012113765813410282, -0.18219511210918427, -0.09210238605737686, 0.13315849006175995, -0.007853646762669086, -0.05150953680276871, -0.08106280118227005, -0.05610642209649086, -0.09907930344343185, -0.010512184351682663, -0.0716724768280983, 0.04087483882904053, 0.01244262419641018, 0.0831436961889267, -0.03334313631057739, -0.11274021118879318, -0.11095517873764038, 0.01652776449918747, 0.078481525182724, 0.0904339998960495, 0.00557608250528574, -0.03732913359999657, 0.13806650042533875, 0.0052182781510055065, -0.09492544084787369, -0.02087942697107792, -0.02577844262123108, -0.12482170760631561, -0.01608511060476303, -0.0038404862862080336, -0.17051276564598083, -0.05183219164609909, 0.036906663328409195, -0.01539597287774086, 0.07713282853364944, 0.01613881252706051, 0.029010888189077377, 0.03578629717230797, 0.2027703821659088, -0.09041111171245575, -0.07770232111215591, -0.04818990081548691, 0.022314174100756645, -0.058846794068813324, 0.00935569778084755, -0.038932546973228455, -0.029943721368908882, 0.029463613405823708, 0.07538679987192154, -0.034092966467142105, 0.027690378949046135, -0.030381878837943077, -0.05340075120329857, 0.08485138416290283, -0.11075665801763535, -0.009174386039376259, 0.034953903406858444, -0.09114742279052734, 0.011330630630254745, 0.07884494960308075, -0.03564690798521042, -0.1391366720199585, 0.03887435793876648, -0.032479871064424515, -0.013709994032979012, -0.14860258996486664, -0.1266734004020691, -0.034392911940813065, -0.023647719994187355, -0.03158560395240784, -0.07494532316923141, -0.13220342993736267, -0.10648124665021896, 0.055596962571144104, -0.05989529937505722, -0.018911829218268394, -0.08697456866502762, -0.016563907265663147, 0.01053017657250166, -0.02093292586505413, 0.07932543754577637, -0.03953676298260689, 0.08776454627513885, 0.013388927094638348, 0.025900250300765038, 0.13158589601516724, 0.07210371643304825, -0.0912790596485138, 0.012392068281769753, -0.08719541877508163, 0.19108988344669342, -0.050676193088293076, -0.02175389602780342, -0.14109934866428375, -0.1025482714176178, -0.04184573516249657, 
0.060619208961725235, 0.08105974644422531, 0.13289333879947662, -0.17318010330200195, 0.010849704034626484, 0.22560900449752808, -0.07906071096658707, -0.030722228810191154, 0.11202586442232132, -0.06003923341631889, 0.12071573734283447, 0.08910604566335678, 0.15747077763080597, 0.07045291364192963, -0.11224790662527084, 0.006655634380877018, -0.04644843190908432, -0.01915956474840641, 0.02479814924299717, 0.08095552772283554, -0.047958482056856155, -0.05672135576605797, -0.007033596280962229, -0.1402132213115692, 0.039492566138505936, -0.037641193717718124, -0.05151877924799919, 0.012134633027017117, -0.019517378881573677, -0.015485953539609909, 0.04414321109652519, 0.028273047879338264, -0.03015177510678768, -0.10542000085115433, 0.0001749242510413751, 0.0881740152835846, -0.06280945241451263, 0.01989772729575634, -0.03159860149025917, -0.041287861764431, -0.04859789088368416, 0.0035179804544895887, -0.13119864463806152, 0.023872574791312218, 0.05337061360478401, -0.010099426843225956, 0.047077715396881104, 0.03452995419502258, 0.0397462472319603, 0.03609873354434967, -0.001446313108317554, -0.04919951409101486, -0.03776935860514641, -0.0339592844247818, -0.11494328081607819, -0.08339894562959671, -0.02196154184639454, -0.02133963815867901, -0.000992532353848219, -0.19508585333824158, 0.0304118525236845, -0.1058022528886795, 0.04264182597398758, -0.008847739547491074, -0.025772329419851303, 0.03967636823654175, 0.036856845021247864, 0.026820680126547813, -0.0751967802643776, 0.034635212272405624, 0.018949197605252266, 0.023624354973435402, 0.12523698806762695, -0.12058647722005844, -0.15122903883457184, 0.08167119324207306, 0.013258663937449455, -0.1278071105480194, 0.013422266580164433, -0.014025584794580936, -0.06078648939728737, -0.042740922421216965, -0.008202161639928818, 0.25799962878227234, 0.015750667080283165, 0.13666681945323944, -0.12235387414693832, -0.026228178292512894, -0.017888342961668968, -0.014299914240837097, 0.01887761428952217, 0.13545919954776764, 0.06641835719347, -0.13420718908309937, 0.045525163412094116, 0.0322849340736866, -0.009769359603524208, 0.16320978105068207, 0.0007961880182847381, -0.14549238979816437, -0.006338004022836685, 0.07336167246103287, -0.021555805578827858, 0.08890531957149506, -0.11364390701055527, 0.004879259038716555, 0.029988842085003853, 0.06895659118890762, 0.061667684465646744, -0.12216193228960037, 0.045844096690416336, 0.05461621284484863, -0.07155171781778336, -0.013304052874445915, -0.018845876678824425, -0.020727843046188354, 0.055601444095373154, 0.0038146008737385273, 0.0015709842555224895, -0.01679990626871586, -0.039142295718193054, -0.1257631778717041, 0.20553311705589294, -0.07379737496376038, -0.13914719223976135, -0.11844965070486069, 0.07539042085409164, 0.05875275284051895, -0.0028019656892865896, 0.03488938510417938, -0.057181842625141144, -0.06569842994213104, -0.06101938709616661, 0.13465730845928192, -0.07412128150463104, -0.06615552306175232, -0.1234019547700882, -0.019576171413064003, -0.013558600097894669, -0.15782862901687622, 0.03797582536935806, -0.02646760642528534, -0.07577574253082275, 0.01573210023343563, -0.041677240282297134, 0.07887299358844757, 0.13827969133853912, -0.00550659466534853, -0.0006283170077949762, -0.010852443054318428, 0.16879689693450928, -0.1385437250137329, 0.05310068279504776, 0.06389979273080826, 0.012120152823626995, 0.014947636984288692, 0.10365182161331177, -0.006860029883682728, -0.08386550098657608, 0.023465311154723167, 0.041276946663856506, -0.02017546072602272, 
-0.3161482512950897, -0.037773679941892624, -0.04325532913208008, -0.031045865267515182, 0.09801068156957626, 0.02775385230779648, 0.0070879096165299416, 0.04031213000416756, -0.00967834796756506, 0.009530074894428253, 0.033672936260700226, 0.04248514026403427, 0.04130043089389801, 0.016321253031492233, 0.06943926215171814, -0.032644279301166534, -0.015090825967490673, 0.060301683843135834, 0.007432291749864817, 0.25101664662361145, -0.05499925836920738, 0.14300142228603363, 0.06671217828989029, 0.12678122520446777, 0.019493257626891136, 0.07488547265529633, 0.0004647515597753227, 0.03227264806628227, -0.017617838457226753, -0.026849642395973206, -0.04922302067279816, 0.03304614499211311, 0.03449808433651924, -0.021171141415834427, -0.043613970279693604, -0.0038309600204229355, 0.017646152526140213, 0.302873432636261, 0.02530599758028984, -0.18901419639587402, -0.05985276401042938, -0.006618613842874765, -0.05007389187812805, -0.11695598065853119, 0.08657100796699524, 0.08775551617145538, -0.13617148995399475, -0.028759337961673737, -0.037697721272706985, 0.11634668707847595, -0.11146021634340286, -0.0444830060005188, 0.03822595626115799, 0.06604224443435669, 0.002614795695990324, 0.08053706586360931, -0.3001972436904907, 0.17787280678749084, 0.006771349813789129, 0.10602618008852005, -0.013589289970695972, 0.04617264121770859, -0.031071800738573074, 0.02000313624739647, 0.13881295919418335, 0.012205584906041622, -0.036488864570856094, -0.07699161767959595, -0.12290102988481522, 0.017295807600021362, 0.030899979174137115, -0.027494341135025024, 0.08237000554800034, 0.01760450005531311, 0.03830975294113159, -0.0034339469857513905, -0.11685152351856232, -0.1234816238284111, -0.10002350807189941, -0.01647331938147545, -0.11523721367120743, 0.037351179867982864, -0.01974288746714592, -0.041632264852523804, -0.028348879888653755, 0.13105103373527527, -0.12192664295434952, -0.10316594690084457, -0.1038501188158989, -0.0050432407297194, 0.12881067395210266, -0.06733498722314835, 0.002296046819537878, 0.028875166550278664, -0.02383369766175747, 0.018657147884368896, -0.006950525101274252, 0.12053495645523071, -0.06786744296550751, -0.09940624237060547, -0.01719283126294613, 0.09530238062143326, 0.11026734858751297, 0.05237414315342903, -0.020767942070961, 0.023140350356698036, -0.01102406531572342, -0.10557039082050323, -0.00027719055651687086, 0.013413652777671814, 0.045399513095617294, 0.03112124279141426, -0.04880484938621521, -0.05036566033959389, -0.09384986758232117, -0.025535443797707558, 0.07512341439723969, 0.1258905977010727, -0.05260348320007324, 0.052149441093206406, 0.15910522639751434, -0.12645608186721802, -0.18003879487514496, -0.0007178183877840638, 0.09513837099075317, 0.08096881955862045, -0.025445275008678436, -0.20490886270999908, -0.005625635385513306, 0.07882018387317657, 0.010484102182090282, 0.047832366079092026, -0.38871750235557556, -0.12659628689289093, 0.11150215566158295, 0.0685400441288948, -0.040084484964609146, -0.08589296787977219, -0.034667909145355225, 0.07027389109134674, -0.06702335923910141, 0.052871350198984146, -0.02070660889148712, 0.08483129739761353, 0.011301703751087189, -0.013701589778065681, 0.047451216727495193, -0.05189765989780426, 0.1054358184337616, 0.02208913117647171, 0.04407500475645065, -0.05648144707083702, 0.03674296662211418, 0.0036953825037926435, -0.01368789467960596, 0.15744875371456146, 0.0231916606426239, 0.05627574026584625, -0.18997804820537567, -0.0620826818048954, -0.07213664054870605, -0.011228155344724655, -0.07028699666261673, 
-0.04703482612967491, -0.0659862831234932, 0.06616836786270142, 0.06471505016088486, -0.02272404357790947, -0.004933817312121391, -0.05871249735355377, -0.0671718567609787, 0.12058421224355698, 0.08952835947275162, 0.08195875585079193, -0.09041473269462585, -0.009570860303938389, 0.04584742709994316, 0.10840586572885513, -0.15665185451507568, 0.00483424449339509, 0.11609607934951782, -0.0185638889670372, 0.1185031309723854, -0.019978171214461327, -0.13364091515541077, 0.018708575516939163, 0.004564044065773487, -0.05711062252521515, -0.15629255771636963, -0.006605285219848156, -0.12471404671669006, -0.03594071790575981, -0.06628905981779099, 0.09874935448169708, -0.09481839835643768, -0.029171938076615334, -0.0014989427290856838, 0.0445137582719326, -0.045194581151008606, 0.20492960512638092, 0.03449902683496475, 0.04162308946251869, -0.06405434757471085, 0.10558474808931351, 0.11566014587879181, -0.09240305423736572, 0.03451631963253021, 0.1592579334974289, -0.0947156473994255, -0.049956660717725754, 0.04376601055264473, 0.14429853856563568, -0.020124349743127823, -0.05529964715242386, -0.04116920754313469, -0.040033310651779175, 0.040135838091373444, -0.008484783582389355, 0.026913125067949295, 0.0075161955319345, -0.041612815111875534, 0.004044523928314447, -0.09918143600225449, 0.06944576650857925, 0.0891750231385231, 0.010554993525147438, -0.02708924002945423, 0.1289137899875641, 0.04099181666970253, 0.02768901363015175, -0.022668376564979553, -0.012990881688892841, -0.06537886708974838, 0.0462716668844223, -0.05875970795750618, 0.014418902806937695, -0.04827478528022766, 0.0007427944801747799, -0.030458131805062294, -0.023085422813892365, -0.012923795729875565, 0.03041251376271248, -0.05813738331198692, -0.03768370300531387, -0.037598367780447006, 0.05761447548866272, -0.06517484784126282, -0.020938387140631676, 0.012636033818125725, -0.047985173761844635, 0.06791430711746216, 0.03477419167757034, -0.011101346462965012, 0.029218640178442, -0.001026048674248159, 0.06280718743801117, -0.025239890441298485, 0.024149766191840172, -0.0039000206161290407, -0.0878392830491066, 0.027598557993769646, 0.007548418827354908, -0.01804366707801819, -0.00832928903400898, 0.050449125468730927, -0.11541426181793213, 0.05766342952847481, -0.04934893175959587, -0.007348200306296349, -0.05109182372689247, 0.11209410429000854, 0.021924853324890137, 0.08204981684684753, 0.11003177613019943, -0.08402840048074722, 0.08533879369497299, -0.15103405714035034, -0.038758181035518646, 0.031407713890075684, 0.027882665395736694, -0.002239912049844861, -0.0666542798280716, 0.06334346532821655, -0.05651278793811798, 0.09907391667366028, 0.054120417684316635, 0.05985945835709572, 0.024582287296652794, -0.08183156698942184, -0.007916955277323723, 0.031602852046489716, 0.061194371432065964, -0.024769259616732597, 0.009758743457496166, -0.0022733286023139954, 0.06792228668928146, -0.03726935759186745, 0.05992157757282257, 0.12688282132148743, 0.22029951214790344, 0.08134280145168304, 0.08932100981473923, -0.04811452329158783, -0.09560997039079666, -0.09461857378482819, 0.08480799198150635, -0.02064529061317444, 0.0151323601603508, -0.002709595952183008, 0.1471996307373047, 0.10065338015556335, -0.14832845330238342, 0.09692320227622986, -0.010487268678843975, -0.11777647584676743, -0.07909301668405533, -0.057462193071842194, -0.037390779703855515, -0.10620255023241043, -0.011186670511960983, -0.09484517574310303, 0.03527744114398956, 0.04313109442591667, 0.07392662763595581, -0.0501810722053051, 0.1433817446231842, 
0.038803163915872574, -0.09903346747159958, 0.08052046597003937, 0.004365172237157822, 0.09480255097150803, -0.05685238540172577, -0.0007047170656733215, 0.020546626299619675, 0.00633099302649498, 0.06365292519330978, 0.0013719472335651517, -0.028830494731664658, 0.02615664340555668, -0.0016001721378415823, -0.030260957777500153, -0.0013854134595021605, 0.07658634334802628, 0.08583052456378937, 0.1543484628200531, 0.0680801048874855, -0.04610271006822586, -0.02537786029279232, 0.1721617877483368, -0.032704178243875504, -0.10228994488716125, -0.17689821124076843, 0.100929856300354, 0.03792937472462654, 0.019764134660363197, 0.032736364752054214, -0.08225156366825104, -0.014618488028645515, 0.23597760498523712, 0.132118359208107, -0.04862377420067787, -0.045223869383335114, 0.026440130546689034, -0.007087630685418844, 0.01628991588950157, 0.10924182087182999, 0.030883101746439934, 0.18428680300712585, -0.06966494023799896, 0.00516066700220108, -0.07872238755226135, -0.04560842365026474, -0.019843753427267075, 0.1442410796880722, -0.0012255854671820998, -0.03380783647298813, -0.042512934654951096, 0.11506389826536179, -0.0004822706978302449, -0.18210375308990479, 0.026297442615032196, -0.05455366149544716, -0.11066579818725586, -0.021546201780438423, -0.0144345136359334, 0.013147150166332722, 0.029623523354530334, 0.008033080957829952, 0.004294272977858782, 0.16495084762573242, 0.024480333551764488, -0.06831979751586914, -0.13060392439365387, 0.05360462889075279, -0.04997977614402771, 0.17346425354480743, 0.027326561510562897, 0.08070623874664307, 0.07234461605548859, 0.03560351952910423, -0.09734977781772614, 0.06446979939937592, 0.0529758483171463, 0.026230983436107635, 0.040868233889341354, 0.0773606076836586, -0.029750041663646698, 0.033084478229284286, 0.03266642615199089, -0.09023061394691467, 0.04441177845001221, -0.13576965034008026, -0.046160705387592316, -0.15841174125671387, 0.04114344343543053, -0.046679090708494186, 0.11084721237421036, 0.22003598511219025, 0.00046004296746104956, 0.01659283973276615, -0.05604800581932068, 0.05490177124738693, -0.04507890343666077, 0.16045783460140228, -0.015095417387783527, -0.15006761252880096, 0.002316618338227272, -0.011084625497460365, 0.04447975382208824, -0.15758973360061646, -0.01027770433574915, 0.026403475552797318, -0.08015584200620651, -0.015262501314282417, 0.12402370572090149, 0.03887425735592842, 0.05143417790532112, -0.012335768900811672, -0.06834489852190018, -0.01161503978073597, 0.12925183773040771, -0.12227693200111389, -0.07194342464208603 ]
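The training-procedure sections in these cards all cite AdaFactor with an inverse square root learning rate schedule. A minimal sketch of one common form of that schedule; the 10,000-step warm-up constant follows the T5 paper's convention and is an assumption, not something stated in the cards:

```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    """Inverse square root schedule: the rate is held at 1/sqrt(warmup_steps)
    for the first warmup_steps updates, then decays as 1/sqrt(step)."""
    return 1.0 / math.sqrt(max(step, warmup_steps))

# Constant through warm-up, then decaying over the 250K training steps:
print(inverse_sqrt_lr(1), inverse_sqrt_lr(10_000), inverse_sqrt_lr(250_000))
# 0.01 0.01 0.002
```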
null
null
transformers
# legal_t5_small_trans_cs_de_small_finetuned model

A model for translating legal text from Czech to German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model was first pretrained on all of the translation data with an unsupervised task, then trained on three parallel corpora: JRC-Acquis, Europarl, and DCEP.


## Model description

legal_t5_small_trans_cs_de_small_finetuned was initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. It is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translating legal texts from Czech to German.

### How to use

Here is how to use this model to translate legal text from Czech to German in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    # The fine-tuned translation weights.
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_de_small_finetuned"),
    # The tokenizer is shared with the base cs-de model; keep casing intact.
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_cs_de",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 to run on CPU
)

cs_text = "Vzhledem k tomu, že tento právní předpis bude přímo použitelný v členských státech a zavede mnoho povinností pro ty, na něž se vztahuje, je žádoucí, aby se jim poskytlo více času na přizpůsobení se těmto novým pravidlům."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_de_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where data from all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, which together consist of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod v3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096) and the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

The pre-training data combined the data from all 42 language pairs. The model's task was to predict the randomly masked portions of each sentence.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_de_small_finetuned | 44.175 |


### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
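The card above describes the unsupervised pretraining objective as predicting randomly masked portions of a sentence. A toy sketch of that idea; real T5-style preprocessing masks contiguous spans of SentencePiece ids rather than single whitespace tokens, and the 30% masking rate used in the call is an assumption for illustration only:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Toy masked-language-modelling preprocessing: replace random tokens with
    sentinel markers and emit (input, target) sequences in T5 style."""
    rng = random.Random(seed)
    inputs, targets, sentinel = [], [], 0
    for tok in tokens:
        if rng.random() < mask_prob:
            marker = f"<extra_id_{sentinel}>"
            inputs.append(marker)       # the model sees a placeholder...
            targets += [marker, tok]    # ...and must predict the hidden token
            sentinel += 1
        else:
            inputs.append(tok)
    return inputs, targets

inp, tgt = mask_tokens("je žádoucí aby se jim poskytlo více času".split(), mask_prob=0.3)
print(" ".join(inp))
print(" ".join(tgt))
```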
{"language": "Cszech Deustch", "tags": ["translation Cszech Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Vzhledem k tomu, \u017ee tento pr\u00e1vn\u00ed p\u0159edpis bude p\u0159\u00edmo pou\u017eiteln\u00fd v \u010dlensk\u00fdch st\u00e1tech a zavede mnoho povinnost\u00ed pro ty, na n\u011b\u017e se vztahuje, je \u017e\u00e1douc\u00ed, aby se jim poskytlo v\u00edce \u010dasu na p\u0159izp\u016fsoben\u00ed se t\u011bmto nov\u00fdm pravidl\u016fm."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_de_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_de\_small\_finetuned model
=======================================================

A model for translating legal text from Czech to German. It was first released in this repository. The model was first pretrained on all of the translation data with an unsupervised task, then trained on three parallel corpora: JRC-Acquis, Europarl, and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_cs\_de\_small\_finetuned was initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. It is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from Czech to German.

### How to use

Here is how to use this model to translate legal text from Czech to German in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_cs\_de\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where data from all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, which together consist of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod v3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096) and the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

The pre-training data combined the data from all 42 language pairs. The model's task was to predict the randomly masked portions of each sentence.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 61, 212, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Deustch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_de\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06970739364624023, 0.08423375338315964, -0.003586331382393837, 0.0917835682630539, 0.053892359137535095, 0.02346324734389782, 0.04416237771511078, 0.09920935332775116, -0.054119773209095, 0.09171155840158463, 0.04136074706912041, -0.02636103145778179, 0.0912180095911026, 0.031845029443502426, 0.06426219642162323, -0.2312968373298645, 0.004734910558909178, -0.03886859491467476, -0.012542296200990677, 0.08661071211099625, 0.10001280158758163, -0.06592234969139099, 0.04547027871012688, -0.04988846182823181, -0.06028811261057854, 0.012617378495633602, -0.0855141133069992, -0.031241515651345253, 0.08080536872148514, 0.06814523041248322, 0.0903797596693039, -0.003433310892432928, 0.0857924222946167, -0.17615430057048798, -0.0005882224068045616, 0.07252542674541473, 0.006810481660068035, 0.0541527085006237, 0.11462811380624771, 0.019609898328781128, 0.14303453266620636, -0.08111905306577682, 0.03232951834797859, 0.04121651127934456, -0.12055879086256027, -0.12838436663150787, -0.06342916935682297, 0.03847184032201767, 0.09075010567903519, 0.13712289929389954, -0.04357479512691498, 0.042950503528118134, -0.04032055661082268, 0.061428021639585495, 0.05691446736454964, -0.2363426685333252, -0.03583700954914093, 0.053794194012880325, 0.05189748853445053, 0.0892614796757698, -0.051104459911584854, -0.03313539922237396, 0.04300996661186218, 0.0723530575633049, 0.028523629531264305, -0.060845281928777695, -0.03404488414525986, -0.04041146859526634, -0.12983563542366028, -0.04834349825978279, 0.14395780861377716, 0.021760299801826477, -0.05081253871321678, -0.0958673357963562, -0.07864903658628464, -0.0810156986117363, -0.018577802926301956, -0.0405559279024601, 0.04033209756016731, 0.007248894311487675, 0.08214590698480606, -0.023241648450493813, -0.11529061198234558, -0.06580331921577454, -0.059353526681661606, 0.06935565918684006, 0.05998014286160469, 0.002402292098850012, 0.02314194105565548, 0.09646221250295639, -0.07419432699680328, -0.09318586438894272, 0.007713393773883581, 0.011050469242036343, -0.11642570793628693, -0.004529109690338373, -0.0028699743561446667, -0.2054886817932129, -0.024003302678465843, 0.044003866612911224, -0.031340718269348145, 0.0653265118598938, 0.06425788998603821, 0.036537617444992065, 0.04157354310154915, 0.15363284945487976, -0.11584500968456268, -0.10721428692340851, -0.02571074105799198, 0.005430381745100021, 0.012089614756405354, 0.008749039843678474, -0.06065487861633301, -0.03094443306326866, -0.0003791096096392721, 0.03686297684907913, 0.0006252759485505521, 0.029764914885163307, -0.026112951338291168, -0.02857019752264023, 0.12284084409475327, -0.10598837584257126, 0.01249196007847786, 0.007163702975958586, -0.08483819663524628, 0.0051612164825201035, 0.06547942757606506, -0.03018997609615326, -0.12843436002731323, 0.08110357820987701, -0.02953827567398548, -0.026816600933670998, -0.11817500740289688, -0.15508657693862915, -0.008989273570477962, 0.034299734979867935, -0.05341767519712448, -0.09708103537559509, -0.12203320115804672, -0.09488829970359802, 0.03638872876763344, -0.06503082811832428, -0.005068398080766201, -0.06902655214071274, -0.017531057819724083, -0.007395752239972353, -0.007564942818135023, 0.10452377796173096, -0.04065089300274849, 0.033768270164728165, 0.010539180599153042, 0.0612386129796505, 0.01737992838025093, 0.02985122986137867, -0.11086570471525192, 0.04249433055520058, -0.12285835295915604, 0.11980792880058289, -0.02329804189503193, -0.0013197408989071846, -0.10848170518875122, -0.054645754396915436, -0.07798056304454803, 
0.04763605818152428, 0.06575305014848709, 0.11646811664104462, -0.22271588444709778, 0.012016061693429947, 0.20490676164627075, -0.07476144284009933, -0.06812133640050888, 0.13067224621772766, -0.013379031792283058, 0.029900388792157173, 0.06887710094451904, 0.12680542469024658, 0.0699683353304863, -0.025891412049531937, -0.029329344630241394, 0.0021569228265434504, 0.004087896551936865, 0.07293294370174408, 0.08095699548721313, -0.06807749718427658, 0.059741728007793427, 0.008831225335597992, 0.0389309786260128, -0.00039349208236671984, -0.020859772339463234, -0.028373096138238907, 0.007197436410933733, -0.03789759799838066, -0.04980060085654259, 0.024744518101215363, 0.01211665477603674, -0.06136741489171982, -0.07942280918359756, -0.08959516882896423, 0.1042449027299881, -0.0538792759180069, 0.03170432895421982, -0.003422449342906475, -0.04933502897620201, -0.06591875851154327, 0.017787113785743713, -0.15240414440631866, -0.017335211858153343, 0.06455695629119873, -0.08585824817419052, 0.0910673663020134, 0.05183638259768486, 0.04384702444076538, 0.09539955854415894, -0.05266942083835602, -0.0410870760679245, -0.016150500625371933, -0.02341277524828911, -0.11174557358026505, -0.1061907559633255, -0.04383692890405655, -0.004262202885001898, 0.010449246503412724, -0.12095625698566437, 0.011406425386667252, -0.0700281411409378, 0.0901431068778038, 0.0038348741363734007, -0.019141249358654022, 0.024358587339520454, 0.06263849884271622, -0.015499487519264221, -0.053592681884765625, 0.018939072266221046, 0.006966027431190014, -0.0031430430244654417, 0.0871250331401825, -0.14306330680847168, -0.13693343102931976, 0.07789497077465057, 0.012616640888154507, -0.12297161668539047, 0.03785758465528488, -0.01279775332659483, -0.07228884845972061, -0.03814205527305603, -0.06131015345454216, 0.23559585213661194, 0.027397233992815018, 0.14293929934501648, -0.08957906812429428, -0.027526022866368294, -0.0016864158678799868, -0.023764636367559433, 0.006928339600563049, 0.15268638730049133, 0.07741417735815048, -0.12836894392967224, 0.08529294282197952, 0.00598649587482214, -0.01033809408545494, 0.09426059573888779, 0.038368094712495804, -0.10853302478790283, -0.008873814716935158, 0.03627500310540199, -0.0007922579534351826, 0.08538214862346649, -0.07917631417512894, -0.0041339765302836895, 0.02585466578602791, 0.0666867196559906, 0.0705643817782402, -0.09202359616756439, 0.07124023139476776, 0.06819462031126022, -0.03111654333770275, 0.04009382426738739, -0.04748040437698364, -0.03128790855407715, 0.09381356835365295, 0.026825658977031708, -0.004850836005061865, -0.04116079956293106, -0.05557703971862793, -0.10799098759889603, 0.18340036273002625, -0.07844166457653046, -0.1766119748353958, -0.13432790338993073, 0.05871707573533058, -0.01091625913977623, 0.03314710780978203, 0.025243671610951424, -0.02285606786608696, -0.06838227808475494, -0.12051255255937576, 0.09647116810083389, -0.07060695439577103, -0.0494430847465992, -0.10047300159931183, 0.016766179352998734, -0.0030217054300010204, -0.11743156611919403, 0.023802615702152252, -0.0010621171677485108, -0.002593178767710924, -0.004500128794461489, -0.02849785052239895, 0.12207850068807602, 0.1367151439189911, -0.012891174294054508, -0.04577895626425743, 0.00320735783316195, 0.11505990475416183, -0.08852092921733856, 0.07087264955043793, 0.07744548469781876, 0.029741667211055756, 0.03156979754567146, 0.12487548589706421, 0.016175033524632454, -0.05576331540942192, 0.03936128690838814, 0.040285758674144745, -0.006520947441458702, -0.23194682598114014, 
-0.09437482059001923, -0.06816750764846802, 0.048357851803302765, 0.08838720619678497, 0.044894490391016006, -0.0671045109629631, 0.019251748919487, -0.044430870562791824, 0.05027999356389046, -0.0009629609994590282, 0.061456769704818726, 0.015646018087863922, -0.01186863798648119, 0.05559896305203438, -0.0625903457403183, -0.0392267107963562, 0.09767399728298187, 0.020293517038226128, 0.1598075032234192, -0.04292171820998192, 0.22961626946926117, 0.03380563482642174, 0.060283176600933075, 0.011938135139644146, 0.07002189755439758, -0.02472730726003647, 0.024695338681340218, -0.025287847965955734, -0.04646727070212364, -0.017563289031386375, 0.041472431272268295, 0.01539662666618824, 0.028917787596583366, -0.06717108935117722, -0.04900001361966133, 0.058637212961912155, 0.23376446962356567, 0.059167683124542236, -0.18633688986301422, -0.06821276247501373, 0.004110618494451046, -0.06972012668848038, -0.08456367999315262, 0.02164972573518753, 0.14544138312339783, -0.10076874494552612, -0.012812010012567043, 0.01053271908313036, 0.11109036207199097, -0.12166418135166168, -0.014858465641736984, 0.047524288296699524, 0.062031280249357224, -0.012471221387386322, 0.11837148666381836, -0.2613547742366791, 0.10547996312379837, 0.010044216178357601, 0.06147421523928642, -0.035197511315345764, 0.02758394181728363, -0.05658085271716118, 0.009732740931212902, 0.11287042498588562, 0.012135942466557026, -0.04598308727145195, -0.0939941257238388, -0.10764254629611969, 0.008156714960932732, 0.08647141605615616, -0.05884293094277382, 0.09760329127311707, 0.041273731738328934, 0.012073015794157982, -0.008455603383481503, -0.0004143177648074925, -0.04283776879310608, -0.15978969633579254, 0.01836605742573738, -0.044293954968452454, -0.02640480361878872, -0.021540042012929916, -0.01084841787815094, -0.06556447595357895, 0.2034815400838852, -0.13454307615756989, -0.07366812974214554, -0.08045417815446854, -0.010107211768627167, 0.12323397397994995, -0.07617516815662384, 0.004981706850230694, 0.000848121999297291, 0.04304138198494911, -0.021356936544179916, -0.036387961357831955, 0.09060604125261307, -0.06732305884361267, -0.1097063422203064, -0.07496695220470428, 0.1317431777715683, 0.06104866787791252, 0.047607820481061935, -0.0324365496635437, 0.0271980669349432, -0.025322668254375458, -0.10767333954572678, -0.012306943535804749, 0.015520953573286533, 0.1378738135099411, 0.04605812206864357, -0.06245112419128418, -0.031115014106035233, -0.06355254352092743, -0.053985077887773514, 0.09276384115219116, 0.1666734665632248, -0.04828287288546562, 0.01788642443716526, 0.17113758623600006, -0.09631174057722092, -0.20348919928073883, -0.04386109486222267, 0.048206303268671036, 0.08560939878225327, -0.03872131556272507, -0.1735817939043045, 0.024836281314492226, 0.0926683247089386, 0.01051352173089981, 0.056554097682237625, -0.3821426033973694, -0.13583625853061676, 0.05633535981178284, 0.03542918339371681, -0.05208649858832359, -0.11460303515195847, -0.042845189571380615, -0.043770965188741684, -0.07621915638446808, 0.07159414142370224, -0.03706585243344307, 0.09447242319583893, -0.0021900127176195383, 0.002920113503932953, 0.043392375111579895, -0.04606277868151665, 0.13715773820877075, 0.03009808249771595, 0.05788445472717285, -0.062096137553453445, 0.041968561708927155, 0.01777300238609314, -0.011053658090531826, 0.15671320259571075, -0.02862967550754547, 0.06014559790492058, -0.1250973343849182, -0.04550551623106003, -0.06257203221321106, 0.005205490160733461, -0.04766817018389702, -0.07030779123306274, 
-0.06430908292531967, 0.04561241716146469, 0.06157300993800163, -0.012601401656866074, -0.0024684309028089046, -0.041930489242076874, -0.03499487042427063, 0.14334896206855774, 0.09294234961271286, 0.030327007174491882, -0.07580099254846573, 0.02243739739060402, 0.0008559073321521282, 0.07162269949913025, -0.0732567310333252, 0.007898172363638878, 0.14101599156856537, 0.0025770326610654593, 0.1096796989440918, -0.005296317394822836, -0.14874139428138733, -0.016382062807679176, 0.044296734035015106, -0.10702159255743027, -0.14411558210849762, -0.0045226081274449825, -0.015955466777086258, -0.06313461810350418, -0.03746625781059265, 0.0814201757311821, -0.0870087668299675, -0.020714890211820602, 0.01186163630336523, 0.03619347885251045, -0.04405450075864792, 0.19474080204963684, 0.024232324212789536, 0.041517481207847595, -0.05466223880648613, 0.13817022740840912, 0.1368541568517685, -0.13755835592746735, -0.003385735210031271, 0.22262093424797058, -0.08655975013971329, -0.05816638842225075, 0.033177535980939865, 0.13811928033828735, 0.011417106725275517, -0.051711007952690125, -0.04066412150859833, -0.06809703260660172, 0.020124049857258797, -0.022236444056034088, 0.03272015228867531, 0.04429133981466293, -0.014473915100097656, -0.002885660156607628, -0.12314818054437637, 0.0883861631155014, 0.0812247097492218, 0.05064917728304863, -0.02342790924012661, 0.15057289600372314, 0.0048615713603794575, 0.004980563651770353, -0.012652925215661526, 0.031225722283124924, -0.06373240053653717, 0.00427526980638504, -0.0710434690117836, -0.01613480970263481, -0.034668125212192535, -0.029258903115987778, -0.02355400286614895, 0.00988745503127575, 0.0019346383633092046, 0.008184329606592655, -0.02550581283867359, -0.06434734910726547, -0.053176771849393845, 0.021966006606817245, -0.08337011933326721, -0.0342252179980278, 0.015585825778543949, -0.03803040459752083, 0.06561040133237839, 0.019901275634765625, 0.013513484038412571, 0.008863110095262527, -0.029509611427783966, 0.06213141977787018, -0.0036649582907557487, 0.0549478754401207, -0.0002046022127615288, -0.06793786585330963, 0.001912885345518589, 0.0013933865120634437, -0.028191933408379555, -0.015784790739417076, 0.02586170844733715, -0.1378123015165329, 0.013199059292674065, -0.041968028992414474, -0.020490238443017006, -0.0731751099228859, 0.07556236535310745, 0.023616282269358635, 0.07042901962995529, 0.09828271716833115, -0.07477845996618271, 0.08520033210515976, -0.15809205174446106, -0.01282536331564188, 0.03081447072327137, 0.003954095300287008, 0.006118240300565958, -0.021257149055600166, 0.060046035796403885, -0.06639177352190018, 0.11726383119821548, 0.023356923833489418, 0.027982722967863083, 0.022595534101128578, -0.08054059743881226, 0.003546701744198799, 0.046102676540613174, 0.07837283611297607, -0.034463346004486084, -0.03324291855096817, -0.051807817071676254, 0.07721645385026932, -0.000797524640802294, 0.07882034033536911, 0.07258382439613342, 0.15132637321949005, 0.11091193556785583, 0.05040518939495087, -0.005260060541331768, -0.10477077960968018, -0.0739564299583435, 0.06807125359773636, -0.02094847336411476, 0.028067123144865036, -0.03042144887149334, 0.11588312685489655, 0.09417373687028885, -0.1449144184589386, 0.10458730161190033, -0.015335376374423504, -0.09240905940532684, -0.02919834293425083, -0.11179067939519882, -0.04185860604047775, -0.009016615338623524, -0.050673529505729675, -0.1031157448887825, 0.012401635758578777, 0.08057781308889389, 0.030809039250016212, -0.042300328612327576, 0.15365585684776306, 
-0.04274166002869606, -0.10042226314544678, 0.05589598789811134, 0.026752158999443054, 0.07533787935972214, 0.023789431899785995, 0.012013586238026619, 0.0365494042634964, 0.011550973169505596, 0.05006369575858116, 0.04392877221107483, -0.0005377076449804008, 0.004411729518324137, 0.0052700042724609375, -0.052388399839401245, -0.0316111296415329, 0.017927542328834534, 0.06063224747776985, 0.17158105969429016, 0.05879844352602959, -0.056558672338724136, -0.018119046464562416, 0.19225043058395386, -0.062080491334199905, -0.08859030902385712, -0.10605363547801971, 0.1927529275417328, 0.03244134038686752, 0.02378196455538273, 0.012876996770501137, -0.10696522146463394, -0.017124800011515617, 0.15589170157909393, 0.17302162945270538, -0.08056747913360596, -0.030617373064160347, 0.03890736773610115, -0.0017538560787215829, 0.010676938109099865, 0.06647489964962006, 0.0595787838101387, 0.21434181928634644, -0.06694263219833374, 0.08250939100980759, -0.04158880561590195, 0.0010749500943347812, -0.03163352981209755, 0.19933055341243744, -0.005810496862977743, 0.022296393290162086, -0.0650738850235939, 0.06112262234091759, -0.00010400619066786021, -0.18653544783592224, 0.010603148490190506, -0.09438520669937134, -0.11876684427261353, 0.02136065997183323, -0.016667386516928673, 0.026757502928376198, 0.07218240946531296, 0.008699196390807629, 0.04331326484680176, 0.1044822558760643, 0.01216629333794117, -0.07971516996622086, -0.07497737556695938, -0.0015384184662252665, -0.07888780534267426, 0.14812621474266052, 0.016858281567692757, 0.1074480414390564, 0.08263707160949707, 0.020277103409171104, -0.09796781837940216, 0.09937476366758347, 0.028001759201288223, 0.02556544728577137, 0.0720074325799942, 0.10650578886270523, -0.006344593595713377, 0.047250013798475266, 0.0379524752497673, -0.08353821933269501, 0.026442160829901695, -0.05552227422595024, -0.03631259500980377, -0.129161536693573, 0.06307068467140198, -0.03680696710944176, 0.147347554564476, 0.19666451215744019, -0.021291987970471382, -0.004072112496942282, -0.038935765624046326, 0.0016084217932075262, -0.010198365896940231, 0.1278245896100998, -0.011718436144292355, -0.14943616092205048, 0.024764642119407654, -0.07880832999944687, 0.04169860854744911, -0.22006317973136902, -0.029868535697460175, 0.01149358507245779, -0.05496765673160553, -0.016864368692040443, 0.09049862623214722, 0.02812536433339119, 0.03605875000357628, -0.049218930304050446, -0.06554532796144485, -0.0010790820233523846, 0.11063740402460098, -0.11043073982000351, -0.11858919262886047 ]
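The "Preprocessing" sections state that a unigram model was trained on 88M lines of the parallel corpus to build the vocabulary. A hedged sketch of how such a vocabulary could be built with the sentencepiece library; the input file name and the 32,000-entry vocabulary size are assumptions (the cards do not state a vocabulary size):

```python
import sentencepiece as spm

# Train a unigram vocabulary on the combined parallel corpus (all language
# pairs in one text file, one sentence per line).
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # assumed file name, ~88M lines
    model_prefix="legal_t5_unigram",
    model_type="unigram",
    vocab_size=32000,                       # assumed; T5 uses a ~32k vocabulary
)

# Load the trained model and tokenize a sample sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Vzhledem k tomu, že tento právní předpis bude přímo použitelný", out_type=str))
```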
null
null
transformers
# legal_t5_small_trans_cs_en model

A model for translating legal text from Czech to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model was trained on three parallel corpora: JRC-Acquis, Europarl, and DCEP.

## Model description

legal_t5_small_trans_cs_en is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translating legal texts from Czech to English.

### How to use

Here is how to use this model to translate legal text from Czech to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    # Translation weights for the Czech-to-English model.
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_en"),
    # Keep casing intact; legal text is case-sensitive.
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_cs_en",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,  # first GPU; use device=-1 to run on CPU
)

cs_text = "s ohledem na druhou schůzku států OSN, která se konala 11.–15. června 2005 a měla posoudit provádění akčního programu OSN k prevenci, potírání a vymýcení nezákonného obchodu s ručními a lehkými zbraněmi ve všech jeho aspektech, která se koná jednou za dva roky,"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_en model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, which together consist of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod v3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096) and the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_en | 56.92 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
{"language": "Cszech English", "tags": ["translation Cszech English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "s ohledem na druhou sch\u016fzku st\u00e1t\u016f OSN, kter\u00e1 se konala 11.\u201315. \u010dervna 2005 a m\u011bla posoudit prov\u00e1d\u011bn\u00ed ak\u010dn\u00edho programu OSN k prevenci, pot\u00edr\u00e1n\u00ed a vym\u00fdcen\u00ed nez\u00e1konn\u00e9ho obchodu s ru\u010dn\u00edmi a lehk\u00fdmi zbran\u011bmi ve v\u0161ech jeho aspektech, kter\u00e1 se kon\u00e1 jednou za dva roky,"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_en model
=====================================

A model for translating legal text from Czech to English. It was first released in this repository. The model was trained on three parallel corpora: JRC-Acquis, Europarl, and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_cs\_en is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from Czech to English.

### How to use

Here is how to use this model to translate legal text from Czech to English in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_cs\_en model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, which together consist of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod v3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096) and the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12486835569143295, 0.07655411958694458, -0.0022664256393909454, 0.06595730781555176, 0.09371113777160645, 0.01915023662149906, 0.0746355801820755, 0.09626976400613785, -0.07028070092201233, 0.07221494615077972, 0.0656077191233635, 0.033666275441646576, 0.0900576189160347, 0.11372318863868713, 0.04567866399884224, -0.22298526763916016, 0.023236894980072975, -0.01353636384010315, -0.0013884090585634112, 0.14310750365257263, 0.11630459874868393, -0.09294860064983368, 0.021988846361637115, -0.02385878562927246, -0.11826834082603455, -0.014958214946091175, -0.06087616831064224, -0.07080110907554626, 0.07330866158008575, 0.04068700224161148, 0.11661656200885773, 0.01906208135187626, 0.08261660486459732, -0.17352856695652008, -0.00018292470485903323, 0.08128859847784042, 0.052996497601270676, 0.047806061804294586, 0.06747805327177048, -0.011236300691962242, 0.15216957032680511, -0.018497545272111893, 0.0599757619202137, 0.024277474731206894, -0.11835752427577972, -0.12380560487508774, -0.06353069841861725, 0.04485239088535309, 0.15333634614944458, 0.14735770225524902, -0.058994099497795105, 0.07245572656393051, -0.12030932307243347, 0.06477317959070206, 0.07046812772750854, -0.2790335416793823, -0.06489069759845734, 0.029915444552898407, 0.04055941849946976, 0.08130911737680435, -0.05480080842971802, -0.04407140240073204, 0.03542986512184143, 0.025493044406175613, 0.01010530162602663, -0.023212941363453865, -0.0029455756302922964, -0.0065839155577123165, -0.17065195739269257, -0.09979856759309769, 0.16770808398723602, 0.0034119300544261932, -0.07250073552131653, -0.09533922374248505, -0.037275105714797974, -0.1574726551771164, 0.0063082193955779076, -0.04450783133506775, 0.03846075013279915, -0.00283506466075778, 0.026241684332489967, -0.019307108595967293, -0.12216924875974655, -0.1129242405295372, 0.03304800018668175, 0.09013441205024719, 0.09263111650943756, -0.007841712795197964, 0.0042295511811971664, 0.1639222502708435, 0.02512763813138008, -0.09084256738424301, -0.01999523863196373, 0.0027474279049783945, -0.12250496447086334, -0.021922755986452103, -0.03005073592066765, -0.13661611080169678, -0.06847340613603592, 0.0887613371014595, -0.017869092524051666, 0.06177828460931778, 0.04090696573257446, 0.044342104345560074, 0.013414996676146984, 0.17079932987689972, -0.07777855545282364, -0.05204417556524277, -0.060700424015522, 0.06023794040083885, -0.062423065304756165, 0.01686703972518444, -0.01885155402123928, -0.0018186112865805626, 0.06353794038295746, 0.07655242830514908, -0.084504634141922, 0.01129650417715311, -0.04491865634918213, -0.021911799907684326, 0.04804057627916336, -0.11966216564178467, -0.03823896870017052, 0.005396462511271238, -0.10308527946472168, -0.03312304615974426, 0.08415114879608154, -0.012145884335041046, -0.1307573914527893, 0.055592283606529236, -0.033600155264139175, -0.013622771948575974, -0.13146471977233887, -0.0961487740278244, -0.031307969242334366, -0.06537896394729614, -0.045191772282123566, -0.06301748007535934, -0.15826188027858734, -0.10416713356971741, 0.06540250033140182, -0.04621979594230652, -0.04875179007649422, -0.09120000153779984, -0.018082160502672195, -0.0021939303260296583, -0.034185104072093964, 0.12479858100414276, -0.02349894680082798, 0.09531661868095398, 0.01201940793544054, 0.042482469230890274, 0.15099526941776276, 0.07376318424940109, -0.10442696511745453, 0.011559445410966873, -0.09588618576526642, 0.1705971360206604, -0.019828449934720993, 0.003144480986520648, -0.1475832164287567, -0.07716329395771027, -0.05859922245144844, 
0.06537564843893051, 0.09561556577682495, 0.12858158349990845, -0.15090462565422058, -0.01154625415802002, 0.21293586492538452, -0.08568045496940613, -0.042353883385658264, 0.10232611000537872, -0.044315654784440994, 0.1394362449645996, 0.08880637586116791, 0.16730670630931854, 0.05387387052178383, -0.07736630737781525, -0.0011555630480870605, -0.03374315798282623, -0.0012460211291909218, -0.0017004190012812614, 0.08448336273431778, -0.051412034779787064, -0.0712403729557991, -0.005869175773113966, -0.09374772757291794, 0.03316677361726761, -0.06535737961530685, -0.06487403810024261, 0.016949303448200226, -0.05921788886189461, -0.03893633186817169, 0.061118144541978836, 0.052538082003593445, -0.04964364320039749, -0.1247936561703682, -0.001250878325663507, 0.10020732879638672, -0.06873509287834167, 0.027668152004480362, -0.06128499656915665, -0.04775800183415413, -0.07428252696990967, -0.011582237668335438, -0.17109674215316772, 0.040774013847112656, 0.048113010823726654, -0.009509148076176643, 0.047691673040390015, 0.04532496631145477, 0.03577960282564163, 0.05555829778313637, -0.0019744455348700285, -0.059868600219488144, -0.03813674673438072, -0.04023652523756027, -0.1330229938030243, -0.10632386803627014, -0.03431900218129158, -0.0224318765103817, 0.10278034955263138, -0.18599733710289001, 0.034720033407211304, -0.09692525118589401, 0.044840145856142044, -0.015227993950247765, -0.05015481263399124, 0.03948196768760681, 0.03851063549518585, 0.027453143149614334, -0.06165120005607605, 0.04896441102027893, 0.03278466686606407, 0.0256586242467165, 0.09196405112743378, -0.1102612093091011, -0.1575670689344406, 0.08690327405929565, 0.05186127498745918, -0.147628054022789, 0.009056800976395607, -0.04252772405743599, -0.06256413459777832, -0.05419287458062172, 0.007483994588255882, 0.2485976368188858, 0.014905110001564026, 0.14794644713401794, -0.10614107549190521, -0.04622019827365875, -0.00968741625547409, -0.02169184945523739, 0.019427219405770302, 0.14589901268482208, 0.05899312347173691, -0.09181258082389832, 0.04619895666837692, 0.01980607770383358, -0.02274540439248085, 0.15151292085647583, -0.002362283179536462, -0.1297488957643509, 0.01226618979126215, 0.06702269613742828, -0.03297009319067001, 0.09164991229772568, -0.14236952364444733, -0.001289721461944282, 0.019377725198864937, 0.04840858653187752, 0.05492265895009041, -0.16080300509929657, 0.02592627890408039, 0.06069575995206833, -0.05190150439739227, 0.013835604302585125, -0.016101133078336716, -0.04765329882502556, 0.07313269376754761, 0.008875204250216484, -0.019463812932372093, -0.018076468259096146, -0.03926048427820206, -0.14080704748630524, 0.20685231685638428, -0.06606505811214447, -0.1385495811700821, -0.10554768145084381, 0.09353817999362946, 0.07301092892885208, -0.007558879442512989, 0.041542794555425644, -0.08375385403633118, -0.04815882816910744, -0.09841319918632507, 0.09807969629764557, -0.0544198714196682, -0.052142608910799026, -0.09108105301856995, -0.011147313751280308, -0.008915525861084461, -0.13079270720481873, 0.034693967550992966, -0.044631149619817734, -0.08494091033935547, 0.006660758517682552, -0.05143871530890465, 0.06874790042638779, 0.1601945161819458, -0.004869801923632622, 0.022759508341550827, -0.013359228149056435, 0.1870141476392746, -0.1288938969373703, 0.01062761154025793, 0.07358698546886444, 0.034091342240571976, 0.005655460059642792, 0.09791723638772964, -0.008428177796304226, -0.08575352281332016, 0.047439105808734894, 0.0454661026597023, -0.02387258969247341, -0.28308844566345215, 
-0.025753548368811607, -0.024034347385168076, -0.03738522157073021, 0.10914204269647598, 0.04072028771042824, 0.013812253251671791, 0.04611290991306305, -0.023343075066804886, -0.007851554080843925, 0.02493029460310936, 0.051388002932071686, -0.023287706077098846, 0.00444516958668828, 0.07559438049793243, -0.049916598945856094, -0.009908303618431091, 0.04177531599998474, 0.007284990046173334, 0.24938860535621643, -0.05356622114777565, 0.1256883442401886, 0.08156242966651917, 0.1152685359120369, 0.011985382996499538, 0.07755869626998901, -0.029433948919177055, 0.01746487244963646, -0.0031751934438943863, -0.026496365666389465, -0.07460509240627289, 0.03643914312124252, 0.019535044208168983, 0.015720825642347336, -0.1056666299700737, -0.01842109113931656, 0.014011824503540993, 0.33705398440361023, 0.06483324617147446, -0.22866925597190857, -0.06530281901359558, 0.0012456594267860055, -0.07373465597629547, -0.09102147072553635, 0.06468411535024643, 0.08593856543302536, -0.1367003321647644, -0.024613717570900917, -0.0277863796800375, 0.10013125091791153, -0.10145421326160431, -0.05384795367717743, 0.04794321581721306, 0.043466173112392426, -0.008993574418127537, 0.09091212600469589, -0.29229700565338135, 0.19996005296707153, -0.012593811377882957, 0.12821242213249207, -0.01361047476530075, 0.02251308411359787, -0.04989037662744522, 0.0033141695894300938, 0.1561482846736908, -0.002016170183196664, 0.02033388428390026, -0.0735747218132019, -0.10542833060026169, 0.02164415456354618, 0.04030933231115341, -0.06720302253961563, 0.09209340065717697, 0.024060245603322983, 0.030637936666607857, -0.009959890507161617, -0.10188519954681396, -0.13188087940216064, -0.10411959141492844, -0.010665256530046463, -0.08109112828969955, 0.05669358745217323, -0.04229164868593216, -0.05516282096505165, -0.011082892306149006, 0.14369839429855347, -0.12023396044969559, -0.0932416245341301, -0.09446432441473007, 0.017568854615092278, 0.09795442223548889, -0.047607261687517166, -0.002448247978463769, 0.015085064806044102, 0.01358887180685997, 0.0025549898855388165, 0.016302440315485, 0.09272447228431702, -0.06268274039030075, -0.12938454747200012, -0.041162412613630295, 0.13619999587535858, 0.12409678846597672, 0.06290650367736816, -0.03172566369175911, 0.013313474133610725, -0.016175858676433563, -0.07533229142427444, 0.005972592160105705, 0.009118273854255676, 0.05246398597955704, 0.02523755095899105, -0.06518556922674179, -0.009522960521280766, -0.09745731949806213, -0.05360301211476326, 0.10771842300891876, 0.14739876985549927, -0.048168618232011795, 0.06644342839717865, 0.17452280223369598, -0.10966824740171432, -0.16168679296970367, 0.015807034447789192, 0.09638264030218124, 0.08197064697742462, -0.07467091083526611, -0.21665753424167633, 0.026512587442994118, 0.08190194517374039, 0.007371809799224138, -0.017738180235028267, -0.41183263063430786, -0.1347469836473465, 0.11082641780376434, 0.09102658927440643, -0.04285033047199249, -0.08766447007656097, -0.018973611295223236, 0.04632487893104553, -0.044398870319128036, 0.08170904964208603, -0.018628912046551704, 0.09334059059619904, 0.023044966161251068, -0.04149262234568596, 0.04100441560149193, -0.05858859792351723, 0.11129843443632126, 0.07029591500759125, 0.04537421092391014, -0.04272739589214325, 0.03290770202875137, -0.0019263641443103552, -0.021005691960453987, 0.1565553992986679, 0.02591785043478012, 0.0389254055917263, -0.21018628776073456, -0.06057517230510712, -0.08576121181249619, -0.010695870965719223, -0.06879038363695145, -0.049864668399095535, 
-0.04628117382526398, 0.08479870855808258, 0.04820721969008446, -0.0064185746014118195, -0.015705397352576256, -0.07303344458341599, -0.01649516075849533, 0.0838238075375557, 0.10616659373044968, 0.06379055976867676, -0.08521714806556702, 0.014901953749358654, 0.037530891597270966, 0.09581279009580612, -0.15703031420707703, -0.026180453598499298, 0.12321379035711288, -0.013544072397053242, 0.13753384351730347, -0.009121278300881386, -0.1455235481262207, 0.006535816006362438, 0.03176990896463394, -0.09130874276161194, -0.11914395540952682, -0.008455374278128147, -0.05582212284207344, -0.04947252571582794, -0.05312331020832062, 0.06598247587680817, -0.11383117735385895, -0.02146781235933304, -0.01632876507937908, 0.03493672236800194, -0.06885706633329391, 0.21896475553512573, 0.047686897218227386, 0.06259534507989883, -0.06558702886104584, 0.12852105498313904, 0.10763032734394073, -0.11374349892139435, 0.02093195915222168, 0.1731211543083191, -0.09911224991083145, -0.05399277061223984, 0.015378483571112156, 0.13110561668872833, -0.010432400740683079, -0.06982214003801346, -0.051888082176446915, -0.04794599115848541, 0.06179466471076012, 0.006618744693696499, 0.04376736283302307, 0.02356008253991604, -0.03344428539276123, -0.0011648132931441069, -0.12676477432250977, 0.06722531467676163, 0.09840155392885208, 0.0006310584140010178, -0.019742418080568314, 0.19130106270313263, 0.05592246726155281, 0.05591729283332825, -0.007257555145770311, -0.04078809171915054, -0.045187026262283325, 0.0733862891793251, -0.009694990701973438, -0.025094641372561455, -0.056142695248126984, -0.0154255460947752, -0.02449183166027069, -0.009844646789133549, -0.0002662709157448262, 0.016473600640892982, -0.06872934848070145, -0.029024669900536537, -0.03738606348633766, 0.044213470071554184, -0.06841214746236801, -0.006537708453834057, 0.0007372845429927111, -0.06317739933729172, 0.07366564124822617, 0.025599442422389984, -0.00660876277834177, 0.016011178493499756, -0.009900728240609169, 0.07172580063343048, -0.028681982308626175, 0.0019484447548165917, -0.01674063131213188, -0.0841730460524559, 0.03664098307490349, -0.009186903014779091, -0.007714500650763512, -0.010086782276630402, 0.05091128870844841, -0.13438670337200165, 0.06146838888525963, -0.021483462303876877, -0.004458026494830847, -0.07664777338504791, 0.10977018624544144, 0.022256404161453247, 0.08417898416519165, 0.10889513045549393, -0.06898118555545807, 0.06840306520462036, -0.14708133041858673, -0.0396844707429409, 0.024074843153357506, 0.01823178306221962, -0.040643058717250824, -0.06359986960887909, 0.058830827474594116, -0.04938873276114464, 0.07075701653957367, 0.057511452585458755, 0.03779670596122742, 0.03498747944831848, -0.11678597331047058, 0.0021802152041345835, 0.04609587416052818, 0.0659695640206337, 0.0010251239873468876, 0.011314698494970798, 0.005448291543871164, 0.0484146848320961, -0.02054162696003914, 0.06899839639663696, 0.13683578372001648, 0.21113084256649017, 0.08906190097332001, 0.10418853163719177, -0.0534529983997345, -0.11041136085987091, -0.09300947189331055, 0.09975813329219818, -0.01691139116883278, 0.02442588098347187, -0.025434931740164757, 0.1553155481815338, 0.09629061073064804, -0.15816226601600647, 0.07306109368801117, 0.005670283455401659, -0.10745064169168472, -0.08730551600456238, -0.05985604226589203, -0.031599272042512894, -0.06472906470298767, -0.0058041722513735294, -0.08989663422107697, 0.026569144800305367, 0.0753050446510315, 0.07240498811006546, -0.03410406783223152, 0.1423451453447342, 
0.014418518170714378, -0.06482871621847153, 0.09373543411493301, -0.01378316804766655, 0.10107006132602692, -0.09583014994859695, 0.007864863611757755, 0.022095413878560066, -0.014951076358556747, 0.06511882692575455, 0.010869495570659637, -0.04610338434576988, 0.035512637346982956, 0.042557135224342346, -0.05213058739900589, -0.00036643879138864577, 0.04290815070271492, 0.12306354939937592, 0.10092947632074356, 0.07295607030391693, -0.03949868679046631, -0.028220225125551224, 0.2058110386133194, -0.036883722990751266, -0.09731120616197586, -0.1691773682832718, 0.1618090122938156, 0.07367247343063354, 0.012781417928636074, 0.037769533693790436, -0.09567923098802567, -0.002765282755717635, 0.2409273386001587, 0.13061600923538208, -0.05253254249691963, -0.04292453080415726, 0.03221983462572098, -0.007260303944349289, 0.01046105194836855, 0.11184844374656677, 0.04871503636240959, 0.1963690221309662, -0.10834025591611862, 0.03558194637298584, -0.07724715024232864, -0.0657908022403717, -0.012528729625046253, 0.1558336466550827, 0.0068315304815769196, -0.01304231584072113, -0.05558893457055092, 0.10443692654371262, -0.006740869954228401, -0.18242505192756653, 0.05594761669635773, -0.06223892420530319, -0.1293504387140274, -0.02195657789707184, 0.000003614924480643822, 0.0066606430336833, 0.04714931920170784, -0.005567194893956184, -0.002440853975713253, 0.13378815352916718, 0.03148820251226425, -0.05813290923833847, -0.14729982614517212, 0.0756470263004303, -0.020910903811454773, 0.1693122535943985, -0.0013302816078066826, 0.08758892118930817, 0.07004106044769287, 0.024652555584907532, -0.10584279149770737, 0.08138146996498108, 0.04954414442181587, 0.030624281615018845, 0.04926368594169617, 0.11845531314611435, -0.02423974685370922, 0.08331514149904251, 0.028207115828990936, -0.10948867350816727, 0.061559081077575684, -0.13664492964744568, -0.04560288414359093, -0.16451315581798553, 0.03175600990653038, -0.03937345743179321, 0.12866578996181488, 0.2128879725933075, -0.027971547096967697, 0.0076535106636583805, -0.0636155977845192, 0.034709829837083817, -0.028308698907494545, 0.1479770839214325, 0.00913106556981802, -0.18263494968414307, 0.011367389000952244, -0.06667157262563705, 0.02944173477590084, -0.2172011137008667, -0.023716256022453308, 0.011559797450900078, -0.08468279242515564, -0.027616681531071663, 0.12676753103733063, 0.04463278874754906, 0.0645914226770401, -0.03133999556303024, -0.01827950030565262, -0.012011903338134289, 0.14503264427185059, -0.13289357721805573, -0.09748079627752304 ]
null
null
transformers
# legal_t5_small_trans_cs_en_small_finetuned model

Model for translating legal text from Czech to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is first pretrained on all of the translation data with an unsupervised task. The model is then trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_cs_en_small_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_cs_en_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to English.

### How to use

Here is how to use this model to translate legal text from Czech to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_en_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_en", do_lower_case=False, skip_special_tokens=True),
    device=0
)

cs_text = "4) Seznam užívaných výrobků s obsahem PFOS: Kvůli značnému poklesu výroby PFOS po roce 2000 představují největší zdroj emisí patrně dřívější využití, která však nadále reálně existují."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_en_small_finetuned model (covering both the supervised task, which involved only the corresponding language pair, and the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_en_small_finetuned | 56.936 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
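The table above reports a corpus-level BLEU score, but the card does not say which BLEU implementation produced it. Below is a minimal sketch of scoring the pipeline's output with the `sacrebleu` package (an assumption, not the authors' evaluation code); the English reference sentence is hypothetical and is shown only to illustrate the API:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline
import sacrebleu  # assumed evaluation dependency; the card does not name one

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_en_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_en", do_lower_case=False, skip_special_tokens=True),
)

# One Czech source sentence (taken from the card) and one hypothetical reference.
cs_sources = [
    "4) Seznam užívaných výrobků s obsahem PFOS: Kvůli značnému poklesu výroby "
    "PFOS po roce 2000 představují největší zdroj emisí patrně dřívější využití, "
    "která však nadále reálně existují."
]
en_references = [[
    "4) List of products in use containing PFOS: Owing to the sharp decline in "
    "PFOS production after 2000, earlier uses, which still genuinely exist, are "
    "probably the largest source of emissions."
]]  # hypothetical reference translation, for illustration only

outputs = pipeline(cs_sources, max_length=512)
hypotheses = [out["translation_text"] for out in outputs]

# corpus_bleu takes the list of hypotheses and a list of reference streams.
print(sacrebleu.corpus_bleu(hypotheses, en_references).score)
```

A single sentence pair only exercises the API; to reproduce a figure like 56.936, the hypotheses and references would have to come from the held-out test split of the parallel corpora listed above.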
{"language": "Cszech English", "tags": ["translation Cszech English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "4) Seznam u\u017e\u00edvan\u00fdch v\u00fdrobk\u016f s obsahem PFOS: Kv\u016fli zna\u010dn\u00e9mu poklesu v\u00fdroby PFOS po roce 2000 p\u0159edstavuj\u00ed nejv\u011bt\u0161\u00ed zdroj emis\u00ed patrn\u011b d\u0159\u00edv\u011bj\u0161\u00ed vyu\u017eit\u00ed, kter\u00e1 v\u0161ak nad\u00e1le re\u00e1ln\u011b existuj\u00ed."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_en_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_en\_small\_finetuned model ======================================================= Model on translating legal text from Cszech to English. It was first released in this repository. This model is first pretrained all the translation data over some unsupervised task. Then the model is trained on three parallel corpus from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_en\_small\_finetuned is initially pretrained on unsupervised task with the all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_cs\_en\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Cszech to English. ### How to use Here is how to use this model to translate legal text from Cszech to English in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_en\_small\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used for translation test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06924986839294434, 0.08741644769906998, -0.003187847090885043, 0.07596366107463837, 0.05280570313334465, 0.01582030951976776, 0.034614983946084976, 0.09520455449819565, -0.024033987894654274, 0.08437865227460861, 0.04087269678711891, -0.024229703471064568, 0.0686783641576767, 0.03161012381315231, 0.06671103835105896, -0.21400578320026398, 0.0022894132416695356, -0.03813338652253151, -0.0023797100875526667, 0.10254097729921341, 0.09098499268293381, -0.06735192984342575, 0.04573240876197815, -0.034510888159275055, -0.029622692614793777, 0.017570801079273224, -0.09591072797775269, -0.04491414502263069, 0.08005405217409134, 0.08893197774887085, 0.08576220273971558, -0.016817402094602585, 0.06622112542390823, -0.19780012965202332, -0.00019445587531663477, 0.07775630056858063, 0.0008718200260773301, 0.04605657607316971, 0.1308879256248474, 0.0006555324653163552, 0.16233696043491364, -0.047381337732076645, 0.02943185530602932, 0.04359395056962967, -0.1239342913031578, -0.10867130756378174, -0.048791877925395966, 0.02939353883266449, 0.09192219376564026, 0.13410094380378723, -0.055215418338775635, 0.06639476120471954, -0.052794430404901505, 0.07610952109098434, 0.0752047523856163, -0.22153617441654205, -0.03140735253691673, 0.03236757963895798, 0.045938871800899506, 0.079404816031456, -0.0506146103143692, -0.022078951820731163, 0.05001070722937584, 0.0654570609331131, 0.02345283515751362, -0.06232848018407822, -0.06125956401228905, -0.049243856221437454, -0.13139596581459045, -0.06274451315402985, 0.15765498578548431, 0.026023294776678085, -0.04620872437953949, -0.08208907395601273, -0.062453314661979675, -0.07310990989208221, -0.008275183849036694, -0.029144803062081337, 0.02398262359201908, -0.0020822607912123203, 0.06963638216257095, -0.02031400427222252, -0.11823616176843643, -0.05849376320838928, -0.06709733605384827, 0.13041989505290985, 0.045989613980054855, 0.01568984054028988, 0.02216440625488758, 0.08427317440509796, -0.10845302045345306, -0.08432812988758087, 0.002153537003323436, 0.016538968309760094, -0.1146995946764946, -0.003485412336885929, -0.011091808788478374, -0.16367407143115997, -0.01857159659266472, 0.029822049662470818, -0.0589277483522892, 0.05696211755275726, 0.07987182587385178, 0.04269229248166084, 0.06546224653720856, 0.11524632573127747, -0.12064934521913528, -0.11239967495203018, -0.03030264936387539, 0.004285081289708614, 0.004616080783307552, 0.014054144732654095, -0.05706780403852463, -0.02610391192138195, 0.00944909080862999, 0.031049104407429695, 0.009398050606250763, 0.024757498875260353, -0.022517841309309006, -0.03418046981096268, 0.12029580771923065, -0.10930641740560532, -0.00500475661829114, 0.007017225027084351, -0.09284347295761108, -0.031092774122953415, 0.07723595947027206, -0.014303691685199738, -0.11755461245775223, 0.06948666274547577, -0.03770974278450012, -0.023226216435432434, -0.10510733723640442, -0.1642790138721466, -0.009706275537610054, 0.007419979199767113, -0.06148528680205345, -0.0996900424361229, -0.1280534565448761, -0.09634063392877579, 0.032566945999860764, -0.05520640313625336, -0.0038068091962486506, -0.06373991817235947, -0.0029602900613099337, -0.006048314273357391, -0.011047804728150368, 0.1097417026758194, -0.04078410193324089, 0.03384966030716896, 0.029762977734208107, 0.06577491015195847, 0.028926022350788116, 0.030016614124178886, -0.1257646381855011, 0.04362000524997711, -0.14335639774799347, 0.14275795221328735, -0.026814280077815056, 0.01100039854645729, -0.1279906928539276, -0.049797698855400085, 
-0.07779273390769958, 0.061752256006002426, 0.05556797236204147, 0.11767484247684479, -0.19512122869491577, -0.007743538822978735, 0.19187815487384796, -0.08386625349521637, -0.08138861507177353, 0.12107890099287033, -0.022259434685111046, 0.03576919063925743, 0.08074238896369934, 0.10861162841320038, 0.07179955393075943, -0.008870118297636509, -0.048261575400829315, 0.007927298545837402, 0.02612157352268696, 0.0761658176779747, 0.08618370443582535, -0.07357348501682281, 0.031639404594898224, 0.016853492707014084, 0.010850292630493641, 0.0019884821958839893, -0.022551337257027626, -0.03428368270397186, 0.008068004623055458, -0.04890728369355202, -0.04330327734351158, 0.03150505945086479, 0.00813900213688612, -0.056017182767391205, -0.0860990360379219, -0.041874248534440994, 0.10822536796331406, -0.051273319870233536, 0.01863301731646061, -0.013778799213469028, -0.05718163400888443, -0.10859418660402298, 0.013827192597091198, -0.17198246717453003, -0.01234083529561758, 0.03480219841003418, -0.0644453763961792, 0.11414764076471329, 0.06243537738919258, 0.049605969339609146, 0.09005406498908997, -0.05684483423829079, -0.03422709181904793, 0.010019009932875633, -0.020805571228265762, -0.10905494540929794, -0.12040488421916962, -0.04534495249390602, -0.022392036393284798, -0.004553505685180426, -0.12218987941741943, -0.0023291590623557568, -0.06683599203824997, 0.08661244064569473, 0.005982842296361923, -0.025244519114494324, 0.03830422833561897, 0.0645017996430397, -0.027202172204852104, -0.03624198958277702, 0.022707577794790268, -0.007724374998360872, -0.047741182148456573, 0.09349124878644943, -0.17038822174072266, -0.11875016987323761, 0.08109825849533081, 0.005483651999384165, -0.12291930615901947, -0.00497285695746541, -0.016647716984152794, -0.06432659924030304, -0.0574747659265995, -0.05680302157998085, 0.24697019159793854, 0.03452257812023163, 0.1396445482969284, -0.10744136571884155, -0.031060634180903435, 0.005099392030388117, -0.02598797343671322, 0.007358092349022627, 0.1718411147594452, 0.07635057717561722, -0.14852896332740784, 0.081004798412323, 0.007049942389130592, -0.028350338339805603, 0.09820425510406494, 0.05982239916920662, -0.10246174037456512, -0.008815043605864048, 0.03645655885338783, -0.0029892863240092993, 0.05978170409798622, -0.09623304009437561, -0.007698157336562872, 0.0291341133415699, 0.05263347178697586, 0.058924298733472824, -0.08659496903419495, 0.06542588025331497, 0.06670723855495453, -0.02609911933541298, 0.04246537387371063, -0.05397510528564453, -0.029954474419355392, 0.0965886265039444, 0.014832718297839165, -0.025792652741074562, -0.047861479222774506, -0.05246149003505707, -0.11468984186649323, 0.19728708267211914, -0.06305447220802307, -0.20586825907230377, -0.11535043269395828, 0.07336625456809998, -0.02766261249780655, 0.034637700766325, 0.0255623459815979, -0.03598487004637718, -0.07238789647817612, -0.13061681389808655, 0.08297110348939896, -0.08089028298854828, -0.049059972167015076, -0.12598416209220886, 0.02557661011815071, 0.010358146391808987, -0.1194506287574768, 0.0271619725972414, 0.0005039275856688619, -0.024117616936564445, 0.0009303584229201078, -0.029447786509990692, 0.11011449247598648, 0.13977795839309692, -0.012383989989757538, -0.03364459425210953, -0.004215192049741745, 0.13244123756885529, -0.09682919085025787, 0.06276121735572815, 0.0631282702088356, 0.03017856739461422, 0.02611512318253517, 0.14491355419158936, 0.02303493581712246, -0.06073641777038574, 0.046757593750953674, 0.05326388031244278, -0.020814409479498863, 
-0.2261434942483902, -0.10681097954511642, -0.06940779834985733, 0.0216237660497427, 0.11790492385625839, 0.04175928607583046, -0.05071476846933365, 0.024683352559804916, -0.0635993480682373, 0.05343342572450638, -0.009183581918478012, 0.05499795451760292, 0.022685788571834564, -0.009703963063657284, 0.08070489019155502, -0.060236915946006775, -0.043435536324977875, 0.09660842269659042, 0.026906190440058708, 0.18890178203582764, -0.052742283791303635, 0.2099774330854416, 0.05511761084198952, 0.034035056829452515, 0.017012599855661392, 0.0541076585650444, -0.041198115795850754, 0.026487717404961586, -0.027731293812394142, -0.061398331075906754, -0.03497633710503578, 0.06107807904481888, 0.021652603521943092, 0.019860530272126198, -0.051289621740579605, -0.061852503567934036, 0.04708556458353996, 0.2087673395872116, 0.06036300212144852, -0.19482938945293427, -0.05472458153963089, 0.011612631380558014, -0.0748165100812912, -0.06227271631360054, 0.019587857648730278, 0.1492285579442978, -0.09215977787971497, 0.013118896633386612, 0.0199118759483099, 0.12095628678798676, -0.13254153728485107, -0.014761006459593773, 0.03327197954058647, 0.041250284761190414, -0.01505345106124878, 0.1292792707681656, -0.24165655672550201, 0.14654670655727386, 0.015403617173433304, 0.053241875022649765, -0.03865138813853264, 0.01277255080640316, -0.03511015325784683, 0.0022090363781899214, 0.11609790474176407, 0.014199923723936081, -0.011072422377765179, -0.12058742344379425, -0.10160832852125168, 0.003917127847671509, 0.0644945278763771, -0.0708380937576294, 0.09923437982797623, 0.0511728897690773, 0.013088262639939785, -0.018586190417408943, 0.01713869348168373, -0.06221611425280571, -0.15518765151500702, 0.006076172459870577, -0.024934081360697746, -0.021172966808080673, -0.015088397078216076, -0.023362204432487488, -0.06307343393564224, 0.1884758323431015, -0.10576427727937698, -0.07583068311214447, -0.07559230178594589, -0.0008335014572367072, 0.14015531539916992, -0.07505340129137039, -0.0007952455198392272, -0.0019513348815962672, 0.048961855471134186, -0.023269454017281532, -0.02689882181584835, 0.08524183183908463, -0.0826108381152153, -0.10657714307308197, -0.07119591534137726, 0.11646054685115814, 0.07378291338682175, 0.052211739122867584, -0.01814303547143936, 0.033680662512779236, -0.018987217918038368, -0.11369965225458145, -0.02278340421617031, 0.03164080157876015, 0.11367163062095642, 0.06926269829273224, -0.042054541409015656, -0.03286594897508621, -0.07109706848859787, -0.05763646960258484, 0.08055474609136581, 0.16303609311580658, -0.039766404777765274, 0.026998009532690048, 0.19495220482349396, -0.10466498881578445, -0.18381142616271973, -0.05999915674328804, 0.07235150039196014, 0.07510387152433395, -0.014555970206856728, -0.1672690510749817, 0.05168207734823227, 0.09312084317207336, 0.003955886699259281, 0.05940166860818863, -0.3762013912200928, -0.14017869532108307, 0.0809037983417511, 0.03476370498538017, -0.04919873923063278, -0.12832698225975037, -0.05884220078587532, -0.055961500853300095, -0.023876357823610306, 0.09369628131389618, -0.03988191485404968, 0.09819214046001434, -0.003927338402718306, 0.020045481622219086, 0.045746710151433945, -0.03947485238313675, 0.12814763188362122, 0.03667719289660454, 0.04281306639313698, -0.06246711686253548, 0.05853256955742836, 0.007687787991017103, -0.017626095563173294, 0.1531985104084015, -0.049025215208530426, 0.05672440305352211, -0.14604295790195465, -0.053498074412345886, -0.07096470147371292, 0.018827516585588455, -0.04013240337371826, 
-0.06647387146949768, -0.0572834275662899, 0.04221055656671524, 0.04646996036171913, -0.00042816196219064295, -0.0014298580354079604, -0.06058672070503235, 0.0057204305194318295, 0.17127050459384918, 0.12310601025819778, 0.021307513117790222, -0.11367136985063553, 0.02715528942644596, 0.0032234732061624527, 0.08003176748752594, -0.0782642737030983, 0.005787686910480261, 0.13415783643722534, 0.02295767143368721, 0.11793913692235947, -0.012515264563262463, -0.14414335787296295, -0.01181523036211729, 0.04899577423930168, -0.0937141478061676, -0.12665225565433502, -0.003950485028326511, 0.04325520619750023, -0.08434167504310608, -0.040586668998003006, 0.09454658627510071, -0.09219416230916977, -0.01825200393795967, 0.008937510661780834, 0.031719569116830826, -0.028373731300234795, 0.19982238113880157, 0.042136553674936295, 0.046192020177841187, -0.06316868960857391, 0.11375953257083893, 0.13742125034332275, -0.1502363085746765, 0.012236521579325199, 0.1928092986345291, -0.08609885722398758, -0.06801202148199081, 0.014144130982458591, 0.12354078143835068, -0.022810732945799828, -0.059263717383146286, -0.02353835292160511, -0.05371979624032974, 0.030295416712760925, 0.006699096877127886, 0.04251552000641823, 0.04201413691043854, -0.01562443282455206, -0.014084648340940475, -0.0966191217303276, 0.09588965028524399, 0.07394258677959442, 0.037220299243927, -0.023076048120856285, 0.141676127910614, 0.030635906383395195, -0.014912684448063374, -0.01217033714056015, 0.0038886545225977898, -0.06570734083652496, 0.01647275686264038, -0.07213489711284637, 0.01646852307021618, -0.052483733743429184, -0.016199836507439613, -0.01837035082280636, 0.0031016007997095585, -0.01633691042661667, -0.003038129536435008, -0.030784213915467262, -0.049638088792562485, -0.03758871555328369, 0.025688741356134415, -0.0903850868344307, -0.036247070878744125, 0.018760204315185547, -0.023738259449601173, 0.050688400864601135, 0.0004641101404558867, 0.009192699566483498, -0.005692614708095789, -0.017494600266218185, 0.06669744849205017, 0.014978207647800446, 0.04552246257662773, -0.012038650922477245, -0.08032316714525223, 0.009990769438445568, 0.01782277040183544, -0.0019152350723743439, -0.013033872470259666, 0.0208260640501976, -0.15134109556674957, 0.0003632156003732234, -0.012294899672269821, -0.029418710619211197, -0.08143581449985504, 0.07968983054161072, 0.04065485671162605, 0.060573190450668335, 0.11034579575061798, -0.07036224752664566, 0.08169478178024292, -0.17084868252277374, -0.006727475672960281, 0.02066875249147415, 0.01087202038615942, -0.03173073008656502, -0.003910005558282137, 0.04952854663133621, -0.06512702256441116, 0.13608010113239288, 0.02368486113846302, 0.059238459914922714, 0.024008890613913536, -0.06031130254268646, -0.001185871660709381, 0.025912189856171608, 0.07762981206178665, -0.024383075535297394, -0.029463451355695724, -0.05572851374745369, 0.08633086830377579, -0.0003868871135637164, 0.058835554867982864, 0.05557643622159958, 0.12018467485904694, 0.12180943042039871, 0.04069739580154419, -0.009196722880005836, -0.10631438344717026, -0.056861937046051025, 0.047813206911087036, -0.011174949817359447, 0.0414600633084774, -0.01627029851078987, 0.1092861220240593, 0.12530934810638428, -0.1378207504749298, 0.12250081449747086, -0.005691755563020706, -0.08103522658348083, -0.041280005127191544, -0.1370178461074829, -0.04422342777252197, -0.010656900703907013, -0.045440565794706345, -0.11169102787971497, 0.010929577052593231, 0.08446437865495682, 0.041402336210012436, -0.027501221746206284, 
0.13609865307807922, -0.04786304756999016, -0.10900705307722092, 0.04463687539100647, 0.01708369515836239, 0.08763913810253143, 0.02266065590083599, 0.03025147318840027, 0.056412503123283386, 0.004013971425592899, 0.04784362390637398, 0.054962459951639175, -0.01691250130534172, 0.005340066738426685, 0.011233057826757431, -0.061105940490961075, -0.03805762901902199, 0.010163310915231705, 0.07780846953392029, 0.19420240819454193, 0.055814001709222794, -0.059594638645648956, -0.02361024171113968, 0.19459982216358185, -0.06490693986415863, -0.0796460509300232, -0.09858998656272888, 0.21415212750434875, 0.032329536974430084, 0.025124678388237953, -0.0008739667828194797, -0.1031843051314354, -0.0062324716709554195, 0.15693138539791107, 0.18566399812698364, -0.06099264323711395, -0.026241298764944077, 0.02199074812233448, -0.005710647441446781, 0.028308080509305, 0.041696712374687195, 0.028168488293886185, 0.26980265974998474, -0.08267249912023544, 0.09100215137004852, -0.04129379987716675, 0.005912253633141518, -0.0071628279983997345, 0.16474634408950806, 0.002100052312016487, 0.02401328831911087, -0.07051800936460495, 0.07203230261802673, -0.02803695760667324, -0.17465752363204956, 0.013762925751507282, -0.08541187644004822, -0.1123940721154213, 0.011253866367042065, -0.011106974445283413, 0.05964653566479683, 0.06546775251626968, 0.00914233922958374, 0.031662289053201675, 0.06885873526334763, 0.01161406934261322, -0.11648302525281906, -0.12065692245960236, 0.01251407153904438, -0.026796946302056313, 0.13350407779216766, 0.003334902925416827, 0.13189275562763214, 0.07947835326194763, 0.004371020942926407, -0.10545189678668976, 0.09114465117454529, 0.02514480985701084, 0.027679191902279854, 0.08473910391330719, 0.10930415987968445, 0.007043936755508184, 0.07586294412612915, 0.045741934329271317, -0.08009712398052216, 0.02404879778623581, -0.04144379496574402, -0.013009986840188503, -0.135393425822258, 0.07843531668186188, -0.036942318081855774, 0.15038859844207764, 0.1802409440279007, -0.021777553483843803, -0.02402978576719761, -0.04324770346283913, 0.006222284398972988, -0.014592322520911694, 0.0961124375462532, -0.012335321865975857, -0.15473124384880066, 0.022486288100481033, -0.07911576330661774, 0.03402969241142273, -0.2651348114013672, -0.025592641904950142, 0.01986692100763321, -0.058888453990221024, -0.004807641729712486, 0.07818043231964111, 0.04376433044672012, 0.05156315490603447, -0.05373414605855942, -0.07644101977348328, 0.007168157026171684, 0.10440651327371597, -0.11249774694442749, -0.11965226382017136 ]
null
null
transformers
# legal_t5_small_trans_cs_es model

Model for translating legal text from Czech to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_cs_es is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to Spanish.

### How to use

Here is how to use this model to translate legal text from Czech to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_es"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_es", do_lower_case=False, skip_special_tokens=True),
    device=0
)

cs_text = "k návrhu směrnice Evropského parlamentu a Rady o bezpečnosti hraček"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_es model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_es | 50.77 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
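The training procedure above names AdaFactor with an inverse square root learning-rate schedule. Here is a minimal sketch of that optimizer setup, using the `Adafactor` implementation shipped with `transformers`; the authors' actual TPU training script is not published here, so treat the hyperparameters as assumptions:

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_es")

# With relative_step=True, Adafactor computes its own time-dependent
# (inverse square root) learning rate, matching the schedule described above;
# warmup_init=True adds the warm-up behaviour commonly paired with it.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,  # the internal schedule supplies the learning rate
)

# AdafactorSchedule does not change the optimizer; it only exposes the
# internal rate, e.g. for logging or for passing to a Trainer.
lr_scheduler = AdafactorSchedule(optimizer)
```

Because the schedule lives inside the optimizer, no external scheduler is required; `AdafactorSchedule` is only a convenience for inspecting the current learning rate during training.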
{"language": "Cszech Spanish", "tags": ["translation Cszech Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "k n\u00e1vrhu sm\u011brnice Evropsk\u00e9ho parlamentu a Rady o bezpe\u010dnosti hra\u010dek"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_es model ===================================== Model on translating legal text from Cszech to Spanish. It was first released in this repository. This model is trained on three parallel corpus from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_es is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Cszech to Spanish. ### How to use Here is how to use this model to translate legal text from Cszech to Spanish in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used for translation test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.13737338781356812, 0.0814686194062233, -0.002636893419548869, 0.07297123968601227, 0.09778619557619095, 0.01323684398084879, 0.06238304451107979, 0.09501980990171432, -0.07316140830516815, 0.07940979301929474, 0.05745913088321686, 0.049834106117486954, 0.08681875467300415, 0.09931902587413788, 0.03306611627340317, -0.21918462216854095, 0.023198379203677177, -0.019148465245962143, -0.006906435824930668, 0.1369260549545288, 0.10956846177577972, -0.08649784326553345, 0.017688481137156487, -0.029623717069625854, -0.11670311540365219, -0.004122210666537285, -0.05275131016969681, -0.0883859321475029, 0.0781426727771759, 0.045050352811813354, 0.12228832393884659, 0.02554965950548649, 0.07950791716575623, -0.1546294391155243, -0.0035339936148375273, 0.07619277387857437, 0.04642994701862335, 0.049424801021814346, 0.08189687877893448, -0.019535740837454796, 0.153561532497406, -0.02584688924252987, 0.05805075168609619, 0.023603051900863647, -0.1366880089044571, -0.1302044540643692, -0.0702323466539383, 0.02475019171833992, 0.14852167665958405, 0.1417304426431656, -0.05283764377236366, 0.06578698009252548, -0.12018077820539474, 0.049317922443151474, 0.0703921765089035, -0.2728644609451294, -0.06858731061220169, 0.023656971752643585, 0.04194675013422966, 0.08718154579401016, -0.039805155247449875, -0.03540053963661194, 0.04428129270672798, 0.017584074288606644, 0.004031314980238676, -0.03126194700598717, -0.019190136343240738, -0.008172841742634773, -0.16456936299800873, -0.11071775108575821, 0.15595640242099762, -0.005234896671026945, -0.06769528985023499, -0.0914037749171257, -0.04127956181764603, -0.1465287208557129, 0.010208544321358204, -0.04699039086699486, 0.04378220811486244, -0.008923809975385666, 0.03513659909367561, -0.01000408548861742, -0.11742516607046127, -0.11288704723119736, 0.03110317699611187, 0.09061506390571594, 0.08705590665340424, -0.017566652968525887, 0.017324315384030342, 0.16900630295276642, 0.03676560893654823, -0.09656961262226105, -0.011375939473509789, 0.013432720676064491, -0.11854627728462219, -0.01699153520166874, -0.032073602080345154, -0.14192695915699005, -0.0610433928668499, 0.07334300875663757, -0.039791226387023926, 0.056412987411022186, 0.041012946516275406, 0.04338734224438667, 0.011940160766243935, 0.16557613015174866, -0.07092040777206421, -0.049818821251392365, -0.0651930645108223, 0.0495961457490921, -0.06033780425786972, 0.014845001511275768, -0.031092647463083267, -0.01169517170637846, 0.05740947276353836, 0.08864455670118332, -0.07101613283157349, 0.0015538700390607119, -0.04971983656287193, -0.031215650960803032, 0.04733480140566826, -0.11580710113048553, -0.031371187418699265, 0.006007296498864889, -0.10451006889343262, -0.02796345390379429, 0.07065588235855103, -0.017967719584703445, -0.13080227375030518, 0.04444657266139984, -0.03161657974123955, -0.021839464083313942, -0.14133940637111664, -0.09791175276041031, -0.020920371636748314, -0.07309072464704514, -0.043130550533533096, -0.06769167631864548, -0.14582239091396332, -0.11878801882266998, 0.06618519872426987, -0.06655234843492508, -0.040322571992874146, -0.1006268709897995, -0.008055980317294598, -0.00008320348570123315, -0.0393952913582325, 0.12459570914506912, -0.023398369550704956, 0.10059768706560135, 0.022424761205911636, 0.04614851251244545, 0.14895202219486237, 0.07222693413496017, -0.10077952593564987, 0.018703211098909378, -0.11026626080274582, 0.18417172133922577, -0.02005213499069214, 0.0023673374671489, -0.15574052929878235, -0.07489504665136337, -0.059335481375455856, 
0.07263919711112976, 0.09929826855659485, 0.14217741787433624, -0.1480337381362915, -0.017658542841672897, 0.2199154645204544, -0.07713833451271057, -0.03618820011615753, 0.09382561594247818, -0.041064146906137466, 0.14851240813732147, 0.08992346376180649, 0.17439503967761993, 0.04783811420202255, -0.07997120916843414, 0.009401947259902954, -0.03871043398976326, 0.006208700593560934, -0.01350785419344902, 0.09240421652793884, -0.06279996782541275, -0.06903689354658127, -0.004399169236421585, -0.11243899166584015, 0.02970149554312229, -0.058761369436979294, -0.06354653090238571, 0.027500757947564125, -0.045212794095277786, -0.0391116626560688, 0.06064596027135849, 0.058141645044088364, -0.04532323777675629, -0.11869875341653824, 0.002036937279626727, 0.08743152767419815, -0.06346629559993744, 0.023621894419193268, -0.06037365645170212, -0.03550741448998451, -0.09046956151723862, -0.013995559886097908, -0.1727164387702942, 0.04261121526360512, 0.04534653574228287, 0.0019450513646006584, 0.043464113026857376, 0.04129021614789963, 0.03290800377726555, 0.0514984056353569, 0.000011863711733894888, -0.05715365707874298, -0.04393601417541504, -0.042417243123054504, -0.12492549419403076, -0.10931708663702011, -0.03258631005883217, -0.02137538231909275, 0.096928171813488, -0.18611887097358704, 0.02993982657790184, -0.09279992431402206, 0.031831707805395126, -0.022746752947568893, -0.043755192309617996, 0.031084824353456497, 0.0498889684677124, 0.026873355731368065, -0.06912055611610413, 0.05794057622551918, 0.038371022790670395, 0.03716172277927399, 0.08856353163719177, -0.1139059066772461, -0.1552431285381317, 0.08230431377887726, 0.0514957457780838, -0.14832474291324615, -0.005498734302818775, -0.03545660898089409, -0.054491378366947174, -0.05682355910539627, 0.009870246052742004, 0.255785197019577, 0.012436136603355408, 0.15582269430160522, -0.12491974234580994, -0.04033687338232994, -0.005586853250861168, -0.025015030056238174, 0.01152922585606575, 0.152281254529953, 0.06546053290367126, -0.08893891423940659, 0.05339759588241577, 0.007603179197758436, -0.018568584695458412, 0.15599748492240906, 0.0033986896742135286, -0.12989863753318787, 0.012508541345596313, 0.07735948264598846, -0.01777476631104946, 0.09286097437143326, -0.14153023064136505, -0.0010991679737344384, 0.01677895337343216, 0.04922245815396309, 0.06340159475803375, -0.162710040807724, 0.022859957069158554, 0.06006665155291557, -0.051946964114904404, 0.011012635193765163, -0.013740255497395992, -0.050391003489494324, 0.0766410231590271, 0.021780410781502724, -0.026204295456409454, -0.02083154395222664, -0.03736307471990585, -0.14164797961711884, 0.208898663520813, -0.06412817537784576, -0.1518978774547577, -0.11569108814001083, 0.09674534201622009, 0.07581910490989685, 0.010526199825108051, 0.04615653678774834, -0.08717130869626999, -0.03448108211159706, -0.07496385276317596, 0.10754840821027756, -0.04217154532670975, -0.06246710941195488, -0.09225733578205109, -0.003169227857142687, -0.01698758453130722, -0.1286008507013321, 0.03441281616687775, -0.038497667759656906, -0.0887889638543129, -0.0020870345178991556, -0.06340477615594864, 0.08478417992591858, 0.169779434800148, 0.0032439662609249353, 0.02057296223938465, -0.01309097558259964, 0.1776927411556244, -0.1326267421245575, 0.0021939801517874002, 0.08592087030410767, 0.04530385509133339, -0.000040477745642419904, 0.095698781311512, -0.007990723475813866, -0.089620441198349, 0.03193490952253342, 0.04037164896726608, -0.029235616326332092, -0.2849186360836029, 
-0.03344837576150894, -0.024890925735235214, -0.053037378937006, 0.11625828593969345, 0.03671882301568985, 0.019017735496163368, 0.06170390546321869, -0.019069373607635498, -0.007860563695430756, 0.01873314566910267, 0.052628062665462494, -0.01626589149236679, 0.0032980157993733883, 0.06482388079166412, -0.051022324711084366, -0.021978383883833885, 0.047916218638420105, 0.025574177503585815, 0.2444523274898529, -0.05328277871012688, 0.13118626177310944, 0.08608199656009674, 0.11381158232688904, 0.0007810295792296529, 0.07642585039138794, -0.025614432990550995, 0.016931530088186264, -0.01327134482562542, -0.031337302178144455, -0.0813598483800888, 0.023211009800434113, 0.011103193275630474, 0.013539789244532585, -0.12418631464242935, -0.0361616387963295, 0.012795420363545418, 0.32841756939888, 0.05391423776745796, -0.23739415407180786, -0.06011269986629486, -0.0001690148055786267, -0.05844992399215698, -0.09437774121761322, 0.05643768608570099, 0.09134825319051743, -0.1431654989719391, -0.020416680723428726, -0.03195642679929733, 0.10375995934009552, -0.10339131206274033, -0.05288029462099075, 0.04362253099679947, 0.046119388192892075, -0.002619626931846142, 0.09770891815423965, -0.28291401267051697, 0.21183624863624573, -0.01012650690972805, 0.12637010216712952, -0.015100845135748386, 0.025305094197392464, -0.062495969235897064, -0.0030917488038539886, 0.15962348878383636, -0.0035828680265694857, 0.024945564568042755, -0.07013647258281708, -0.10043668001890182, 0.019983911886811256, 0.028250131756067276, -0.07637998461723328, 0.09512446820735931, 0.026822540909051895, 0.029139164835214615, -0.015238737687468529, -0.10650745779275894, -0.11594869941473007, -0.11752895265817642, -0.01320644374936819, -0.09337887167930603, 0.06171451136469841, -0.03980866074562073, -0.052260853350162506, -0.010933637619018555, 0.140028715133667, -0.11641204357147217, -0.09328911453485489, -0.09882058203220367, 0.01665549725294113, 0.09938205778598785, -0.0462285652756691, 0.0004044061934109777, 0.016951287165284157, 0.007533100433647633, 0.0017455997876822948, 0.020429419353604317, 0.10734874755144119, -0.07259056717157364, -0.12087036669254303, -0.04572635143995285, 0.1341407299041748, 0.12065118551254272, 0.059129733592271805, -0.020099511370062828, 0.00981023721396923, -0.0067781200632452965, -0.0768788531422615, -0.004650306422263384, -0.0013855572324246168, 0.05662870407104492, 0.020885562524199486, -0.07344450056552887, -0.017732270061969757, -0.09244058281183243, -0.055029239505529404, 0.10396074503660202, 0.13930842280387878, -0.04944793879985809, 0.07117598503828049, 0.176459401845932, -0.11914311349391937, -0.15448959171772003, 0.017688047140836716, 0.110171839594841, 0.0802491307258606, -0.07382815331220627, -0.22487971186637878, 0.0019329824717715383, 0.09525575488805771, 0.008474077098071575, -0.024514731019735336, -0.44354674220085144, -0.12552660703659058, 0.09408856928348541, 0.09045497328042984, -0.03441706299781799, -0.08543942868709564, -0.03366498649120331, 0.03607678413391113, -0.03274516388773918, 0.0836714431643486, -0.016643136739730835, 0.08957601338624954, 0.028574127703905106, -0.04792147874832153, 0.04782484844326973, -0.053067322820425034, 0.1282019168138504, 0.059318169951438904, 0.04128803685307503, -0.0380844883620739, 0.031939875334501266, 0.0025365243200212717, -0.018544500693678856, 0.14170825481414795, 0.040222201496362686, 0.031409282237291336, -0.2134103775024414, -0.06456062197685242, -0.08660973608493805, -0.00818372517824173, -0.07319983094930649, -0.04025191441178322, 
-0.03651256859302521, 0.07588537037372589, 0.05565764009952545, -0.008013276383280754, -0.01686810702085495, -0.07359042018651962, -0.009549918584525585, 0.08866091072559357, 0.10599029064178467, 0.06942335516214371, -0.09169674664735794, 0.01318881195038557, 0.048492442816495895, 0.09773880243301392, -0.1414824277162552, -0.028914831578731537, 0.1228974461555481, -0.017425663769245148, 0.12196532636880875, -0.009624437429010868, -0.14412210881710052, 0.014031684957444668, 0.046924278140068054, -0.0757782906293869, -0.11473628133535385, -0.011351315304636955, -0.051388829946517944, -0.03817392885684967, -0.05790757015347481, 0.0677967444062233, -0.09838901460170746, -0.034087423235177994, -0.018547475337982178, 0.026346834376454353, -0.06556926667690277, 0.2257838249206543, 0.045884471386671066, 0.058698032051324844, -0.06660343706607819, 0.12439217418432236, 0.11072838306427002, -0.12850120663642883, 0.017887484282255173, 0.17093978822231293, -0.09455877542495728, -0.049591053277254105, 0.027384353801608086, 0.13642369210720062, -0.04957086592912674, -0.07977358251810074, -0.06815116107463837, -0.04745730012655258, 0.0588277131319046, 0.011791356839239597, 0.0403866283595562, 0.018904289230704308, -0.028790125623345375, -0.0019154142355546355, -0.11518506705760956, 0.050644632428884506, 0.09869812428951263, -0.002872796729207039, -0.020558498799800873, 0.19126258790493011, 0.05520174279808998, 0.04573556408286095, -0.008530919440090656, -0.03817809000611305, -0.06042814254760742, 0.07751138508319855, -0.006164299789816141, -0.0127701535820961, -0.05338088795542717, -0.01636652648448944, -0.031164774671196938, -0.003231323091313243, 0.0043357061222195625, 0.011057540774345398, -0.06901243329048157, -0.023277519270777702, -0.04260655865073204, 0.048044148832559586, -0.06077387183904648, 0.001209930400364101, -0.008012322708964348, -0.062264494597911835, 0.061871662735939026, 0.014557099901139736, -0.004657382145524025, 0.02081887796521187, -0.03161252662539482, 0.079689621925354, -0.02149476855993271, 0.003964532166719437, -0.005808480549603701, -0.08224118500947952, 0.04838395491242409, 0.0005676785949617624, -0.008564839139580727, -0.011111689731478691, 0.04932079091668129, -0.13754434883594513, 0.06717782467603683, -0.014122643508017063, -0.01910017989575863, -0.07022123038768768, 0.1283588856458664, 0.029048189520835876, 0.07315582036972046, 0.10178843885660172, -0.07883428782224655, 0.0657506212592125, -0.1419007033109665, -0.045936986804008484, 0.020158784464001656, 0.027025997638702393, -0.04012090712785721, -0.05867762863636017, 0.062030043452978134, -0.035131119191646576, 0.06530096381902695, 0.07798349857330322, 0.06550288200378418, 0.025124244391918182, -0.11217591911554337, 0.009186704643070698, 0.043445903807878494, 0.06736718118190765, 0.005311045795679092, 0.022306928411126137, -0.0025984058156609535, 0.06942355632781982, -0.01861032098531723, 0.07172825187444687, 0.12261759489774704, 0.217228963971138, 0.09632118791341782, 0.10673961788415909, -0.04349655658006668, -0.11173825711011887, -0.08716622740030289, 0.10376301407814026, -0.003827039385214448, 0.018688790500164032, -0.02375459484755993, 0.1298147439956665, 0.09061340987682343, -0.16135013103485107, 0.08064025640487671, 0.013098721392452717, -0.1002359390258789, -0.08601036667823792, -0.06737247854471207, -0.024211203679442406, -0.07450369745492935, -0.013198299333453178, -0.08740311115980148, 0.030001388862729073, 0.06502776592969894, 0.06507179886102676, -0.039949625730514526, 0.14041659235954285, 
0.010443223640322685, -0.08272106200456619, 0.09349748492240906, -0.004017708823084831, 0.12679582834243774, -0.10125785320997238, 0.01753024011850357, 0.01796671748161316, 0.004127859137952328, 0.06180158630013466, 0.01904045045375824, -0.03565645217895508, 0.03108648583292961, 0.046251069754362106, -0.043170265853405, 0.0012886860640719533, 0.043807338923215866, 0.12943731248378754, 0.1152326911687851, 0.0674915686249733, -0.04302602261304855, -0.027277931571006775, 0.22299228608608246, -0.035563189536333084, -0.08989175409078598, -0.168219655752182, 0.164940744638443, 0.07247763872146606, 0.025189079344272614, 0.033738382160663605, -0.10162407904863358, -0.0038710793014615774, 0.23923684656620026, 0.1253417432308197, -0.04805571585893631, -0.04803844913840294, 0.026977458968758583, -0.002466992475092411, 0.023945868015289307, 0.1194368451833725, 0.037260398268699646, 0.22443176805973053, -0.11202595382928848, 0.029743030667304993, -0.07577851414680481, -0.04839882254600525, -0.01885017566382885, 0.16596899926662445, -0.0024955621920526028, -0.016685567796230316, -0.047708671540021896, 0.11673469096422195, -0.0018604367505759, -0.19558580219745636, 0.06867211312055588, -0.06641776859760284, -0.13713125884532928, -0.03219126537442207, 0.007655549794435501, 0.008851607330143452, 0.057344160974025726, 0.0004265715251676738, -0.00885864719748497, 0.129157155752182, 0.039312686771154404, -0.05683441460132599, -0.16008445620536804, 0.07399354130029678, -0.015774665400385857, 0.17731235921382904, -0.007996231317520142, 0.08041437715291977, 0.07309926301240921, 0.022978944703936577, -0.10809343308210373, 0.07858569175004959, 0.04941679909825325, 0.03265458345413208, 0.07294274866580963, 0.08702616393566132, -0.023693345487117767, 0.07509224116802216, 0.023771105334162712, -0.11813855916261673, 0.0666639432311058, -0.12045131623744965, -0.03473742678761482, -0.1657058149576187, 0.036936014890670776, -0.04481545835733414, 0.11869286000728607, 0.21141690015792847, -0.0281227994710207, 0.0147866765037179, -0.06559091061353683, 0.05058080330491066, -0.02077373117208481, 0.1661090850830078, 0.010537225753068924, -0.1937684714794159, 0.0035826354287564754, -0.06757920980453491, 0.02412666194140911, -0.21491380035877228, -0.012678829953074455, 0.016972333192825317, -0.08789094537496567, -0.026926159858703613, 0.11585050076246262, 0.031079312786459923, 0.07223621755838394, -0.02772415801882744, -0.019830770790576935, -0.019733386114239693, 0.14534743130207062, -0.12228694558143616, -0.08889401704072952 ]
null
null
transformers
# legal_t5_small_trans_cs_es_small_finetuned model

Model for translating legal text from Czech to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from JRC-Acquis, Europarl and DCEP.

## Model description

legal_t5_small_trans_cs_es_small_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_cs_es_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to Spanish.

### How to use

Here is how to use this model to translate legal text from Czech to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_es_small_finetuned"),
tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_es", do_lower_case=False, skip_special_tokens=True),
device=0
)

cs_text = "vzhledem k tomu, že parlamentní volby v listopadu a v prosinci 2006, volby do Senátu v lednu 2007 a volbu prezidenta Sídí Muhammada Ulda Šajcha Abdalláhiho v březnu 2007, uznali jako spravedlivé a transparentní zahraniční pozorovatelé, včetně pozorovatelů z Evropské unie, a zejména z mise ke sledování průběhu voleb vyslané Evropským parlamentem, jenž se tím stal garantem legality těchto voleb,"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_es_small_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. 
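The card states the objective but does not show what such a training pair looks like. Below is a minimal sketch of T5-style span masking, assuming the `<extra_id_N>` sentinel convention of `t5-small`; the `mask_spans` helper and the hard-coded spans are illustrative inventions, not LegalTrans code, and real span corruption samples spans randomly over subword ids and appends a closing sentinel to the target.

```python
# Minimal sketch of a "masked language modelling" (span corruption) pair.
# Assumes the <extra_id_N> sentinel convention of t5-small; mask_spans and
# the fixed spans below are illustrative only, not the LegalTrans code.

def mask_spans(tokens, spans):
    """Replace each (start, end) token span with a sentinel; the target holds the masked text."""
    input_tokens, target_tokens = [], []
    cursor = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        input_tokens += tokens[cursor:start] + [sentinel]
        target_tokens += [sentinel] + tokens[start:end]
        cursor = end
    input_tokens += tokens[cursor:]
    return " ".join(input_tokens), " ".join(target_tokens)

tokens = "Parlament hlasoval o návrhu nařízení o ochraně osobních údajů".split()
source, target = mask_spans(tokens, [(1, 3), (6, 8)])
print(source)  # Parlament <extra_id_0> návrhu nařízení o <extra_id_1> údajů
print(target)  # <extra_id_0> hlasoval o <extra_id_1> ochraně osobních
```

The model sees `source` as input and learns to emit `target`, which is what "predict the portions of a sentence which were masked randomly" refers to.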
## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_es_small_finetuned | 50.862 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
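The test split behind the 50.862 BLEU figure is not distributed with the card, so the score cannot be reproduced exactly from it. Still, a minimal sacrebleu sketch of the evaluation loop might look as follows, where `test.cs` and `test.es` are hypothetical parallel files standing in for that split:

```python
# Hedged sketch of BLEU evaluation with sacrebleu; test.cs / test.es are
# hypothetical parallel files standing in for the (undistributed) test split.
import sacrebleu
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_es_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_es", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

with open("test.cs") as f:  # hypothetical Czech source sentences, one per line
    sources = [line.strip() for line in f]
with open("test.es") as f:  # hypothetical Spanish references, aligned line by line
    references = [line.strip() for line in f]

hypotheses = [out["translation_text"] for out in pipeline(sources, max_length=512)]
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
```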
{"language": "Cszech Spanish", "tags": ["translation Cszech Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "vzhledem k tomu, \u017ee parlamentn\u00ed volby v listopadu a v prosinci 2006, volby do Sen\u00e1tu v lednu 2007 a volbu prezidenta S\u00edd\u00ed Muhammada Ulda \u0160ajcha Abdall\u00e1hiho v b\u0159eznu 2007, uznali jako spravedliv\u00e9 a transparentn\u00ed zahrani\u010dn\u00ed pozorovatel\u00e9, v\u010detn\u011b pozorovatel\u016f z Evropsk\u00e9 unie, a zejm\u00e9na z mise ke sledov\u00e1n\u00ed pr\u016fb\u011bhu voleb vyslan\u00e9 Evropsk\u00fdm parlamentem, jen\u017e se t\u00edm stal garantem legality t\u011bchto voleb,"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_es_small_finetuned
[ "transformers", "pytorch", "t5", "text2text-generation", "translation Cszech Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Spanish" ]
TAGS #transformers #pytorch #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_es\_small\_finetuned model ======================================================= Model on translating legal text from Cszech to Spanish. It was first released in this repository. This model is first pretrained all the translation data over some unsupervised task. Then the model is trained on three parallel corpus from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_es\_small\_finetuned is initially pretrained on unsupervised task with the all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_cs\_es\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Cszech to Spanish. ### How to use Here is how to use this model to translate legal text from Cszech to Spanish in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_es\_small\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used for translation test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 56, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #t5 #text2text-generation #translation Cszech Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.10946214944124222, 0.1017221063375473, -0.002243894152343273, 0.07794655859470367, 0.07287046313285828, 0.0018998864106833935, 0.019517648965120316, 0.08425489813089371, -0.020391855388879776, 0.0778701975941658, 0.034648675471544266, -0.022781208157539368, 0.03695892170071602, 0.04352359101176262, 0.04506031796336174, -0.20638054609298706, -0.0014154001837596297, -0.049828868359327316, -0.03677849471569061, 0.10470738261938095, 0.07303321361541748, -0.055479153990745544, 0.0359346829354763, -0.01255373191088438, -0.03844846040010452, 0.019569888710975647, -0.09441579878330231, -0.052251964807510376, 0.09572895616292953, 0.10748241096735, 0.07747393101453781, 0.003582277800887823, 0.059825554490089417, -0.17326520383358002, -0.00651991181075573, 0.06812214106321335, -0.012944302521646023, 0.03282386437058449, 0.1308136284351349, -0.010256756097078323, 0.15433533489704132, -0.017597969621419907, 0.024792887270450592, 0.05717581510543823, -0.11462563276290894, -0.08258599042892456, -0.04333086311817169, -0.013315960764884949, 0.0748881921172142, 0.12739954888820648, -0.05484829097986221, 0.0785997062921524, -0.056263238191604614, 0.0735771581530571, 0.08554699271917343, -0.22166836261749268, -0.01639249548316002, 0.025993457064032555, 0.019168604165315628, 0.08326071500778198, -0.011127370409667492, 0.002927354769781232, 0.0806126818060875, 0.05619155988097191, 0.032526787370443344, -0.0438779778778553, -0.041421037167310715, -0.04801147058606148, -0.1338493973016739, -0.05735861510038376, 0.14270427823066711, 0.03659253939986229, -0.04423990845680237, -0.08470034599304199, -0.06880287826061249, -0.07834417372941971, 0.007872465066611767, -0.033588651567697525, 0.007920024916529655, -0.012178950011730194, 0.06737533211708069, -0.013997402973473072, -0.11862841993570328, -0.05995206907391548, -0.07184285670518875, 0.16701294481754303, 0.037125494331121445, 0.022485388442873955, 0.05618303269147873, 0.08181066811084747, -0.11387992650270462, -0.06393948197364807, 0.017978617921471596, 0.01891534961760044, -0.127557173371315, 0.011531065218150616, -0.04093800485134125, -0.17273414134979248, -0.026890229433774948, -0.017827128991484642, -0.0938711017370224, 0.03396342322230339, 0.1041576936841011, 0.04044690728187561, 0.0630159005522728, 0.07938215881586075, -0.14708945155143738, -0.11733828485012054, -0.03332657366991043, 0.006870251148939133, -0.005105075892060995, 0.024612024426460266, -0.08700148016214371, -0.04759716987609863, 0.018001431599259377, 0.031870804727077484, -0.0023138769902288914, 0.026237351819872856, -0.027177659794688225, -0.052524417638778687, 0.14320498704910278, -0.08823699504137039, -0.01820344850420952, -0.004653667099773884, -0.11084390431642532, -0.026555338874459267, 0.09436359256505966, -0.015165545046329498, -0.0845712348818779, 0.045828916132450104, -0.022500887513160706, -0.037773579359054565, -0.10828932374715805, -0.18536444008350372, -0.011885816231369972, -0.001003683777526021, -0.07292939722537994, -0.11429254710674286, -0.1158466637134552, -0.0975402221083641, 0.032478366047143936, -0.05596640706062317, -0.004401512444019318, -0.08902820944786072, -0.0013988353312015533, -0.007214955985546112, -0.01563684269785881, 0.07251469045877457, -0.03530854731798172, 0.02679077349603176, 0.05457519739866257, 0.08443783968687057, 0.004477757029235363, 0.021068306639790535, -0.1083146259188652, 0.046988021582365036, -0.15472844243049622, 0.17035825550556183, -0.023414254188537598, 0.030923811718821526, -0.12436839938163757, -0.06136961281299591, 
-0.07743584364652634, 0.08134526014328003, 0.059043269604444504, 0.13048987090587616, -0.1957702934741974, -0.017525222152471542, 0.21273112297058105, -0.07367827743291855, -0.09640900045633316, 0.13176168501377106, -0.02805129438638687, 0.03647425398230553, 0.0892128124833107, 0.0801812931895256, 0.07389704138040543, -0.02773350477218628, -0.03486288711428642, -0.0035014357417821884, 0.03702067956328392, 0.027004828676581383, 0.08892740309238434, -0.06231958046555519, 0.021517043933272362, 0.034966908395290375, -0.0036742922384291887, 0.029229188337922096, -0.024437665939331055, -0.034408971667289734, 0.0035610198974609375, -0.07600083202123642, -0.00026520283427089453, 0.032096926122903824, 0.027453983202576637, -0.06516412645578384, -0.08784236013889313, 0.04826810583472252, 0.09977706521749496, -0.04607748985290527, -0.0029885885305702686, -0.013428651727735996, -0.049542881548404694, -0.13048814237117767, -0.001912198611535132, -0.19214898347854614, -0.03586754947900772, 0.010446702130138874, -0.04893038421869278, 0.11176161468029022, 0.08429575711488724, 0.061898741871118546, 0.11587581783533096, -0.05775945633649826, -0.020976940169930458, 0.021175675094127655, -0.013511461205780506, -0.09283143281936646, -0.11199425160884857, -0.040815699845552444, -0.03024863824248314, 0.010364260524511337, -0.1365717500448227, -0.0011503545101732016, -0.05486061051487923, 0.09517107903957367, 0.005192385520786047, -0.01178246084600687, 0.04647243395447731, 0.07838927954435349, -0.03594416379928589, -0.028647534549236298, 0.030563760548830032, -0.004088371992111206, -0.06516487896442413, 0.07976464927196503, -0.15317830443382263, -0.09192082285881042, 0.0854433923959732, 0.004371932242065668, -0.13631732761859894, -0.052956946194171906, -0.013662495650351048, -0.05242978036403656, -0.0629279762506485, -0.07120856642723083, 0.23363544046878815, 0.05224384367465973, 0.16445496678352356, -0.12050076574087143, -0.008930309675633907, 0.023253371939063072, -0.01842997968196869, 0.0006192595465108752, 0.17507538199424744, 0.09227454662322998, -0.1683957725763321, 0.08929751068353653, -0.02992412820458412, -0.04325881227850914, 0.10697578638792038, 0.07718239724636078, -0.08741821348667145, -0.0026687716599553823, 0.06222984567284584, 0.00629772711545229, 0.039338450878858566, -0.09734837710857391, -0.01924164965748787, 0.028481291607022285, 0.03974292799830437, 0.058275364339351654, -0.0987701565027237, 0.05462236702442169, 0.07021459192037582, -0.01396145299077034, 0.07560737431049347, -0.036814019083976746, -0.029832134023308754, 0.08751042932271957, 0.019254524260759354, -0.06637851893901825, -0.04078221693634987, -0.03088292106986046, -0.10967904329299927, 0.2034454345703125, -0.05794582515954971, -0.2471541315317154, -0.10051225870847702, 0.10205214470624924, -0.05240709334611893, 0.03647707402706146, 0.029003525152802467, -0.057307127863168716, -0.05256490409374237, -0.11212464421987534, 0.07089640945196152, -0.09005801379680634, -0.04622231423854828, -0.13116100430488586, 0.05670538917183876, 0.007135668769478798, -0.11415719985961914, 0.026719870045781136, 0.01078261248767376, -0.0164381992071867, 0.0036268308758735657, -0.06723649054765701, 0.11666975915431976, 0.15765677392482758, -0.01096628699451685, -0.01612178422510624, -0.008056819438934326, 0.12681183218955994, -0.07317817211151123, 0.05064164474606514, 0.06586474180221558, 0.07242555171251297, 0.029304523020982742, 0.15878361463546753, 0.035942718386650085, -0.07632353156805038, 0.06585737317800522, 0.05724186450242996, 
-0.026885157451033592, -0.19507236778736115, -0.13968567550182343, -0.07195699959993362, -0.022124093025922775, 0.11830803751945496, 0.023397007957100868, -0.02728240005671978, 0.04606452211737633, -0.06277085840702057, 0.06488586217164993, -0.04718944430351257, 0.044491298496723175, 0.05560297146439552, -0.010069006122648716, 0.09076055139303207, -0.05402911454439163, -0.06464368104934692, 0.10294591635465622, 0.02010010927915573, 0.1998080164194107, -0.06477679312229156, 0.22388707101345062, 0.07230322808027267, 0.05055871233344078, 0.01751650683581829, 0.04100366309285164, -0.037055183202028275, 0.02365361712872982, -0.02673640288412571, -0.06932587921619415, -0.024052752181887627, 0.07972127199172974, 0.005261992570012808, 0.0025389939546585083, -0.054848562926054, -0.05394281446933746, 0.033585451543331146, 0.20939478278160095, 0.07279058545827866, -0.23160649836063385, -0.06598729640245438, -0.011914284899830818, -0.04935360699892044, -0.050552938133478165, 0.005602660588920116, 0.15101420879364014, -0.08139455318450928, 0.021224215626716614, 0.006432525813579559, 0.11845144629478455, -0.12385750561952591, -0.022113963961601257, 0.047016192227602005, 0.045323748141527176, -0.008228559978306293, 0.13297715783119202, -0.20196498930454254, 0.18568207323551178, 0.013934730552136898, 0.028581878170371056, -0.051336634904146194, -0.0057663884945213795, -0.038052309304475784, 0.005685040727257729, 0.09949350357055664, 0.014553140848875046, 0.01668477989733219, -0.1372259557247162, -0.09334786236286163, -0.024835094809532166, 0.06062828376889229, -0.056596506386995316, 0.10892353951931, 0.07231806963682175, 0.007184539455920458, -0.013316516764461994, 0.03696490824222565, -0.07264359295368195, -0.16125790774822235, 0.0011342174839228392, -0.000049739152018446475, -0.043186090886592865, -0.008140215650200844, -0.05710919201374054, -0.08176708221435547, 0.21022596955299377, -0.0530952513217926, -0.08426931500434875, -0.07513512670993805, 0.02536914125084877, 0.1592673808336258, -0.07418212294578552, 0.018807722255587578, 0.00027922168374061584, 0.06232154741883278, -0.0334722138941288, -0.03246607258915901, 0.10989362001419067, -0.09782078117132187, -0.09068207442760468, -0.07117769867181778, 0.10336361080408096, 0.0722278356552124, 0.04461496323347092, -0.013252376578748226, 0.02113061025738716, 0.01185979600995779, -0.11684632301330566, -0.040789391845464706, 0.05004353076219559, 0.10507125407457352, 0.05694695562124252, -0.03576791658997536, -0.034243252128362656, -0.05573004111647606, -0.06579490751028061, 0.08915797621011734, 0.17030422389507294, -0.06572695076465607, 0.042663756757974625, 0.192900151014328, -0.07611322402954102, -0.15230774879455566, -0.04779275879263878, 0.11787751317024231, 0.10053877532482147, 0.000852491648402065, -0.1634109914302826, 0.05054955184459686, 0.08552256226539612, -0.008374568074941635, 0.052675969898700714, -0.40147608518600464, -0.14463084936141968, 0.051178328692913055, 0.041765425354242325, -0.0320524200797081, -0.1239282637834549, -0.0641217827796936, -0.09631984680891037, -0.017469532787799835, 0.09778369963169098, -0.02106379345059395, 0.09832730889320374, -0.006996302865445614, 0.055092860013246536, 0.05493933707475662, -0.03819980472326279, 0.12013865262269974, -0.001006768667139113, 0.022018535062670708, -0.06118399277329445, 0.06849636137485504, 0.025191061198711395, -0.014456518925726414, 0.1513088047504425, -0.0695667564868927, 0.06646562367677689, -0.1344255656003952, -0.08761727809906006, -0.06530506163835526, 0.013937292620539665, 
-0.030157271772623062, -0.06985855102539062, -0.0515272282063961, 0.03934824839234352, 0.0412750318646431, -0.005530521273612976, -0.05623123422265053, -0.04970899224281311, 0.005777666810899973, 0.20144598186016083, 0.11103632301092148, 0.023810183629393578, -0.13103675842285156, 0.03329700231552124, 0.003306980011984706, 0.06720509380102158, -0.09200824052095413, -0.009725277312099934, 0.14241738617420197, 0.04278704524040222, 0.09967862069606781, -0.021619342267513275, -0.1537228375673294, 0.014181362465023994, 0.06446864455938339, -0.06603935360908508, -0.13433340191841125, -0.023145949468016624, 0.09214062243700027, -0.07706607133150101, -0.008414418436586857, 0.10249048471450806, -0.10118066519498825, -0.01934283971786499, -0.01565106026828289, -0.002257741056382656, -0.020674528554081917, 0.20980335772037506, 0.05026749148964882, 0.04466180130839348, -0.060598596930503845, 0.09186279773712158, 0.12132613360881805, -0.12841668725013733, 0.020902983844280243, 0.15129250288009644, -0.11099390685558319, -0.06367311626672745, 0.029547730460762978, 0.08023694902658463, -0.07720890641212463, -0.06514354050159454, -0.045274145901203156, -0.03664412721991539, 0.010152525268495083, 0.027087420225143433, 0.06337524205446243, 0.013638639822602272, -0.015905417501926422, -0.04538382217288017, -0.06724676489830017, 0.07637675851583481, 0.09518241137266159, 0.02911304123699665, -0.046213146299123764, 0.14173512160778046, 0.031787436455488205, -0.04515836760401726, -0.014937091618776321, 0.0066375150345265865, -0.05427095293998718, 0.02498451992869377, -0.11107679456472397, 0.01767903007566929, -0.059233374893665314, -0.01317656971514225, -0.017234697937965393, -0.0020364776719361544, -0.009927651844918728, -0.004498082213103771, -0.046182259917259216, -0.03723478317260742, -0.031031547114253044, 0.009000870399177074, -0.07360795885324478, -0.04134172573685646, 0.0017513136845082045, -0.016510173678398132, 0.03197478875517845, -0.009849414229393005, 0.0038452933076769114, 0.007256304379552603, -0.025428542867302895, 0.06905508786439896, 0.03730092942714691, 0.05786725506186485, 0.006098019424825907, -0.05758456513285637, 0.02482386864721775, 0.04254067689180374, 0.025560716167092323, -0.012569255195558071, 0.01691051386296749, -0.14194491505622864, -0.039382606744766235, 0.0029412442818284035, -0.02813098207116127, -0.09206213802099228, 0.08958079665899277, 0.07157695293426514, 0.05297110602259636, 0.0924125462770462, -0.07397505640983582, 0.0659821629524231, -0.1667102426290512, -0.02307352051138878, 0.0028174794279038906, 0.021436702460050583, -0.0432785339653492, 0.011881930753588676, 0.05103835463523865, -0.04613269120454788, 0.1509758085012436, 0.007617429830133915, 0.10306470841169357, 0.004539683926850557, -0.08645981550216675, 0.0338648185133934, 0.011591710150241852, 0.09468644857406616, -0.0030265357345342636, -0.006775849033147097, -0.08040761202573776, 0.11949136853218079, 0.0069070300087332726, 0.08102142065763474, 0.019462380558252335, 0.10984634608030319, 0.060426563024520874, 0.03939323127269745, 0.007113256026059389, -0.09445659071207047, -0.03583017736673355, 0.03415454179048538, -0.024736227467656136, 0.06625857204198837, -0.02315216325223446, 0.12353655695915222, 0.12743936479091644, -0.11782728880643845, 0.12355773150920868, 0.003967749420553446, -0.07796064764261246, -0.04186136648058891, -0.13647852838039398, -0.0529642216861248, -0.045321810990571976, -0.03914102539420128, -0.09619153290987015, -0.009502225555479527, 0.07104680687189102, 0.05470128357410431, 
-0.018154935911297798, 0.12915216386318207, -0.012920708395540714, -0.11287044733762741, 0.045989397913217545, 0.00003997352177975699, 0.10185979306697845, 0.029566621407866478, 0.033388618379831314, 0.08515090495347977, 0.0021561048924922943, 0.025293689221143723, 0.05680335685610771, -0.04485467076301575, -0.011150865815579891, 0.03683865815401077, -0.05571705475449562, -0.05284031108021736, 0.015097301453351974, 0.07396961748600006, 0.21059948205947876, 0.05717519298195839, -0.07338771224021912, -0.045007627457380295, 0.16858835518360138, -0.06541113555431366, -0.06916889548301697, -0.10169269889593124, 0.22308866679668427, 0.023236200213432312, 0.04531446844339371, -0.009432398714125156, -0.10520438104867935, -0.0014706255169585347, 0.13715007901191711, 0.17737708985805511, -0.05391827970743179, -0.03191501274704933, 0.020224368199706078, -0.015729995444417, 0.02274332009255886, 0.03749018907546997, -0.0007306456682272255, 0.3078594207763672, -0.07762178778648376, 0.07478704303503036, -0.047810688614845276, 0.005407197866588831, 0.013931067660450935, 0.12543897330760956, -0.0018202080391347408, 0.032131537795066833, -0.06443776190280914, 0.09654470533132553, -0.008991437032818794, -0.1666661649942398, 0.03530449792742729, -0.08803760260343552, -0.11813313513994217, -0.013141038827598095, -0.03452783077955246, 0.07821096479892731, 0.07926519960165024, -0.010137513279914856, 0.03677714243531227, 0.0443987138569355, 0.0005909292958676815, -0.14655089378356934, -0.13064901530742645, -0.0021879805717617273, -0.0037209601141512394, 0.10003042966127396, -0.008078094571828842, 0.1567474901676178, 0.09168961644172668, 0.008431497029960155, -0.10649105161428452, 0.09170298278331757, 0.02479627914726734, 0.013441815041005611, 0.11547494679689407, 0.07602358609437943, -0.006286558695137501, 0.059788841754198074, 0.030778875574469566, -0.09435297548770905, 0.04533997178077698, -0.015488198027014732, -0.0011725779622793198, -0.15765312314033508, 0.09594451636075974, -0.03384607285261154, 0.1410897970199585, 0.17061743140220642, -0.027212627232074738, -0.019142651930451393, -0.04698585346341133, 0.03225403651595116, -0.007428795099258423, 0.0679541826248169, -0.020334316417574883, -0.19252225756645203, 0.01458018273115158, -0.033700332045555115, 0.029697326943278313, -0.2874174118041992, -0.03121589496731758, 0.028442218899726868, -0.05576042830944061, 0.02104450762271881, 0.06317760795354843, 0.05365269258618355, 0.06610466539859772, -0.041812554001808167, -0.06272628903388977, -0.013894577510654926, 0.1038358062505722, -0.11922512948513031, -0.12126052379608154 ]
null
null
transformers
# legal_t5_small_trans_cs_fr model

Model for translating legal text from Czech to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from JRC-Acquis, Europarl and DCEP.

## Model description

legal_t5_small_trans_cs_fr is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Czech to French.

### How to use

Here is how to use this model to translate legal text from Czech to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr"),
tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_fr", do_lower_case=False, skip_special_tokens=True),
device=0
)

cs_text = "Prevencí proti nemoci Usnesení, o kterém bude Parlament hlasovat 24. října je založeno zejména na interpelacích, které poslancům předložily parlamentní kluby pro životní prostředí, zaměstnanost a práva žen."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_fr model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_fr | 50.75 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
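As a usage note, the pipeline above wraps a single `generate()` call; the equivalent explicit form is sketched below. The beam size is an illustrative choice, not a setting documented by the card.

```python
# Same translation without the pipeline wrapper, calling generate() directly.
# num_beams=4 is an illustrative choice, not a setting documented by the card.
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr", do_lower_case=False)
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr")

cs_text = ("Prevencí proti nemoci Usnesení, o kterém bude Parlament hlasovat 24. října "
           "je založeno zejména na interpelacích, které poslancům předložily parlamentní "
           "kluby pro životní prostředí, zaměstnanost a práva žen.")

inputs = tokenizer(cs_text, return_tensors="pt")
outputs = model.generate(inputs["input_ids"], max_length=512, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```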
{"language": "Cszech French", "tags": ["translation Cszech French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Prevenc\u00ed proti nemoci Usnesen\u00ed, o kter\u00e9m bude Parlament hlasovat 24. \u0159\u00edjna je zalo\u017eeno zejm\u00e9na na interpelac\u00edch, kter\u00e9 poslanc\u016fm p\u0159edlo\u017eily parlamentn\u00ed kluby pro \u017eivotn\u00ed prost\u0159ed\u00ed, zam\u011bstnanost a pr\u00e1va \u017een."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_fr model ===================================== Model on translating legal text from Cszech to French. It was first released in this repository. This model is trained on three parallel corpus from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_fr is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Cszech to French. ### How to use Here is how to use this model to translate legal text from Cszech to French in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. ### Preprocessing An unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used for translation test dataset, achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.1256662756204605, 0.08497363328933716, -0.002382842358201742, 0.07457473129034042, 0.08217483013868332, 0.003152657998725772, 0.07817195355892181, 0.08450395613908768, -0.06944704055786133, 0.07146179676055908, 0.07096897810697556, 0.017254844307899475, 0.07962442934513092, 0.11533401161432266, 0.05544760823249817, -0.20990557968616486, 0.02678346447646618, -0.018488794565200806, -0.014546889811754227, 0.14266829192638397, 0.12060549110174179, -0.07975207269191742, 0.027331972494721413, -0.01085122674703598, -0.11469916254281998, 0.0006073112017475069, -0.058159515261650085, -0.06899502128362656, 0.08170681446790695, 0.04603838548064232, 0.11325324326753616, 0.022275326773524284, 0.08274536579847336, -0.17017978429794312, -0.0023338268510997295, 0.08161605149507523, 0.03918900713324547, 0.045344654470682144, 0.0783904641866684, -0.020695118233561516, 0.14982247352600098, -0.025249764323234558, 0.055806733667850494, 0.02458982914686203, -0.11026784032583237, -0.12017552554607391, -0.06934626400470734, 0.04004252329468727, 0.1319338083267212, 0.14940565824508667, -0.05165340006351471, 0.06399593502283096, -0.13469453155994415, 0.06015856936573982, 0.07604769617319107, -0.27633902430534363, -0.06671293079853058, 0.01853717304766178, 0.04495668411254883, 0.07441499084234238, -0.05790127441287041, -0.03486233949661255, 0.030220752581954002, 0.017369989305734634, 0.01386191975325346, -0.03032620996236801, -0.002416046569123864, -0.011930243112146854, -0.16689807176589966, -0.10187417268753052, 0.166381374001503, 0.00560944015160203, -0.07940096408128738, -0.09259933978319168, -0.03457484394311905, -0.1556302309036255, -0.00931971613317728, -0.05475949868559837, 0.05347001552581787, -0.006697093136608601, 0.030544178560376167, -0.01525708008557558, -0.11135005950927734, -0.11335839331150055, 0.032975934445858, 0.07953643053770065, 0.08816881477832794, -0.011294495314359665, 0.019061613827943802, 0.16494882106781006, -0.0010765038896352053, -0.07666972279548645, -0.018598873168230057, 0.0045166658237576485, -0.12460789084434509, -0.021540656685829163, -0.02538667805492878, -0.1161259263753891, -0.05896861478686333, 0.10673295706510544, -0.03823332488536835, 0.05894149839878082, 0.03358488529920578, 0.04351026564836502, 0.0053255269303917885, 0.16974063217639923, -0.08135508000850677, -0.0652511790394783, -0.06808938086032867, 0.06041371077299118, -0.0764446035027504, 0.028111407533288002, -0.015332335606217384, -0.0137922503054142, 0.05691201239824295, 0.07403101772069931, -0.0804692879319191, -0.005163853522390127, -0.03825492039322853, -0.01937587559223175, 0.05973535776138306, -0.1179129034280777, -0.02823430672287941, 0.004270481411367655, -0.11057589203119278, -0.024673499166965485, 0.0782785713672638, -0.00869283452630043, -0.13598737120628357, 0.05259161815047264, -0.03244179114699364, -0.01468151155859232, -0.13443110883235931, -0.09779449552297592, -0.03605097532272339, -0.04705848544836044, -0.05655912309885025, -0.05822987109422684, -0.14131806790828705, -0.10623737424612045, 0.08044615387916565, -0.0571061372756958, -0.042253345251083374, -0.10095533728599548, -0.02454248256981373, 0.007219206541776657, -0.01986326277256012, 0.11755157262086868, -0.029608573764562607, 0.07981551438570023, -0.006046622525900602, 0.0403764471411705, 0.1431405395269394, 0.07226943969726562, -0.09341078251600266, 0.019505374133586884, -0.1082245334982872, 0.178753063082695, -0.025472179055213928, -0.02352973259985447, -0.15791991353034973, -0.08967836946249008, -0.06678730994462967, 
0.05882156267762184, 0.10764288157224655, 0.14175520837306976, -0.1507938653230667, -0.020486177876591682, 0.22654226422309875, -0.08550604432821274, -0.03692108020186424, 0.128836989402771, -0.04652619734406471, 0.12066463381052017, 0.07913890480995178, 0.16777683794498444, 0.05853411927819252, -0.08355410397052765, -0.00205199234187603, -0.019285989925265312, 0.004572726786136627, -0.002299015875905752, 0.09886369854211807, -0.05359403043985367, -0.08019808679819107, -0.01789895072579384, -0.09836909919977188, 0.02948821522295475, -0.06691168248653412, -0.06601043045520782, 0.019199993461370468, -0.047848500311374664, -0.01729755848646164, 0.059949327260255814, 0.050152312964200974, -0.050466831773519516, -0.1304703652858734, -0.0187885332852602, 0.09775017946958542, -0.07036532461643219, 0.02300933748483658, -0.049653198570013046, -0.043050698935985565, -0.05985502153635025, -0.014670530334115028, -0.16316953301429749, 0.03407170623540878, 0.04781490936875343, -0.02051258459687233, 0.04594159498810768, 0.043928153812885284, 0.04010222479701042, 0.06700132042169571, -0.006205188110470772, -0.057435136288404465, -0.0550118051469326, -0.04471394419670105, -0.11844004690647125, -0.11318113654851913, -0.02575533092021942, -0.021534539759159088, 0.11431937664747238, -0.18699786067008972, 0.031507860869169235, -0.09212400764226913, 0.04246566817164421, -0.018715854734182358, -0.043436042964458466, 0.015209976583719254, 0.0347738079726696, 0.02798844873905182, -0.05638331547379494, 0.048989713191986084, 0.025509273633360863, 0.05014459788799286, 0.09463071078062057, -0.08734584599733353, -0.14442078769207, 0.08768893778324127, 0.05365081503987312, -0.15666601061820984, 0.000020339157345006242, -0.04779743403196335, -0.055634140968322754, -0.04713910073041916, 0.01450697984546423, 0.24669800698757172, 0.01875111274421215, 0.15998658537864685, -0.11088007688522339, -0.04968179762363434, -0.002685418352484703, -0.016135023906826973, 0.014379662461578846, 0.15191808342933655, 0.06859886646270752, -0.07416793704032898, 0.04943910986185074, 0.026878029108047485, -0.021753322333097458, 0.14874951541423798, 0.000011806495422206353, -0.1275263875722885, 0.010572346858680248, 0.08243110030889511, -0.027946259826421738, 0.09385884553194046, -0.14001551270484924, -0.007001564837992191, 0.015044201165437698, 0.0432363785803318, 0.05424230173230171, -0.16211245954036713, 0.032784342765808105, 0.05215182527899742, -0.05976177752017975, 0.01889677345752716, -0.009217753075063229, -0.052527327090501785, 0.08436983823776245, 0.02029811590909958, -0.048453960567712784, -0.02525770477950573, -0.04122341424226761, -0.13702717423439026, 0.21785634756088257, -0.06560666859149933, -0.13840025663375854, -0.09721596539020538, 0.09821600466966629, 0.04806406423449516, -0.005598119925707579, 0.040221065282821655, -0.0827663466334343, -0.041648704558610916, -0.10042817890644073, 0.0793679803609848, -0.05454527586698532, -0.04853279888629913, -0.09823943674564362, -0.00798810925334692, -0.015259291045367718, -0.13685369491577148, 0.02966357208788395, -0.046407055109739304, -0.08941345661878586, -0.010263040661811829, -0.05301850661635399, 0.07315399497747421, 0.15851083397865295, -0.018925199285149574, 0.021344948559999466, -0.005639883689582348, 0.19302205741405487, -0.12938454747200012, 0.008175985887646675, 0.06735975295305252, 0.056476470082998276, 0.014188381843268871, 0.09913892298936844, -0.003150283358991146, -0.07662414014339447, 0.041755545884370804, 0.05427214130759239, -0.020484760403633118, -0.2685870826244354, 
-0.042842090129852295, -0.027114219963550568, -0.038990117609500885, 0.11311972141265869, 0.04110991582274437, 0.02511524222791195, 0.04311753436923027, -0.02415364794433117, -0.0069740163162350655, 0.033299997448921204, 0.04974709451198578, -0.01657327264547348, 0.007775320205837488, 0.07332305610179901, -0.053015921264886856, -0.026000753045082092, 0.049191296100616455, 0.00891848374158144, 0.24066758155822754, -0.05818459764122963, 0.12342073768377304, 0.08574585616588593, 0.11441674828529358, 0.0012605193769559264, 0.07887046784162521, -0.02215621992945671, 0.01596008613705635, -0.007058643735945225, -0.03045468032360077, -0.06167607009410858, 0.029244983568787575, 0.012790462002158165, 0.004859077278524637, -0.12035258114337921, -0.023976489901542664, 0.022333331406116486, 0.3330729603767395, 0.05945485457777977, -0.23227368295192719, -0.05897141247987747, -0.013866073451936245, -0.06989358365535736, -0.09527546167373657, 0.06907075643539429, 0.09491332620382309, -0.14768323302268982, -0.030861198902130127, -0.027873054146766663, 0.09797041118144989, -0.08777769654989243, -0.05120837315917015, 0.06496226042509079, 0.051835209131240845, -0.008403420448303223, 0.08884294331073761, -0.2784516513347626, 0.20660178363323212, -0.01687752828001976, 0.11389818787574768, -0.010180382989346981, 0.028537459671497345, -0.04482107236981392, 0.023920461535453796, 0.17273131012916565, 0.007317425683140755, 0.023496702313423157, -0.05272449180483818, -0.09701549261808395, 0.014396067708730698, 0.04299008101224899, -0.07119827717542648, 0.08038114756345749, 0.0206614937633276, 0.014259777031838894, -0.014244837686419487, -0.10420481860637665, -0.1376732885837555, -0.1161387637257576, -0.005191927310079336, -0.08240622282028198, 0.04965845122933388, -0.03901338949799538, -0.05417150259017944, -0.01614302769303322, 0.15902015566825867, -0.10900223255157471, -0.07840541750192642, -0.09209205210208893, 0.0022995416074991226, 0.10044276714324951, -0.046705521643161774, 0.01804228685796261, 0.006710765417665243, 0.018944447860121727, -0.008059501647949219, 0.019071340560913086, 0.09363461285829544, -0.07997944951057434, -0.10682225227355957, -0.04186978191137314, 0.13883981108665466, 0.12288685142993927, 0.059439316391944885, -0.021885668858885765, 0.012938188388943672, -0.02201233059167862, -0.08866352587938309, -0.008561654016375542, -0.018362894654273987, 0.05037478730082512, 0.026638898998498917, -0.07302630692720413, -0.024958904832601547, -0.09064242243766785, -0.035411302000284195, 0.11393757909536362, 0.14599113166332245, -0.060905829071998596, 0.07428425550460815, 0.1776440143585205, -0.10304770618677139, -0.1765393614768982, 0.022834938019514084, 0.11183866113424301, 0.06687089055776596, -0.08856776356697083, -0.21928219497203827, 0.019050277769565582, 0.07700622081756592, 0.009357282891869545, -0.020911874249577522, -0.42480939626693726, -0.1329132616519928, 0.0955473780632019, 0.08069795370101929, -0.02898322604596615, -0.07211404293775558, -0.010016492567956448, 0.039405446499586105, -0.03585371747612953, 0.08846281468868256, -0.021751295775175095, 0.08227764070034027, 0.026531759649515152, -0.061346009373664856, 0.03825940564274788, -0.05563918873667717, 0.11718897521495819, 0.08335769921541214, 0.04867693409323692, -0.02958890050649643, 0.040857769548892975, 0.00504209054633975, -0.015460210852324963, 0.15355634689331055, 0.037040598690509796, 0.038450486958026886, -0.20757001638412476, -0.05326371639966965, -0.08684881776571274, 0.0005590515793301165, -0.07324671000242233, 
-0.04178541153669357, -0.046195562928915024, 0.08834134042263031, 0.06246236339211464, -0.0019050281262025237, -0.04896201938390732, -0.05993986129760742, -0.025148164480924606, 0.09418139606714249, 0.09685265272855759, 0.06971049308776855, -0.09517227113246918, 0.03683098405599594, 0.04693226143717766, 0.09311190992593765, -0.14631272852420807, -0.016524476930499077, 0.12334306538105011, -0.01665574684739113, 0.13237914443016052, -0.006157546769827604, -0.1366494745016098, -0.010447748005390167, 0.03413352742791176, -0.09803250432014465, -0.12289059907197952, -0.017300041392445564, -0.057433970272541046, -0.038132019340991974, -0.05907484516501427, 0.059401486068964005, -0.11341183632612228, -0.023168068379163742, -0.020189709961414337, 0.028516605496406555, -0.0793047621846199, 0.20922939479351044, 0.03447197005152702, 0.06441092491149902, -0.06703168153762817, 0.11238989233970642, 0.10240072757005692, -0.12847121059894562, 0.013649637810885906, 0.17766405642032623, -0.09559281915426254, -0.05422729626297951, 0.023154379799962044, 0.1353941261768341, -0.026580123230814934, -0.07237663865089417, -0.05881142616271973, -0.04435113072395325, 0.06469473242759705, 0.010675296187400818, 0.044593922793865204, 0.014590402133762836, -0.021541114896535873, -0.015163440257310867, -0.12871387600898743, 0.07207833975553513, 0.10633139312267303, -0.006478207651525736, 0.00297472532838583, 0.17149226367473602, 0.04948851093649864, 0.06377314776182175, -0.007993211038410664, -0.038427188992500305, -0.04741303622722626, 0.0627545416355133, -0.0007372595719061792, -0.024270912632346153, -0.04547880217432976, -0.0048019010573625565, -0.03064879961311817, -0.0073184543289244175, 0.002458953531458974, 0.01617748662829399, -0.07408157736063004, -0.02950350008904934, -0.03195376321673393, 0.04934757947921753, -0.07645513117313385, 0.0012544882483780384, 0.005893985275179148, -0.06933100521564484, 0.07364106178283691, 0.025683213025331497, -0.0007322211167775095, 0.008651258423924446, -0.0221620574593544, 0.05758359655737877, -0.052480753511190414, 0.007086238823831081, -0.017630629241466522, -0.09212476760149002, 0.05146895349025726, -0.005706692114472389, -0.0029151455964893103, -0.013452048413455486, 0.04618240147829056, -0.12702228128910065, 0.06382065266370773, -0.029762307181954384, -0.01891551911830902, -0.07763896137475967, 0.12997283041477203, 0.017973270267248154, 0.08953062444925308, 0.10079488903284073, -0.07115209847688675, 0.060670845210552216, -0.13274331390857697, -0.04602167382836342, 0.025489045307040215, 0.01708773337304592, -0.036589935421943665, -0.06965996325016022, 0.052342575043439865, -0.03782865032553673, 0.07865236699581146, 0.081782266497612, 0.058007340878248215, 0.035071469843387604, -0.13245666027069092, -0.009072762914001942, 0.05002903193235397, 0.04504209756851196, -0.0025170661974698305, 0.014513389207422733, -0.004426166880875826, 0.05312155559659004, -0.014600933529436588, 0.08427818864583969, 0.1340039223432541, 0.21614393591880798, 0.08546477556228638, 0.10198967903852463, -0.061510682106018066, -0.11652834713459015, -0.09029442816972733, 0.10641365498304367, -0.007538551464676857, 0.02516978047788143, -0.043652281165122986, 0.149387389421463, 0.08167988061904907, -0.1628980040550232, 0.08549467474222183, 0.01898515224456787, -0.10919652134180069, -0.09147778153419495, -0.04547538980841637, -0.028699133545160294, -0.07200164347887039, -0.011975143104791641, -0.08459605276584625, 0.043013960123062134, 0.06591689586639404, 0.0859794169664383, -0.027647726237773895, 
0.14348694682121277, -0.022398987784981728, -0.0672285184264183, 0.10295698046684265, -0.011624651029706001, 0.11044078320264816, -0.09787355363368988, 0.014231370761990547, 0.01583658903837204, -0.020464681088924408, 0.0613335557281971, 0.01494133286178112, -0.03522473946213722, 0.03468122333288193, 0.047633953392505646, -0.04198164865374565, 0.00005870604218216613, 0.037103306502103806, 0.1275070756673813, 0.1120312511920929, 0.07203015685081482, -0.05534660443663597, -0.022707700729370117, 0.20462192595005035, -0.03789839521050453, -0.09676040709018707, -0.1752423197031021, 0.16296935081481934, 0.0748639851808548, 0.02316189929842949, 0.030545305460691452, -0.08530110865831375, -0.005492969416081905, 0.23030370473861694, 0.13719476759433746, -0.04760066419839859, -0.043933771550655365, 0.036336831748485565, -0.00614806218072772, 0.031398527324199677, 0.100655697286129, 0.05092454329133034, 0.2080865055322647, -0.11971982568502426, 0.03224748745560646, -0.07948032766580582, -0.0578051432967186, -0.012393741868436337, 0.1662600338459015, 0.0063388836570084095, -0.017438553273677826, -0.05760559067130089, 0.10570371150970459, 0.006873412523418665, -0.1884397566318512, 0.07075244933366776, -0.0649990364909172, -0.12643882632255554, -0.021240023896098137, -0.019474105909466743, 0.010603762231767178, 0.050270963460206985, -0.004036503843963146, -0.015303218737244606, 0.13732708990573883, 0.0337313748896122, -0.05347372964024544, -0.15845109522342682, 0.0683021992444992, -0.012392167001962662, 0.19118748605251312, -0.0026297555305063725, 0.0941762775182724, 0.07195224612951279, 0.013259010389447212, -0.10023666173219681, 0.05968622490763664, 0.057760752737522125, 0.03523678705096245, 0.046917591243982315, 0.10819905251264572, -0.03024594485759735, 0.07756006717681885, 0.008461862802505493, -0.10046393424272537, 0.07083825767040253, -0.14546139538288116, -0.053659580647945404, -0.16365209221839905, 0.04498881474137306, -0.04456181451678276, 0.12429545819759369, 0.21616646647453308, -0.02013716660439968, 0.02200428582727909, -0.07306703180074692, 0.03665933012962341, -0.02745470404624939, 0.14069409668445587, 0.01312633790075779, -0.17787212133407593, 0.02830306999385357, -0.07254081219434738, 0.03221121057868004, -0.21463288366794586, -0.018282216042280197, 0.010679356753826141, -0.07145050168037415, -0.021140342578291893, 0.12291490286588669, 0.04039331153035164, 0.058750830590724945, -0.032588209956884384, -0.02513272501528263, -0.02055693417787552, 0.13895323872566223, -0.107358418405056, -0.08862505108118057 ]
null
null
transformers
# legal_t5_small_trans_cs_fr_small_finetuned model Model for translating legal text from Czech to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is first pretrained on all of the translation data with an unsupervised task. Then the model is trained on three parallel corpora from JRC-Acquis, Europarl and DCEP. ## Model description legal_t5_small_trans_cs_fr_small_finetuned is initially pretrained on an unsupervised task ("masked language modelling") with all of the data of the training set. legal_t5_small_trans_cs_fr_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from Czech to French. ### How to use Here is how to use this model to translate legal text from Czech to French in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Load the fine-tuned model; the tokenizer is shared with the base cs-fr model.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_fr", do_lower_case=False, skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

cs_text = "9:00 - 10:50 Komise (včetně odpovědí)"

pipeline([cs_text], max_length=512) ``` ## Training data The legal_t5_small_trans_cs_fr_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the randomly masked portions of a sentence. ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results : | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_cs_fr_small_finetuned | 50.717| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
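As a supplementary, hedged sketch (not part of the original card): the BLEU score reported above could in principle be reproduced with `sacrebleu`, assuming a line-aligned Czech/French test split is available locally. The file names `test.cs` and `test.fr` below are hypothetical placeholders, since the actual test split of the JRC-Acquis/Europarl/DCEP data is not distributed with the card.

```python
# Hedged evaluation sketch (not from the original card). Assumes
# `pip install sacrebleu` and hypothetical line-aligned test files
# test.cs / test.fr with one sentence per line.
import sacrebleu
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_fr", do_lower_case=False, skip_special_tokens=True),
    device=0,  # set to -1 for CPU
)

with open("test.cs", encoding="utf-8") as f:
    sources = [line.strip() for line in f]
with open("test.fr", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# Translate in small batches to keep GPU memory bounded.
hypotheses = []
for start in range(0, len(sources), 32):
    batch = pipeline(sources[start:start + 32], max_length=512)
    hypotheses.extend(item["translation_text"] for item in batch)

# corpus_bleu takes the hypotheses plus a list of reference streams.
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
```

`corpus_bleu` returns a `BLEUScore` object whose `.score` attribute is the corpus-level BLEU, the same convention used in WMT-style evaluations.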
{"language": "Cszech French", "tags": ["translation Cszech French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "9:00 - 10:50 Komise (v\u010detn\u011b odpov\u011bd\u00ed)"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_fr_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_fr\_small\_finetuned model
=======================================================


Model for translating legal text from Czech to French. It was first released in this repository. This model is first pretrained on all of the translation data with an unsupervised task. Then the model is trained on three parallel corpora from JRC-Acquis, Europarl and DCEP.


Model description
-----------------


legal\_t5\_small\_trans\_cs\_fr\_small\_finetuned is initially pretrained on an unsupervised task ("masked language modelling") with all of the data of the training set. legal\_t5\_small\_trans\_cs\_fr\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.


Intended uses & limitations
---------------------------


The model could be used for translation of legal texts from Czech to French.


### How to use


Here is how to use this model to translate legal text from Czech to French in PyTorch:


Training data
-------------


The legal\_t5\_small\_trans\_cs\_fr\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.


### Preprocessing


A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.


### Pretraining


The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the randomly masked portions of a sentence.


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results :


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.0692271813750267, 0.09437630325555801, -0.0032638711854815483, 0.08332384377717972, 0.04291651397943497, 0.002294868230819702, 0.0375983789563179, 0.08566246181726456, -0.02396470122039318, 0.0839226171374321, 0.04610811918973923, -0.03719649836421013, 0.059329479932785034, 0.03394132852554321, 0.07471419870853424, -0.20414625108242035, 0.006097340025007725, -0.04104628786444664, -0.012959234416484833, 0.10215118527412415, 0.09520938247442245, -0.05530788004398346, 0.05145301669836044, -0.022996529936790466, -0.026827121153473854, 0.03105229139328003, -0.09488292783498764, -0.044013164937496185, 0.0859735831618309, 0.09249705076217651, 0.08423937112092972, -0.014164887368679047, 0.06766632944345474, -0.19434696435928345, -0.0023763368371874094, 0.07741232961416245, -0.010750318877398968, 0.04467487707734108, 0.1390862911939621, -0.008222564123570919, 0.15824571251869202, -0.05356078967452049, 0.02700980193912983, 0.04333114996552467, -0.11754197627305984, -0.10470432043075562, -0.05533990263938904, 0.022845594212412834, 0.07494489848613739, 0.13619458675384521, -0.04888276010751724, 0.05925847589969635, -0.06747512519359589, 0.06978785246610641, 0.08034293353557587, -0.21972839534282684, -0.03396394103765488, 0.02226741798222065, 0.05098219960927963, 0.0741301029920578, -0.05422583222389221, -0.013482828624546528, 0.046077910810709, 0.05863789841532707, 0.02777210809290409, -0.06885361671447754, -0.05854993686079979, -0.054417505860328674, -0.12850822508335114, -0.06401731818914413, 0.15642736852169037, 0.027867764234542847, -0.0514584481716156, -0.08182314783334732, -0.06141887605190277, -0.0740489810705185, -0.022452332079410553, -0.0382467620074749, 0.03733235225081444, -0.006372016854584217, 0.07350347936153412, -0.014368400909006596, -0.10845332592725754, -0.06104658171534538, -0.06706414371728897, 0.11888634413480759, 0.040878232568502426, 0.012533038854598999, 0.034504156559705734, 0.08710768818855286, -0.12951555848121643, -0.07285379618406296, 0.0028718910180032253, 0.01633419841527939, -0.11787112802267075, -0.0019575408659875393, -0.0061705405823886395, -0.14603856205940247, -0.010326554998755455, 0.045384783297777176, -0.07640084624290466, 0.0560467354953289, 0.07373072952032089, 0.04261854663491249, 0.05727722868323326, 0.11623348295688629, -0.1218685507774353, -0.1254831850528717, -0.03752357140183449, 0.004972556605935097, -0.00897524505853653, 0.024293409660458565, -0.05431593582034111, -0.036782458424568176, 0.003116099163889885, 0.028343727812170982, 0.011445589363574982, 0.01057283766567707, -0.018314244225621223, -0.03136977553367615, 0.12904444336891174, -0.10717970132827759, 0.0033160068560391665, 0.005167634692043066, -0.09900543838739395, -0.023769408464431763, 0.0710420086979866, -0.011765980161726475, -0.12104660272598267, 0.06669215112924576, -0.03748481348156929, -0.024623803794384003, -0.10830125212669373, -0.16438736021518707, -0.01413993164896965, 0.023262841627001762, -0.07132048159837723, -0.09505612403154373, -0.11150339990854263, -0.09726053476333618, 0.04683242365717888, -0.0638304054737091, 0.0014668876538053155, -0.07215484231710434, -0.008477830328047276, 0.0019367117201909423, 0.0009646757389418781, 0.10357946157455444, -0.04698504880070686, 0.01958821900188923, 0.014721653424203396, 0.06449917703866959, 0.02465890906751156, 0.029012534767389297, -0.11336801201105118, 0.050960101187229156, -0.1530149132013321, 0.15140533447265625, -0.03079659305512905, -0.011795333586633205, -0.13623376190662384, -0.06077253073453903, -0.08659062534570694, 
0.056025996804237366, 0.06579352915287018, 0.1270764321088791, -0.19431909918785095, -0.015386555343866348, 0.20736373960971832, -0.08310522139072418, -0.07739533483982086, 0.14431247115135193, -0.024030007421970367, 0.0206166859716177, 0.07225209474563599, 0.1098795011639595, 0.07585686445236206, -0.015397544018924236, -0.04727925732731819, 0.02174648828804493, 0.03121410310268402, 0.07548294961452484, 0.0990714505314827, -0.07402735203504562, 0.02320161648094654, 0.0065553621388971806, 0.005916757974773645, -0.0012437795521691442, -0.025862744078040123, -0.035003796219825745, 0.009373454377055168, -0.0387597531080246, -0.024402566254138947, 0.03256351873278618, 0.006890469696372747, -0.05577654018998146, -0.09089841693639755, -0.057648915797472, 0.10649928450584412, -0.05345357954502106, 0.015243430621922016, -0.003891564905643463, -0.0531313493847847, -0.09508351981639862, 0.009004209190607071, -0.16516749560832977, -0.01933949813246727, 0.03573077917098999, -0.076509989798069, 0.1085120290517807, 0.06130129098892212, 0.05416259169578552, 0.1006258949637413, -0.06092138588428497, -0.03176625818014145, -0.0060738190077245235, -0.026001114398241043, -0.09553488343954086, -0.1267254799604416, -0.03678281977772713, -0.021288473159074783, 0.008174349553883076, -0.1252179592847824, -0.0045201536267995834, -0.06502095609903336, 0.08434311300516129, 0.002448522951453924, -0.017949488013982773, 0.01534469984471798, 0.0638362392783165, -0.025702625513076782, -0.031575169414281845, 0.022529231384396553, -0.013559463433921337, -0.024990959092974663, 0.09298083931207657, -0.15108607709407806, -0.10722588002681732, 0.08191408962011337, 0.006655549630522728, -0.13215366005897522, -0.012827014550566673, -0.021850652992725372, -0.059232089668512344, -0.05052896589040756, -0.04886304587125778, 0.24382351338863373, 0.03754451870918274, 0.15010669827461243, -0.11137741059064865, -0.034191686660051346, 0.01262735202908516, -0.021468229591846466, 0.0022955583408474922, 0.17633558809757233, 0.08510244637727737, -0.13428206741809845, 0.08457640558481216, 0.012135487981140614, -0.02812020853161812, 0.09687211364507675, 0.059697940945625305, -0.10183291137218475, -0.010954192839562893, 0.05160845071077347, 0.0029908977448940277, 0.06226952001452446, -0.09506730735301971, -0.012511161155998707, 0.024804914370179176, 0.04855795577168465, 0.058035410940647125, -0.0887453705072403, 0.07023803144693375, 0.05936381220817566, -0.0317659005522728, 0.04564240202307701, -0.04698899760842323, -0.03446158766746521, 0.10487150400876999, 0.025702420622110367, -0.05220569297671318, -0.0536944679915905, -0.054928332567214966, -0.11204375326633453, 0.2062622308731079, -0.061476245522499084, -0.20669913291931152, -0.10689456015825272, 0.07605959475040436, -0.04922128841280937, 0.03648831695318222, 0.02427281066775322, -0.035456880927085876, -0.066674143075943, -0.13190674781799316, 0.06577431410551071, -0.08136677742004395, -0.04463374614715576, -0.13008622825145721, 0.027996648102998734, 0.004581443965435028, -0.12515734136104584, 0.021589364856481552, -0.002406422048807144, -0.02993146702647209, -0.012707038782536983, -0.032211873680353165, 0.11542263627052307, 0.13699385523796082, -0.02551153674721718, -0.03413267061114311, 0.0017298428574576974, 0.137416809797287, -0.09776201099157333, 0.05726621672511101, 0.05680849775671959, 0.05082260072231293, 0.033915918320417404, 0.14529933035373688, 0.02710145153105259, -0.05283290520310402, 0.04290388152003288, 0.06123108044266701, -0.019239436835050583, -0.2125961035490036, 
-0.12216121703386307, -0.07139503955841064, 0.01939212717115879, 0.1206044852733612, 0.04063316062092781, -0.03864983096718788, 0.022551333531737328, -0.06422740966081619, 0.052704326808452606, -0.0012108663795515895, 0.054539136588573456, 0.027453621849417686, -0.004891946446150541, 0.07942941784858704, -0.06308364868164062, -0.05891132727265358, 0.10346480458974838, 0.027122952044010162, 0.18219082057476044, -0.05551198124885559, 0.20756715536117554, 0.057748109102249146, 0.03524455428123474, 0.00708374148234725, 0.055267587304115295, -0.03504904359579086, 0.024577511474490166, -0.0310648325830698, -0.06287325173616409, -0.023871399462223053, 0.05421808734536171, 0.014998020604252815, 0.009105445817112923, -0.06603151559829712, -0.06440486013889313, 0.05708114057779312, 0.20729193091392517, 0.05510001257061958, -0.19868957996368408, -0.05005647987127304, -0.0013850396499037743, -0.07071519643068314, -0.06684283167123795, 0.023270828649401665, 0.1570211797952652, -0.1041218712925911, 0.007889777421951294, 0.019532054662704468, 0.12011556327342987, -0.1197020560503006, -0.01144466269761324, 0.04930899664759636, 0.04858681559562683, -0.01554782036691904, 0.12651148438453674, -0.23012889921665192, 0.15291081368923187, 0.011403818614780903, 0.04061825945973396, -0.03671518713235855, 0.017720898613333702, -0.03222960978746414, 0.01963054947555065, 0.13264252245426178, 0.020437218248844147, -0.008322881534695625, -0.10305376350879669, -0.09228058904409409, -0.002422519726678729, 0.06763017177581787, -0.07494968920946121, 0.0892818495631218, 0.04879531264305115, -0.00045215137652121484, -0.022343182936310768, 0.014784679748117924, -0.06702936440706253, -0.1656801700592041, 0.010796274989843369, -0.027062542736530304, -0.02671179175376892, -0.012478061951696873, -0.022869005799293518, -0.06687168031930923, 0.2019350379705429, -0.09480027109384537, -0.060736000537872314, -0.07173454761505127, -0.01631939597427845, 0.14170411229133606, -0.07447471469640732, 0.01722903922200203, -0.008378132246434689, 0.05375512316823006, -0.03134741634130478, -0.023154059424996376, 0.08767449110746384, -0.09798364341259003, -0.08643798530101776, -0.07236883789300919, 0.11765200644731522, 0.07349834591150284, 0.04729009419679642, -0.009422356262803078, 0.032094113528728485, -0.02311788499355316, -0.12649552524089813, -0.0353257916867733, 0.006344841327518225, 0.11259700357913971, 0.06867990642786026, -0.04923176392912865, -0.04462656006217003, -0.06457242369651794, -0.04177911579608917, 0.08629383891820908, 0.16156023740768433, -0.05043632909655571, 0.0346648171544075, 0.19711321592330933, -0.09764599800109863, -0.19490788877010345, -0.053087685257196426, 0.08729282021522522, 0.06067529693245888, -0.026803871616721153, -0.1697872430086136, 0.04355541244149208, 0.08852031826972961, 0.0051800585351884365, 0.0553804375231266, -0.3903793394565582, -0.13708804547786713, 0.06679381430149078, 0.026147376745939255, -0.037369973957538605, -0.11509846150875092, -0.04958588629961014, -0.06157796457409859, -0.01886846497654915, 0.09993159770965576, -0.04207438975572586, 0.08946334570646286, -0.000056244964071083814, -0.0008357935585081577, 0.0427398644387722, -0.037150461226701736, 0.1353071630001068, 0.0487409308552742, 0.04444407671689987, -0.051389358937740326, 0.06463939696550369, 0.013353883288800716, -0.013827841728925705, 0.14949147403240204, -0.038477957248687744, 0.05474255979061127, -0.144733265042305, -0.04818180575966835, -0.07227035611867905, 0.02825324982404709, -0.04359303042292595, -0.059878669679164886, 
-0.05667896941304207, 0.04537859931588173, 0.06096429377794266, 0.00410888297483325, -0.03190472349524498, -0.04821494221687317, -0.0011479438981041312, 0.1800152063369751, 0.11368725448846817, 0.028210008516907692, -0.12321072816848755, 0.04565101116895676, 0.011328344233334064, 0.07647644728422165, -0.06767220795154572, 0.014933722093701363, 0.13388071954250336, 0.018127165734767914, 0.11473456025123596, -0.009720472618937492, -0.13621360063552856, -0.026298541575670242, 0.051065754145383835, -0.097334124147892, -0.12903878092765808, -0.011702881194651127, 0.040779728442430496, -0.07268361747264862, -0.044644251465797424, 0.0877237617969513, -0.09247571229934692, -0.01896130107343197, 0.004522804170846939, 0.026500245556235313, -0.03734592720866203, 0.19078350067138672, 0.03044222481548786, 0.04844444990158081, -0.06403014063835144, 0.10080558061599731, 0.13234515488147736, -0.16187991201877594, 0.005491333547979593, 0.1972927451133728, -0.08412265032529831, -0.06719549745321274, 0.021167892962694168, 0.1287437528371811, -0.03666301816701889, -0.06263600289821625, -0.030647553503513336, -0.05056757852435112, 0.032296016812324524, 0.00906425528228283, 0.044194262474775314, 0.03299960866570473, -0.004028579220175743, -0.02651181071996689, -0.0988272875547409, 0.09846153110265732, 0.0824630856513977, 0.031165406107902527, -0.0031221292447298765, 0.12333661317825317, 0.025464918464422226, -0.008749987930059433, -0.012199238874018192, 0.006303696893155575, -0.0677771344780922, 0.007500160485506058, -0.06424727290868759, 0.017682576552033424, -0.04414333403110504, -0.007972058840095997, -0.023889997974038124, 0.00553329149261117, -0.014792128466069698, -0.0036280632484704256, -0.03541262447834015, -0.05038570240139961, -0.03231413662433624, 0.03077080100774765, -0.09675326943397522, -0.026587143540382385, 0.024681588634848595, -0.028858499601483345, 0.05092279613018036, 0.00016418735322076827, 0.016173986718058586, -0.012438939884305, -0.02917838841676712, 0.05313297361135483, -0.0058486550115048885, 0.05052689462900162, -0.013387415558099747, -0.08824661374092102, 0.023470846936106682, 0.02002127841114998, 0.002732204971835017, -0.016834639012813568, 0.01595449075102806, -0.1451484113931656, 0.0023239420261234045, -0.019787942990660667, -0.04295755922794342, -0.08244078606367111, 0.09832391142845154, 0.03719048574566841, 0.06591492891311646, 0.10174249857664108, -0.0724639818072319, 0.0741899386048317, -0.1587437391281128, -0.012796899303793907, 0.022468579933047295, 0.01025855727493763, -0.026952436193823814, -0.009435801766812801, 0.04434804245829582, -0.05541335791349411, 0.1431039720773697, 0.045096613466739655, 0.07903196662664413, 0.02390260621905327, -0.07402071356773376, -0.009974729269742966, 0.031011691316962242, 0.059077903628349304, -0.026087358593940735, -0.025213897228240967, -0.0638376772403717, 0.08986175060272217, 0.004594907630234957, 0.07363752275705338, 0.054205864667892456, 0.12446660548448563, 0.11723396927118301, 0.03913186863064766, -0.018185218796133995, -0.11110322177410126, -0.055054325610399246, 0.055930837988853455, -0.0037844073958694935, 0.04104460030794144, -0.03370954468846321, 0.1032659262418747, 0.11159368604421616, -0.14252126216888428, 0.13410481810569763, 0.004917744547128677, -0.08367883414030075, -0.045730110257864, -0.12411388754844666, -0.041484732180833817, -0.01701909489929676, -0.04942270368337631, -0.10642953962087631, 0.026403706520795822, 0.07811698317527771, 0.05320301279425621, -0.02049211971461773, 0.1368761658668518, -0.07913105189800262, 
-0.10942080616950989, 0.052090007811784744, 0.018902696669101715, 0.0971176028251648, 0.019335893914103508, 0.03507808595895767, 0.05032573640346527, 0.0006293555488809943, 0.04567863047122955, 0.058595675975084305, -0.009349627420306206, 0.004524590913206339, 0.015885859727859497, -0.0523383729159832, -0.03699358552694321, 0.004095885902643204, 0.0817524716258049, 0.20195205509662628, 0.05427513271570206, -0.0739816352725029, -0.018798597157001495, 0.19489772617816925, -0.06640159338712692, -0.08015195280313492, -0.10678993910551071, 0.21370939910411835, 0.03254898637533188, 0.03410536050796509, -0.007871387526392937, -0.09387911856174469, -0.008702099323272705, 0.14879491925239563, 0.191430002450943, -0.05519453063607216, -0.02606114372611046, 0.027158835902810097, -0.005378546193242073, 0.046876825392246246, 0.031405434012413025, 0.02953943982720375, 0.2790197730064392, -0.09302309155464172, 0.08917435258626938, -0.043509624898433685, 0.013069681823253632, -0.00662320526316762, 0.17484739422798157, 0.0038438071496784687, 0.022187987342476845, -0.0715804472565651, 0.07439074665307999, -0.015738166868686676, -0.17940323054790497, 0.027680546045303345, -0.0888432115316391, -0.10950063169002533, 0.010727393440902233, -0.026533154770731926, 0.06249060854315758, 0.06888611614704132, 0.010453601367771626, 0.02149910293519497, 0.0722355842590332, 0.013961861841380596, -0.11032772809267044, -0.1313713639974594, 0.004463031888008118, -0.01705973409116268, 0.15369723737239838, 0.0022139560896903276, 0.1355184018611908, 0.08158543705940247, -0.005622255150228739, -0.09936907887458801, 0.07164379954338074, 0.03176884353160858, 0.03149126470088959, 0.08036459237337112, 0.09924296289682388, 0.0009246519184671342, 0.07150480896234512, 0.027685215696692467, -0.07283743470907211, 0.03224636986851692, -0.05298725515604019, -0.018434742465615273, -0.1342037320137024, 0.08974621444940567, -0.04153493046760559, 0.14587685465812683, 0.18620334565639496, -0.015498707070946693, -0.009848437272012234, -0.051584236323833466, 0.007633153814822435, -0.016650112345814705, 0.08951640874147415, -0.009214052930474281, -0.1516898274421692, 0.039397794753313065, -0.08257108926773071, 0.03689480572938919, -0.2613504230976105, -0.021148407831788063, 0.01964917592704296, -0.04740331694483757, 0.0013070704881101847, 0.07406293600797653, 0.039629314094781876, 0.04516981542110443, -0.054933249950408936, -0.0845843106508255, 0.0012471757363528013, 0.10009884834289551, -0.08905588835477829, -0.1133355051279068 ]
null
null
transformers
# legal_t5_small_trans_cs_it model Model for translating legal text from Czech to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from JRC-Acquis, Europarl and DCEP. ## Model description legal_t5_small_trans_cs_it is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from Czech to Italian. ### How to use Here is how to use this model to translate legal text from Czech to Italian in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Load the model and its tokenizer; device=0 selects the first GPU (use -1 for CPU).
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_cs_it", do_lower_case=False, skip_special_tokens=True),
    device=0
)

cs_text = "– Měly by se podporovat normy sportovní správy prostřednictvím výměny osvědčených postupů."

pipeline([cs_text], max_length=512) ``` ## Training data The legal_t5_small_trans_cs_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results : | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_cs_it | 46.67| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
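For decoding control beyond what `TranslationPipeline` exposes, the model can also be driven through `generate` directly. The following is a minimal sketch, not part of the original card; the beam size and length limits are illustrative assumptions rather than settings prescribed by the authors.

```python
# Hedged sketch: driving the model through generate() instead of the
# pipeline. Beam size and length limits are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_it", do_lower_case=False)
model = AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_it")
model.eval()

cs_text = "– Měly by se podporovat normy sportovní správy prostřednictvím výměny osvědčených postupů."

# Tokenize, truncating to the 512-token training length.
inputs = tokenizer(cs_text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_length=512, num_beams=4, early_stopping=True)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Working at this level makes it straightforward to experiment with sampling, length penalties, or returning multiple beam hypotheses, which the pipeline wrapper does not surface.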
{"language": "Cszech Italian", "tags": ["translation Cszech Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "\u2013 M\u011bly by se podporovat normy sportovn\u00ed spr\u00e1vy prost\u0159ednictv\u00edm v\u00fdm\u011bny osv\u011bd\u010den\u00fdch postup\u016f."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_it model
=====================================


Model for translating legal text from Czech to Italian. It was first released in this repository. This model is trained on three parallel corpora from JRC-Acquis, Europarl and DCEP.


Model description
-----------------


legal\_t5\_small\_trans\_cs\_it is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.


Intended uses & limitations
---------------------------


The model could be used for translation of legal texts from Czech to Italian.


### How to use


Here is how to use this model to translate legal text from Czech to Italian in PyTorch:


Training data
-------------


The legal\_t5\_small\_trans\_cs\_it model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.


### Preprocessing


A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.


### Pretraining


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results :


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.1384115070104599, 0.09306392818689346, -0.001914430409669876, 0.07621260732412338, 0.09747358411550522, 0.020701540634036064, 0.08461493253707886, 0.10891062766313553, -0.055642034858465195, 0.08676808327436447, 0.053881190717220306, 0.027890728786587715, 0.09671226888895035, 0.12221794575452805, 0.032402005046606064, -0.2204861342906952, 0.030712520703673363, -0.01990353688597679, -0.003314114874228835, 0.14523863792419434, 0.11996465921401978, -0.09383503347635269, 0.02171197347342968, -0.03164685145020485, -0.1336164027452469, -0.004332350101321936, -0.05790788680315018, -0.06596605479717255, 0.07655498385429382, 0.04766998440027237, 0.11665039509534836, 0.020163999870419502, 0.0934649184346199, -0.1581672877073288, -0.0011335305171087384, 0.09023100882768631, 0.049028389155864716, 0.03781509026885033, 0.08415774255990982, -0.009595518931746483, 0.15289714932441711, -0.0067611816339194775, 0.046609267592430115, 0.020032064989209175, -0.11062916368246078, -0.10721968114376068, -0.06385084241628647, 0.04177498072385788, 0.13692708313465118, 0.15033210813999176, -0.05660266429185867, 0.08113659173250198, -0.1317039132118225, 0.061173174530267715, 0.07188700884580612, -0.2642331123352051, -0.07421621680259705, 0.014332694932818413, 0.031153809279203415, 0.08907357603311539, -0.045461587607860565, -0.032939281314611435, 0.03139766305685043, 0.014970771037042141, -0.012302497401833534, -0.010548542253673077, 0.007226594258099794, -0.018183913081884384, -0.16856375336647034, -0.09884580969810486, 0.17407013475894928, -0.0015874287346377969, -0.07635212689638138, -0.08952534198760986, -0.03274717926979065, -0.14778682589530945, 0.003442782675847411, -0.05085631087422371, 0.0327792651951313, 0.0004310273507144302, 0.042670704424381256, -0.0006482243188656867, -0.11869053542613983, -0.11906231939792633, 0.04405084624886513, 0.11708981543779373, 0.10013553500175476, -0.01371160801500082, 0.017195774242281914, 0.1649562418460846, 0.014854071661829948, -0.0855419859290123, -0.014029105193912983, 0.013579769991338253, -0.11703258007764816, -0.027477530762553215, -0.03312654420733452, -0.1392800360918045, -0.059186555445194244, 0.09411083161830902, -0.0003214809112250805, 0.049379732459783554, 0.037921562790870667, 0.05020536482334137, 0.0048574344255030155, 0.16031084954738617, -0.09250486642122269, -0.06162848323583603, -0.06166968122124672, 0.05832501873373985, -0.06145758181810379, 0.01596831902861595, -0.027949485927820206, -0.016899308189749718, 0.05272549018263817, 0.08278223872184753, -0.06842902302742004, 0.006249709520488977, -0.04778632894158363, -0.031446922570466995, 0.038589540868997574, -0.13006362318992615, -0.02842811495065689, 0.006218628492206335, -0.11485835164785385, -0.013888639397919178, 0.061769016087055206, -0.017837196588516235, -0.14140664041042328, 0.06420546770095825, -0.035476602613925934, -0.01831594854593277, -0.14255204796791077, -0.09579654783010483, -0.02298290841281414, -0.07396576553583145, -0.04924315586686134, -0.0626581534743309, -0.14570845663547516, -0.10277383029460907, 0.06725189089775085, -0.06420723348855972, -0.03332138806581497, -0.0889580249786377, -0.008985625579953194, -0.006388834211975336, -0.02859729714691639, 0.11674070358276367, -0.024647211655974388, 0.09747185558080673, 0.0036089152563363314, 0.042847856879234314, 0.13786594569683075, 0.07344738394021988, -0.10566367208957672, 0.011177453212440014, -0.09593386203050613, 0.16322623193264008, -0.02243717573583126, -0.006745549850165844, -0.1483781784772873, -0.07678641378879547, 
-0.06506521999835968, 0.059263523668050766, 0.11374818533658981, 0.15321855247020721, -0.16068129241466522, -0.010580970905721188, 0.21289677917957306, -0.09817574918270111, -0.040650732815265656, 0.12039458751678467, -0.03724249452352524, 0.1468917280435562, 0.0873812586069107, 0.1722119003534317, 0.052555423229932785, -0.08590857684612274, -0.009147955104708672, -0.039860982447862625, -0.004017935134470463, -0.009605741128325462, 0.0875793844461441, -0.03970177471637726, -0.04395423084497452, -0.006843439768999815, -0.11565174162387848, 0.026054246351122856, -0.06919734179973602, -0.06963510811328888, 0.012776199728250504, -0.06678266823291779, -0.03793685883283615, 0.06486356258392334, 0.055468205362558365, -0.05068044364452362, -0.13412952423095703, -0.03848609700798988, 0.10694961994886398, -0.08096995949745178, 0.02820061333477497, -0.06646890938282013, -0.012915203347802162, -0.060900572687387466, -0.011516590602695942, -0.16084074974060059, 0.035049691796302795, 0.05906764417886734, -0.006506549194455147, 0.03854253515601158, 0.034877244383096695, 0.03997272253036499, 0.05883780121803284, -0.005992081947624683, -0.05711686238646507, -0.04268985986709595, -0.04352141544222832, -0.12702897191047668, -0.11703407764434814, -0.044765159487724304, -0.01601267233490944, 0.11272815614938736, -0.1868106573820114, 0.03835729882121086, -0.07825425267219543, 0.055963851511478424, -0.0269265566021204, -0.0493503212928772, 0.010465948842465878, 0.029867438599467278, 0.018636824563145638, -0.06714150309562683, 0.04714157432317734, 0.03247940167784691, 0.03763120621442795, 0.07927043735980988, -0.09919513761997223, -0.13756047189235687, 0.08321958035230637, 0.04739344120025635, -0.14849211275577545, -0.006460624281316996, -0.04101661965250969, -0.06275422871112823, -0.05149279162287712, 0.006232640706002712, 0.22374267876148224, 0.01419212855398655, 0.1356734037399292, -0.11194052547216415, -0.027316387742757797, 0.0029818119946867228, -0.016660859808325768, 0.007198646664619446, 0.13712821900844574, 0.07873938232660294, -0.09017592668533325, 0.053736474364995956, 0.023921199142932892, -0.033175159245729446, 0.15421298146247864, 0.00853816419839859, -0.13947661221027374, 0.019910821691155434, 0.0849158838391304, -0.022212442010641098, 0.08892425149679184, -0.1500367969274521, 0.007074576802551746, 0.012608888559043407, 0.04446927830576897, 0.0622188001871109, -0.15990574657917023, 0.0211781095713377, 0.04972118139266968, -0.048741478472948074, 0.012908338569104671, -0.0253788810223341, -0.06426642835140228, 0.0888272076845169, 0.019958289340138435, -0.04432549700140953, -0.016642969101667404, -0.038127996027469635, -0.1306881457567215, 0.20224051177501678, -0.0569411925971508, -0.139183908700943, -0.10187964141368866, 0.09823444485664368, 0.0628260150551796, 0.0015509588411077857, 0.031146986410021782, -0.07830451428890228, -0.04229185730218887, -0.08543805778026581, 0.08478198945522308, -0.051033131778240204, -0.0513639859855175, -0.09941022098064423, -0.0012837137328460813, -0.022244315594434738, -0.11892426013946533, 0.03310883790254593, -0.0453474298119545, -0.09647602587938309, -0.006895453203469515, -0.07267316430807114, 0.08985299617052078, 0.16958695650100708, -0.011427691206336021, 0.03407156467437744, -0.004909746814519167, 0.16380183398723602, -0.1431572586297989, 0.01692245900630951, 0.0863376334309578, 0.035457439720630646, 0.003399480599910021, 0.10009647905826569, -0.007432915735989809, -0.08845894038677216, 0.03528619185090065, 0.04705255478620529, -0.021387308835983276, 
-0.2706397771835327, -0.03612427040934563, -0.029534384608268738, -0.03622144088149071, 0.10474375635385513, 0.03707868978381157, 0.013613627292215824, 0.05251546576619148, -0.029182417318224907, -0.03273787349462509, 0.034833263605833054, 0.04887515679001808, -0.03869311511516571, 0.0043257782235741615, 0.0753369927406311, -0.049041908234357834, -0.0159163735806942, 0.052682388573884964, 0.028009753674268723, 0.2592799961566925, -0.057987041771411896, 0.13546167314052582, 0.07596749067306519, 0.127146914601326, 0.005493358708918095, 0.07761545479297638, -0.012052573263645172, 0.006181562785059214, -0.0002885986177716404, -0.03307916596531868, -0.07090244442224503, 0.03590882197022438, 0.015138012357056141, -0.0035608913749456406, -0.10926125943660736, -0.017503133043646812, 0.021578093990683556, 0.351484090089798, 0.05467407405376434, -0.24070391058921814, -0.05857644975185394, -0.01032277848571539, -0.05450296774506569, -0.09208402782678604, 0.05454271286725998, 0.0948881283402443, -0.13413208723068237, -0.03463069722056389, -0.02824040688574314, 0.09924998134374619, -0.10950140655040741, -0.05752074345946312, 0.05762656405568123, 0.05297673121094704, -0.01258007064461708, 0.10187556594610214, -0.2780762314796448, 0.20755650103092194, -0.011127164587378502, 0.14146338403224945, -0.03094199113547802, 0.023541554808616638, -0.04953838139772415, 0.027546603232622147, 0.17085042595863342, -0.0020889199804514647, 0.025913136079907417, -0.08020108193159103, -0.11339595913887024, 0.013801113702356815, 0.060809098184108734, -0.06280085444450378, 0.08423388749361038, 0.014784825965762138, 0.027905061841011047, -0.0075577786192297935, -0.08773671090602875, -0.11753252893686295, -0.11006766557693481, 0.006182156968861818, -0.06910321861505508, 0.040892671793699265, -0.03832658752799034, -0.06549220532178879, 0.001108734286390245, 0.16210795938968658, -0.10729049891233444, -0.07688459008932114, -0.10192502290010452, 0.027532240375876427, 0.09083598107099533, -0.04684905707836151, -0.0005024804268032312, 0.01102404948323965, 0.021760324016213417, -0.0038965523708611727, 0.024502180516719818, 0.10357705503702164, -0.07092577964067459, -0.11871969699859619, -0.045337215065956116, 0.1243186742067337, 0.12753382325172424, 0.0553780272603035, -0.01807941496372223, 0.007446132600307465, -0.008444196544587612, -0.07108058035373688, 0.0047529395669698715, -0.009091876447200775, 0.049763742834329605, 0.034351058304309845, -0.08262093365192413, -0.015208565630018711, -0.09635020047426224, -0.05017874762415886, 0.1005346029996872, 0.14370456337928772, -0.05664664879441261, 0.04488448053598404, 0.16853363811969757, -0.11052443832159042, -0.17787595093250275, 0.04158671572804451, 0.10013602674007416, 0.0781574696302414, -0.08527768403291702, -0.216670960187912, 0.01554129458963871, 0.10027853399515152, 0.0025331892538815737, -0.006705798674374819, -0.42523375153541565, -0.12234030663967133, 0.08766322582960129, 0.09529022872447968, -0.03199101611971855, -0.07678108662366867, -0.00888736266642809, 0.043617717921733856, -0.03087291680276394, 0.06075391545891762, -0.007593824062496424, 0.08937829732894897, 0.028941864147782326, -0.06797579675912857, 0.04516872763633728, -0.06229523941874504, 0.11004526913166046, 0.07667770236730576, 0.04384850710630417, -0.04082844406366348, 0.03396587073802948, 0.003871163120493293, -0.008494559675455093, 0.15071170032024384, 0.03449289873242378, 0.024085111916065216, -0.19055786728858948, -0.0657728835940361, -0.08434999734163284, 0.0029230283107608557, -0.07126547396183014, 
-0.04809111729264259, -0.03703230619430542, 0.09404019266366959, 0.0495629757642746, 0.005255541298538446, -0.03349157050251961, -0.08004520833492279, -0.021548695862293243, 0.09283791482448578, 0.09770780056715012, 0.0640304833650589, -0.07968270778656006, 0.016851507127285004, 0.045518774539232254, 0.08800879865884781, -0.1218750849366188, -0.021042337641119957, 0.12586408853530884, -0.018876805901527405, 0.1273965984582901, -0.003947087563574314, -0.14271879196166992, -0.00263559864833951, 0.044682685285806656, -0.09034278243780136, -0.11970487982034683, -0.013161967508494854, -0.05358392000198364, -0.04611223191022873, -0.034843090921640396, 0.06283613294363022, -0.1132081151008606, -0.019336288794875145, -0.024570338428020477, 0.039592236280441284, -0.07617941498756409, 0.22920095920562744, 0.039069756865501404, 0.050938695669174194, -0.06656762957572937, 0.13978494703769684, 0.10080832988023758, -0.12281151860952377, 0.014341653324663639, 0.17420470714569092, -0.09708167612552643, -0.05500677973031998, 0.024952756240963936, 0.11891096085309982, -0.025625178590416908, -0.08141554147005081, -0.07501091808080673, -0.042132724076509476, 0.057696301490068436, -0.010120335035026073, 0.04760466143488884, 0.019543783739209175, -0.039566028863191605, -0.008275862783193588, -0.14173638820648193, 0.06458329409360886, 0.11478406190872192, -0.003437038976699114, -0.01379763800650835, 0.191160187125206, 0.05722592771053314, 0.034062791615724564, -0.007868196815252304, -0.04451063647866249, -0.042612988501787186, 0.07517688721418381, 0.009201535023748875, -0.03634020313620567, -0.04532402381300926, -0.0138528011739254, -0.036313582211732864, -0.010534659959375858, -0.00769300851970911, 0.01626662351191044, -0.08062930405139923, -0.02749396674335003, -0.041442908346652985, 0.03555366396903992, -0.06942546367645264, 0.0027361514512449503, -0.0033763451501727104, -0.06734228879213333, 0.07733229547739029, 0.028189485892653465, -0.004272079560905695, 0.008689092472195625, -0.023421019315719604, 0.075556181371212, -0.0457126647233963, 0.004390376154333353, -0.017281929031014442, -0.07562805712223053, 0.052006106823682785, 0.007225160952657461, -0.014959708787500858, -0.01049969345331192, 0.05045183748006821, -0.12474086135625839, 0.07950004935264587, -0.02015024609863758, -0.01620975323021412, -0.07187413424253464, 0.135001078248024, 0.02876981534063816, 0.0864894911646843, 0.08668866753578186, -0.06214912608265877, 0.06849045306444168, -0.13412275910377502, -0.03940325975418091, 0.03233896195888519, 0.007884391583502293, -0.03693601116538048, -0.06032273545861244, 0.06167631223797798, -0.03858987241983414, 0.0574476532638073, 0.0753064826130867, 0.04718674719333649, 0.03134539723396301, -0.11276227980852127, 0.006628550123423338, 0.047062210738658905, 0.06449902802705765, -0.004394914023578167, 0.01723133958876133, 0.004953497089445591, 0.05252751708030701, -0.02876747027039528, 0.0877317413687706, 0.1293603926897049, 0.22469599545001984, 0.0859764814376831, 0.10726156830787659, -0.03779946267604828, -0.11958060413599014, -0.10281749069690704, 0.10490790754556656, -0.025059456005692482, 0.027776990085840225, -0.034597933292388916, 0.1532643437385559, 0.0891711562871933, -0.16347873210906982, 0.06462220847606659, 0.004943858366459608, -0.11444298177957535, -0.0840066596865654, -0.0808551013469696, -0.026740990579128265, -0.07008516043424606, -0.0054893484339118, -0.092368483543396, 0.04045846313238144, 0.06593496352434158, 0.07125704735517502, -0.04996616020798683, 0.1521838903427124, 
0.0011243867920711637, -0.06390701979398727, 0.10685624927282333, -0.01262502372264862, 0.11356592923402786, -0.10048244148492813, -0.013283018954098225, 0.008038366213440895, -0.0187943484634161, 0.0678696259856224, 0.007407494820654392, -0.029874132946133614, 0.026528259739279747, 0.04737852141261101, -0.042671699076890945, -0.0007660193368792534, 0.030264727771282196, 0.12494583427906036, 0.10761483758687973, 0.0564577616751194, -0.041145261377096176, -0.027431102469563484, 0.20308049023151398, -0.03202056139707565, -0.08144970238208771, -0.16975674033164978, 0.155523881316185, 0.07429669797420502, 0.021220741793513298, 0.030605943873524666, -0.09444469958543777, -0.00039208182715810835, 0.23846706748008728, 0.12397800385951996, -0.06074051931500435, -0.04971260949969292, 0.03573273867368698, -0.0033866814337670803, 0.01705072820186615, 0.11617060750722885, 0.03702990710735321, 0.1921018362045288, -0.11511994898319244, 0.0516250915825367, -0.07477029412984848, -0.05400246009230614, -0.016069138422608376, 0.16552695631980896, 0.007091729436069727, -0.010540765710175037, -0.07275479286909103, 0.1136539950966835, 0.019223032519221306, -0.17718592286109924, 0.06592732667922974, -0.07399556040763855, -0.1405438333749771, -0.022297237068414688, -0.01149167213588953, -0.0021046712063252926, 0.06836781650781631, -0.014304575510323048, -0.008180718868970871, 0.12767454981803894, 0.03700434789061546, -0.052181974053382874, -0.1787826120853424, 0.07105066627264023, -0.025457438081502914, 0.18851004540920258, -0.009039635770022869, 0.08690668642520905, 0.07391288131475449, 0.02583525888621807, -0.10743504017591476, 0.07142986357212067, 0.05131974071264267, 0.0248880535364151, 0.04647371917963028, 0.10478252172470093, -0.045370426028966904, 0.08676981925964355, 0.024449393153190613, -0.11510562896728516, 0.07195655256509781, -0.15103590488433838, -0.0475255586206913, -0.16416965425014496, 0.034109245985746384, -0.045629553496837616, 0.13202250003814697, 0.21617881953716278, -0.024779893457889557, 0.006107797380536795, -0.0688842236995697, 0.047968871891498566, -0.014201232232153416, 0.15885144472122192, 0.012792796827852726, -0.20492497086524963, 0.006447248160839081, -0.07073573023080826, 0.033132705837488174, -0.1906513273715973, -0.021672876551747322, 0.007999260909855366, -0.08178902417421341, -0.03177576884627342, 0.11806967854499817, 0.04945868253707886, 0.061527151614427567, -0.0306242685765028, -0.045756954699754715, -0.013089035637676716, 0.13860741257667542, -0.11556529998779297, -0.09149665385484695 ]
null
null
transformers
# legal_t5_small_trans_cs_it_small_finetuned model

Model for translating legal text from Czech to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is first pretrained on all of the translation data with an unsupervised task. The model is then trained on three parallel corpora from JRC-ACQUIS, EUROPARL and DCEP.

## Model description

legal_t5_small_trans_cs_it_small_finetuned is initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. It is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from Czech to Italian.

### How to use

Here is how to use this model to translate legal text from Czech to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for T5 models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_it_small_finetuned"),
    # the fine-tuned checkpoint reuses the tokenizer of the base cs->it model
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_it", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

cs_text = "Členové přítomní při závěrečném hlasování"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_it_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters (as noted above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_it_small_finetuned | 46.367|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
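As a sketch of how the reported BLEU score could be checked: the card does not say which BLEU implementation or test files were used, so `sacrebleu` and the file names below are assumptions rather than the authors' actual evaluation code.

```python
# Hypothetical evaluation sketch -- sacrebleu and the test-file names are
# assumptions; the card only reports the final BLEU number (46.367).
import sacrebleu
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_it_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_it", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

# cs_test.txt / it_test.txt are placeholder names for the held-out parallel test files.
with open("cs_test.txt", encoding="utf-8") as f:
    sources = [line.strip() for line in f]
with open("it_test.txt", encoding="utf-8") as f:
    references = [line.strip() for line in f]

hypotheses = [out["translation_text"] for out in pipeline(sources, max_length=512)]
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
```

sacreBLEU standardizes reference tokenization, which keeps scores comparable across papers; whether the original evaluation used it is not stated in the card.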
{"language": "Cszech Italian", "tags": ["translation Cszech Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "\u010clenov\u00e9 p\u0159\u00edtomn\u00ed p\u0159i z\u00e1v\u011bre\u010dn\u00e9m hlasov\u00e1n\u00ed"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_it_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_it\_small\_finetuned model
=======================================================

Model for translating legal text from Czech to Italian. It was first released in this repository. This model is first pretrained on all of the translation data with an unsupervised task. The model is then trained on three parallel corpora from JRC-ACQUIS, EUROPARL and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_cs\_it\_small\_finetuned is initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. It is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Czech to Italian.

### How to use

Here is how to use this model to translate legal text from Czech to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_cs\_it\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.08006434142589569, 0.0991489440202713, -0.002857780084013939, 0.08355254679918289, 0.05779130384325981, 0.018092049285769463, 0.04409777373075485, 0.10748573392629623, -0.011659294366836548, 0.09682350605726242, 0.030571648851037025, -0.02710305154323578, 0.07490713894367218, 0.03717191889882088, 0.05454135686159134, -0.21427682042121887, 0.010467110201716423, -0.040633223950862885, -0.0031932457350194454, 0.10602155327796936, 0.09377346932888031, -0.06863628327846527, 0.04668610543012619, -0.04369181767106056, -0.045242492109537125, 0.026520051062107086, -0.09333864599466324, -0.040903862565755844, 0.08261880278587341, 0.09448075294494629, 0.08794921636581421, -0.014884816482663155, 0.07541144639253616, -0.18283139169216156, -0.0021246508695185184, 0.08533095568418503, -0.0021507213823497295, 0.03829599916934967, 0.14440365135669708, 0.001587470411323011, 0.1637459099292755, -0.03720523789525032, 0.01796666532754898, 0.03956715017557144, -0.118852898478508, -0.09457167983055115, -0.04892195388674736, 0.024529846385121346, 0.07946298271417618, 0.13700486719608307, -0.05283351242542267, 0.07493581622838974, -0.0650838166475296, 0.07162901014089584, 0.0772705078125, -0.2079995721578598, -0.04112507775425911, 0.01992706023156643, 0.03874210640788078, 0.08677748590707779, -0.04218050092458725, -0.012111938558518887, 0.047415416687726974, 0.05639505013823509, 0.005827718880027533, -0.05361061543226242, -0.05118625983595848, -0.05900022014975548, -0.13048246502876282, -0.06214044988155365, 0.16221781075000763, 0.02002948522567749, -0.04944390058517456, -0.07864853739738464, -0.0596839115023613, -0.06537559628486633, -0.009729920886456966, -0.03321539983153343, 0.018831871449947357, 0.00038841436617076397, 0.08361112326383591, -0.0018445705063641071, -0.11664154380559921, -0.0663130059838295, -0.055666711181402206, 0.15246087312698364, 0.052312176674604416, 0.010368860326707363, 0.032121509313583374, 0.08668500185012817, -0.11656107753515244, -0.0809755027294159, 0.00747480196878314, 0.02444886788725853, -0.11029230058193207, -0.006285605952143669, -0.013283200562000275, -0.1673177182674408, -0.011143806390464306, 0.03238459676504135, -0.043302226811647415, 0.04640170931816101, 0.07789977639913559, 0.04819014295935631, 0.05718744173645973, 0.10670211166143417, -0.1324281245470047, -0.12051648646593094, -0.03344341740012169, 0.003950619138777256, 0.004731842782348394, 0.012363695539534092, -0.06662365049123764, -0.040082577615976334, -0.0013288091868162155, 0.03702900558710098, 0.021384423598647118, 0.020095590502023697, -0.027197793126106262, -0.04195917397737503, 0.10916522890329361, -0.11663738638162613, 0.002685910789296031, 0.007783681154251099, -0.10358785092830658, -0.014252418652176857, 0.057844407856464386, -0.020724941045045853, -0.12564849853515625, 0.07723964005708694, -0.03939036652445793, -0.02744593285024166, -0.1170661523938179, -0.16401180624961853, -0.0026551629416644573, -0.002867360832169652, -0.06492528319358826, -0.09902690351009369, -0.1158401146531105, -0.09498702734708786, 0.035357605665922165, -0.0702512189745903, 0.010644198395311832, -0.06063913553953171, 0.005210976116359234, -0.011338145472109318, -0.006491086445748806, 0.10335824638605118, -0.04204194247722626, 0.03653578832745552, 0.024748679250478745, 0.06753894686698914, 0.01981353387236595, 0.029656900092959404, -0.12555518746376038, 0.043489448726177216, -0.14202487468719482, 0.1358831822872162, -0.02649817056953907, 0.0032827865798026323, -0.12759855389595032, -0.050811585038900375, -0.08510277420282364, 
0.056333448737859726, 0.07159477472305298, 0.1372288167476654, -0.2025383561849594, -0.0076569910161197186, 0.19607561826705933, -0.09483425319194794, -0.08087027817964554, 0.1375385969877243, -0.01621132716536522, 0.043980300426483154, 0.07956191152334213, 0.11456681787967682, 0.06897034496068954, -0.01740412972867489, -0.053153470158576965, 0.0018430536147207022, 0.02467821165919304, 0.06783780455589294, 0.09036124497652054, -0.06164764612913132, 0.05548984184861183, 0.016375498846173286, -0.009167320095002651, -0.0035240857396274805, -0.029281683266162872, -0.037856969982385635, 0.0041431887075304985, -0.05623525753617287, -0.043538857251405716, 0.03634735941886902, 0.011384785175323486, -0.05598130449652672, -0.09345000982284546, -0.07521791756153107, 0.11416200548410416, -0.061788205057382584, 0.018730483949184418, -0.01987362839281559, -0.025420820340514183, -0.09694990515708923, 0.013455347158014774, -0.1634143441915512, -0.017559096217155457, 0.04563049599528313, -0.06312539428472519, 0.10289422422647476, 0.05281208083033562, 0.053211186081171036, 0.09410010278224945, -0.059448469430208206, -0.03134657070040703, 0.005614580120891333, -0.025127174332737923, -0.1022743359208107, -0.13013480603694916, -0.05432147905230522, -0.016584431752562523, 0.006662371102720499, -0.12481997162103653, 0.0018897330155596137, -0.052643295377492905, 0.09618675708770752, -0.00391416298225522, -0.023244960233569145, 0.012065103277564049, 0.0582188256084919, -0.03347732126712799, -0.039802297949790955, 0.022454554215073586, -0.007578526623547077, -0.035310737788677216, 0.08032778650522232, -0.16176186501979828, -0.10178226977586746, 0.07719255238771439, 0.00021085185289848596, -0.12394457310438156, -0.017954489216208458, -0.01591235026717186, -0.06499288231134415, -0.05515223369002342, -0.056453317403793335, 0.22515380382537842, 0.033854104578495026, 0.12926781177520752, -0.11201845109462738, -0.014212348498404026, 0.017207277938723564, -0.022216318175196648, -0.00405094912275672, 0.16331170499324799, 0.09441620111465454, -0.14790740609169006, 0.08819829672574997, 0.008750220760703087, -0.03835399076342583, 0.1035318449139595, 0.06732138991355896, -0.1130920946598053, -0.003145643277093768, 0.05471196398139, 0.006025438196957111, 0.057245831936597824, -0.10653649270534515, -0.00041881564538925886, 0.02401488833129406, 0.04905341938138008, 0.06541374325752258, -0.08767607063055038, 0.05939503014087677, 0.057736266404390335, -0.022485066205263138, 0.04001602530479431, -0.06101973354816437, -0.0447484627366066, 0.10871659964323044, 0.02381032332777977, -0.04909748211503029, -0.04681738093495369, -0.0527593269944191, -0.10589337348937988, 0.1943122148513794, -0.053447652608156204, -0.20862114429473877, -0.11143958568572998, 0.0761733204126358, -0.03537769615650177, 0.04285748675465584, 0.0162996556609869, -0.031410131603479385, -0.06634240597486496, -0.11897175014019012, 0.07159564644098282, -0.07744064927101135, -0.04695648327469826, -0.13239525258541107, 0.035818420350551605, -0.003574804635718465, -0.10914452373981476, 0.024937670677900314, -0.0021957075223326683, -0.036859381943941116, -0.008797905407845974, -0.04996926337480545, 0.13050232827663422, 0.14726291596889496, -0.017959974706172943, -0.02333245985209942, 0.0031372802332043648, 0.1116705983877182, -0.11091286689043045, 0.06600777059793472, 0.07613130658864975, 0.03173940256237984, 0.02383514866232872, 0.14683514833450317, 0.023240722715854645, -0.06375489383935928, 0.037462782114744186, 0.055418603122234344, -0.020778054371476173, 
-0.21673278510570526, -0.11431058496236801, -0.07432543486356735, 0.02183803729712963, 0.11409605294466019, 0.0376027449965477, -0.0483829528093338, 0.032234352082014084, -0.06781505793333054, 0.030706821009516716, -0.000024797791411401704, 0.053518906235694885, 0.009000041522085667, -0.009153534658253193, 0.08253923803567886, -0.05905323103070259, -0.051251258701086044, 0.1066206619143486, 0.04493355751037598, 0.19973930716514587, -0.05651117116212845, 0.21779723465442657, 0.049503449350595474, 0.045631930232048035, 0.011895938776433468, 0.05565101280808449, -0.02775048464536667, 0.016121676191687584, -0.024799834936857224, -0.06556662172079086, -0.03196239843964577, 0.06049948185682297, 0.01594126783311367, 0.0007790116360411048, -0.0558738075196743, -0.05935844033956528, 0.05462510138750076, 0.22404249012470245, 0.051724500954151154, -0.20739497244358063, -0.04937582463026047, 0.0014620113652199507, -0.05604427680373192, -0.06483687460422516, 0.010812653228640556, 0.15788285434246063, -0.08904652297496796, 0.004318307153880596, 0.018872229382395744, 0.12095904350280762, -0.13998635113239288, -0.01687791757285595, 0.041688911616802216, 0.04995868727564812, -0.018315693363547325, 0.1385388821363449, -0.23060308396816254, 0.1544436812400818, 0.014948541298508644, 0.06426253169775009, -0.055468492209911346, 0.014307217672467232, -0.03613204509019852, 0.02463439479470253, 0.12940086424350739, 0.012893328443169594, -0.006516206543892622, -0.12736229598522186, -0.10601343959569931, -0.0032326371874660254, 0.08424632996320724, -0.06747885048389435, 0.09237226098775864, 0.043639712035655975, 0.011789984069764614, -0.016337405890226364, 0.029200395569205284, -0.04779629409313202, -0.16049538552761078, 0.021465066820383072, -0.014893639832735062, -0.035318922251462936, -0.011803700588643551, -0.032563019543886185, -0.05016311630606651, 0.20419716835021973, -0.09226848185062408, -0.06035316735506058, -0.08038398623466492, 0.006926956120878458, 0.13390038907527924, -0.07334857434034348, 0.00011285235814284533, -0.005304188467562199, 0.05592513456940651, -0.027964506298303604, -0.018295427784323692, 0.09726826101541519, -0.08960047364234924, -0.09718199819326401, -0.07538814097642899, 0.10550916939973831, 0.07653119415044785, 0.04457993060350418, -0.006852907594293356, 0.027573522180318832, -0.011837631464004517, -0.11051564663648605, -0.024954181164503098, 0.01651833951473236, 0.11088574677705765, 0.07587550580501556, -0.05882558226585388, -0.03760937228798866, -0.06889613717794418, -0.05481654778122902, 0.0751708447933197, 0.1587851494550705, -0.046606019139289856, 0.008167724125087261, 0.19027209281921387, -0.10478249937295914, -0.19614098966121674, -0.036681681871414185, 0.07624664157629013, 0.07184092700481415, -0.025055794045329094, -0.1689826101064682, 0.03938505798578262, 0.10985657572746277, -0.0003332404012326151, 0.06826519966125488, -0.3915545642375946, -0.12706412374973297, 0.05967666208744049, 0.038675516843795776, -0.037948403507471085, -0.1199062243103981, -0.048404842615127563, -0.05777875706553459, -0.016392847523093224, 0.07590492069721222, -0.03096814639866352, 0.09678241610527039, 0.0016915155574679375, -0.0053946697153151035, 0.05008426308631897, -0.04243161901831627, 0.1296115666627884, 0.04086761549115181, 0.04030197858810425, -0.06184368208050728, 0.05935255438089371, 0.01375284232199192, -0.0066218189895153046, 0.14727729558944702, -0.04106177017092705, 0.041694868355989456, -0.13052916526794434, -0.059179093688726425, -0.07023651897907257, 0.029202910140156746, 
-0.0413145087659359, -0.06648565083742142, -0.04869356378912926, 0.052082985639572144, 0.04895346239209175, 0.010700982995331287, -0.019539164379239082, -0.06789454817771912, 0.0020258151926100254, 0.1784837245941162, 0.11391258984804153, 0.02255355194211006, -0.1085105612874031, 0.02848452888429165, 0.011062558740377426, 0.07408367097377777, -0.04870879277586937, 0.01113716047257185, 0.13761447370052338, 0.016375184059143066, 0.11045439541339874, -0.007497571874409914, -0.14306065440177917, -0.020196055993437767, 0.06128852441906929, -0.09200949221849442, -0.12476188689470291, -0.008659944869577885, 0.04576850309967995, -0.07955321669578552, -0.02432452328503132, 0.09192614257335663, -0.0934806689620018, -0.0159638412296772, 0.0006734663038514555, 0.036863621324300766, -0.03416476026177406, 0.2087836116552353, 0.03555794060230255, 0.03636767715215683, -0.06373605132102966, 0.12488258630037308, 0.13043102622032166, -0.15783579647541046, 0.005122391972690821, 0.19514602422714233, -0.08476514369249344, -0.06733518093824387, 0.022634144872426987, 0.11468001455068588, -0.035995420068502426, -0.07066158205270767, -0.04585524648427963, -0.04891964793205261, 0.026287565007805824, -0.008220041170716286, 0.046212706714868546, 0.03830632567405701, -0.020863858982920647, -0.020152797922492027, -0.11172937601804733, 0.09251481294631958, 0.08856508880853653, 0.03369177132844925, -0.01790548674762249, 0.1415025293827057, 0.0317579060792923, -0.03604744002223015, -0.012548654340207577, 0.0004694294766522944, -0.06402464956045151, 0.019013432785868645, -0.05599575117230415, 0.00624917121604085, -0.044592682272195816, -0.015986165031790733, -0.028187045827507973, 0.0023975870572030544, -0.02416159212589264, -0.003409310709685087, -0.04183102026581764, -0.047885261476039886, -0.041766926646232605, 0.019878411665558815, -0.09043735265731812, -0.02703341282904148, 0.015224174596369267, -0.027654752135276794, 0.053502120077610016, 0.0029525815043598413, 0.012088140472769737, -0.011604252271354198, -0.03148793429136276, 0.07018852978944778, 0.00033329942380078137, 0.048389703035354614, -0.01376260258257389, -0.07439782470464706, 0.025765247642993927, 0.03182792291045189, -0.009520772844552994, -0.013828292489051819, 0.020023304969072342, -0.14320111274719238, 0.016371265053749084, -0.01103983074426651, -0.04022189602255821, -0.07598839700222015, 0.10256882011890411, 0.04582766070961952, 0.06367851793766022, 0.08941058069467545, -0.06419629603624344, 0.0815775990486145, -0.15999628603458405, -0.007130470126867294, 0.028437966480851173, 0.0003562027122825384, -0.026387318968772888, -0.0020883255638182163, 0.05239277705550194, -0.055869776755571365, 0.12446127831935883, 0.03959713503718376, 0.06797830015420914, 0.02016569674015045, -0.05641748011112213, 0.004913006909191608, 0.027632299810647964, 0.0763319581747055, -0.02810022234916687, -0.02287726290524006, -0.05568099766969681, 0.08988513052463531, -0.007745503913611174, 0.07858797162771225, 0.04945549741387367, 0.13309454917907715, 0.11856894195079803, 0.04306856542825699, 0.005172867327928543, -0.11372119933366776, -0.06643549352884293, 0.05417843535542488, -0.018897460773587227, 0.04499134048819542, -0.02595846727490425, 0.107582688331604, 0.11975327879190445, -0.14304251968860626, 0.11468996852636337, -0.008878365159034729, -0.08865499496459961, -0.03902300074696541, -0.15543623268604279, -0.03953348472714424, -0.016256116330623627, -0.04437832906842232, -0.11440517753362656, 0.024685470387339592, 0.07803399860858917, 0.04042934626340866, 
-0.04122334346175194, 0.14523769915103912, -0.057734955102205276, -0.10697043687105179, 0.05653749778866768, 0.018519792705774307, 0.10002928972244263, 0.01772785559296608, 0.011372183449566364, 0.043792568147182465, 0.0011246842332184315, 0.05149383470416069, 0.05168834328651428, -0.004612897522747517, -0.003067081794142723, 0.015543763525784016, -0.05282926931977272, -0.03913336619734764, -0.001801609992980957, 0.08047908544540405, 0.19933906197547913, 0.03973813354969025, -0.0615592785179615, -0.023429354652762413, 0.19442711770534515, -0.06128733232617378, -0.06557132303714752, -0.10112397372722626, 0.20708592236042023, 0.031854718923568726, 0.032350584864616394, -0.007863776758313179, -0.10150457173585892, -0.005177563522011042, 0.156823992729187, 0.17975619435310364, -0.0673760250210762, -0.031060989946126938, 0.026973484084010124, -0.003138564759865403, 0.034468308091163635, 0.04633241891860962, 0.01868644915521145, 0.2660166025161743, -0.0880904570221901, 0.10571631789207458, -0.03996577858924866, 0.017017336562275887, -0.008496020920574665, 0.1740729659795761, 0.005317043513059616, 0.02651156298816204, -0.0863456279039383, 0.08055532723665237, -0.004670252092182636, -0.16786976158618927, 0.024020235985517502, -0.09868130087852478, -0.12157104164361954, 0.009011099115014076, -0.01879681833088398, 0.050841156393289566, 0.08608843386173248, 0.0007842467166483402, 0.028657181188464165, 0.06291189044713974, 0.01696205884218216, -0.11141009628772736, -0.14910881221294403, 0.007735790219157934, -0.026941688731312752, 0.15036660432815552, -0.003827326465398073, 0.13056249916553497, 0.08315636217594147, 0.006861972156912088, -0.10520056635141373, 0.0831788033246994, 0.02636326476931572, 0.0199101734906435, 0.07925993204116821, 0.09669042378664017, -0.012994436547160149, 0.0789465680718422, 0.0418500192463398, -0.0871996060013771, 0.03395296260714531, -0.05687073618173599, -0.012322898022830486, -0.13443979620933533, 0.07913026958703995, -0.04287495091557503, 0.15337489545345306, 0.18680956959724426, -0.019927598536014557, -0.02615928277373314, -0.04761799797415733, 0.018239814788103104, -0.004186864942312241, 0.1069074347615242, -0.009702937677502632, -0.1759163737297058, 0.01881808415055275, -0.08089018613100052, 0.037818543612957, -0.24172793328762054, -0.023750577121973038, 0.017104921862483025, -0.05703515186905861, -0.010428243316709995, 0.07079735398292542, 0.04722294211387634, 0.0486353375017643, -0.052544161677360535, -0.10453660041093826, 0.0076828342862427235, 0.09989694505929947, -0.0970880389213562, -0.11689277738332748 ]
null
null
transformers
# legal_t5_small_trans_cs_sv model

Model for translating legal text from Czech to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from JRC-ACQUIS, EUROPARL and DCEP.

## Model description

legal_t5_small_trans_cs_sv is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from Czech to Swedish.

### How to use

Here is how to use this model to translate legal text from Czech to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for T5 models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_sv"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_sv", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

cs_text = "Odborná příprava je v sektoru minimální a tradiční, postrádá specifické kurzy nebo výukové plány."

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_sv | 47.9|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
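The card names AdaFactor with an inverse square root learning rate schedule but includes no training code; the snippet below is a minimal PyTorch sketch of that optimizer/schedule pairing using the Hugging Face implementation, and should be read as an assumed equivalent of the original TPU setup, not the authors' code.

```python
# Minimal sketch of AdaFactor with the built-in inverse-square-root relative
# step size (an assumed PyTorch stand-in for the original TPU training code).
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-small")

# lr=None together with relative_step=True makes Adafactor use its internal
# inverse-square-root schedule, as recommended for T5-style pre-training.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    relative_step=True,
    scale_parameter=True,
    warmup_init=True,
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule, useful for logging the lr

# In a training loop you would call optimizer.step() and then lr_scheduler.step()
# after each backward pass, exactly as with any other PyTorch optimizer.
```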
{"language": "Cszech Swedish", "tags": ["translation Cszech Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Odborn\u00e1 p\u0159\u00edprava je v sektoru minim\u00e1ln\u00ed a tradi\u010dn\u00ed, postr\u00e1d\u00e1 specifick\u00e9 kurzy nebo v\u00fdukov\u00e9 pl\u00e1ny."}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_sv model
=====================================

Model for translating legal text from Czech to Swedish. It was first released in this repository. This model is trained on three parallel corpora from JRC-ACQUIS, EUROPARL and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_cs\_sv is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from Czech to Swedish.

### How to use

Here is how to use this model to translate legal text from Czech to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_cs\_sv model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12031272053718567, 0.05628182366490364, -0.0017247953219339252, 0.0781685933470726, 0.08691238611936569, -0.005516937933862209, 0.06783731281757355, 0.08837474882602692, -0.07163278758525848, 0.0752049908041954, 0.07628144323825836, 0.030846411362290382, 0.09967293590307236, 0.10968311876058578, 0.04253578558564186, -0.24259717762470245, 0.03066851571202278, -0.030272386968135834, -0.005555768497288227, 0.13773250579833984, 0.12364029139280319, -0.0822865217924118, 0.010869507677853107, -0.027998320758342743, -0.0935499519109726, -0.014606413431465626, -0.05829659476876259, -0.06161431595683098, 0.07442058622837067, 0.0427011176943779, 0.11149588227272034, 0.02753179892897606, 0.09405819326639175, -0.15859554708003998, -0.004721236880868673, 0.05654023587703705, 0.048852551728487015, 0.03232099488377571, 0.07032117247581482, 0.01480114459991455, 0.1680302619934082, -0.024838410317897797, 0.05233907699584961, 0.009802372194826603, -0.08855557441711426, -0.13587777316570282, -0.06449062377214432, 0.03814084455370903, 0.14393585920333862, 0.14550475776195526, -0.0663815587759018, 0.06582760810852051, -0.1271516978740692, 0.07656221836805344, 0.06757496297359467, -0.27333229780197144, -0.07097704708576202, 0.06083791330456734, 0.0541582889854908, 0.09418771415948868, -0.05369047075510025, -0.02283927984535694, 0.03748901188373566, 0.034304823726415634, 0.024422897025942802, -0.027758024632930756, -0.024929122999310493, -0.01235196739435196, -0.1755576729774475, -0.081937275826931, 0.18022462725639343, -0.0026067085564136505, -0.06824885308742523, -0.100735604763031, -0.0205239187926054, -0.1425732523202896, 0.013053207658231258, -0.05438479781150818, 0.042923878878355026, -0.01571064069867134, 0.033326756209135056, -0.03640741854906082, -0.13138511776924133, -0.09975463151931763, 0.05311013013124466, 0.07020258903503418, 0.07364162057638168, -0.007421925663948059, 0.032768357545137405, 0.14422392845153809, -0.0015941786114126444, -0.09232642501592636, -0.030031437054276466, -0.001793121569789946, -0.12621815502643585, -0.03007569909095764, -0.03724361211061478, -0.18445302546024323, -0.058851081877946854, 0.08982513099908829, -0.007750615943223238, 0.052487168461084366, 0.045101068913936615, 0.047635518014431, 0.009052797220647335, 0.17680871486663818, -0.08391353487968445, -0.06993277370929718, -0.06765387952327728, 0.037903402000665665, -0.049520157277584076, 0.00814931932836771, -0.026377003639936447, -0.015045348554849625, 0.07756704837083817, 0.0657024011015892, -0.0893051028251648, 0.013492172583937645, -0.029743332415819168, -0.01445711124688387, 0.02273758128285408, -0.12378132343292236, -0.045700930058956146, -0.0067829350009560585, -0.10005523264408112, -0.04118063673377037, 0.08376018702983856, -0.0024385065771639347, -0.13696593046188354, 0.08230986446142197, -0.01587199606001377, -0.00655365688726306, -0.11501123011112213, -0.1104494109749794, -0.017579181119799614, -0.08116710931062698, -0.04133986309170723, -0.05699554458260536, -0.14093557000160217, -0.11656701564788818, 0.07886725664138794, -0.052302952855825424, -0.0523282065987587, -0.10451946407556534, -0.03194267302751541, 0.0046568363904953, -0.04950361326336861, 0.1351318359375, -0.029665395617485046, 0.07902299612760544, -0.031771134585142136, 0.04890093207359314, 0.14190836250782013, 0.06615356355905533, -0.11281104385852814, 0.006266691721975803, -0.11926258355379105, 0.17195403575897217, -0.06175320968031883, -0.003929019905626774, -0.14048582315444946, -0.08394025266170502, -0.04153427109122276, 
0.069949671626091, 0.10081298649311066, 0.1323607861995697, -0.16397176682949066, -0.012318129651248455, 0.20481422543525696, -0.1038096621632576, -0.031830139458179474, 0.10428495705127716, -0.03025878220796585, 0.1351047307252884, 0.09762795269489288, 0.19665561616420746, 0.05228548124432564, -0.08182960003614426, -0.007543646730482578, -0.010278335772454739, -0.012898695655167103, -0.017049891874194145, 0.09204529225826263, -0.05469503998756409, -0.05310489237308502, 0.005682711489498615, -0.10185924172401428, 0.015444162301719189, -0.047050293534994125, -0.06726328283548355, 0.021300218999385834, -0.05322211980819702, -0.054147008806467056, 0.06615962088108063, 0.04859938472509384, -0.06246159225702286, -0.11216461658477783, 0.006200490519404411, 0.07942980527877808, -0.07340281456708908, 0.03873012959957123, -0.03023623488843441, -0.04835032671689987, -0.07988700270652771, -0.014052491635084152, -0.1583608239889145, 0.019315030425786972, 0.03321114927530289, -0.00033663606154732406, 0.04198986291885376, 0.07421092689037323, 0.0477549210190773, 0.059882573783397675, -0.006927612703293562, -0.0488663949072361, -0.03448833152651787, -0.0435124970972538, -0.13031728565692902, -0.11772535741329193, -0.032089170068502426, -0.026592345908284187, 0.10076583921909332, -0.17980381846427917, 0.020634636282920837, -0.08592185378074646, 0.04403046518564224, -0.01690734177827835, -0.04895288869738579, 0.04852161183953285, 0.03914620727300644, 0.03295974060893059, -0.06330311298370361, 0.053089480847120285, 0.027325887233018875, 0.002673201495781541, 0.11396637558937073, -0.07446401566267014, -0.16284474730491638, 0.07758555561304092, 0.05389092490077019, -0.15031075477600098, 0.001451513497158885, -0.040103424340486526, -0.0620076023042202, -0.05643207207322121, 0.015587106347084045, 0.22539563477039337, 0.025392411276698112, 0.13538818061351776, -0.11115759611129761, -0.03715258836746216, -0.003361484734341502, -0.044625453650951385, 0.012975973077118397, 0.16234585642814636, 0.05468054860830307, -0.10535203665494919, 0.051196105778217316, -0.01569240167737007, -0.030264610424637794, 0.20096585154533386, 0.00606254069134593, -0.12788863480091095, 0.02444341406226158, 0.06942672282457352, -0.03199687600135803, 0.10968364030122757, -0.12061525136232376, 0.007651954423636198, 0.023029884323477745, 0.04488261416554451, 0.05366629362106323, -0.15016397833824158, 0.013903342187404633, 0.04907165840268135, -0.06283865123987198, 0.026322288438677788, -0.007423605304211378, -0.06363425403833389, 0.06453227996826172, 0.010394612327218056, -0.03571585193276405, -0.018929678946733475, -0.04184059053659439, -0.14441411197185516, 0.21268686652183533, -0.06411212682723999, -0.14906850457191467, -0.11748155951499939, 0.11199455708265305, 0.04703997075557709, -0.00011115876986877993, 0.05401987209916115, -0.08878879994153976, -0.05664631724357605, -0.10476379841566086, 0.1074276715517044, -0.023323601111769676, -0.05894042178988457, -0.10959754139184952, -0.0101805180311203, -0.0057647437788546085, -0.12469043582677841, 0.031212549656629562, -0.05953008309006691, -0.07097375392913818, 0.00463102338835597, -0.04872993752360344, 0.07460255175828934, 0.13685111701488495, -0.00021133861446287483, 0.02380436658859253, -0.0031373172532767057, 0.1854766458272934, -0.11985567957162857, 0.02266538515686989, 0.05851345881819725, 0.020033935084939003, 0.007170998957008123, 0.1181933805346489, -0.006816685199737549, -0.07518632709980011, 0.03336188942193985, 0.05635948106646538, -0.03603196516633034, -0.27858853340148926, 
-0.0342002734541893, -0.022218363359570503, -0.022409597411751747, 0.10200875997543335, 0.04268565773963928, -0.012811374850571156, 0.060512520372867584, -0.01886303909122944, -0.041256580501794815, 0.03635275363922119, 0.047151822596788406, -0.04420549049973488, -0.001341992407105863, 0.08179920166730881, -0.04573875665664673, 0.0032964625861495733, 0.045241571962833405, 0.0025738650001585484, 0.23762421309947968, -0.06424961239099503, 0.09818874299526215, 0.07966198027133942, 0.11316927522420883, 0.000349065667251125, 0.08049347996711731, -0.024465758353471756, 0.005701380781829357, 0.012427940033376217, -0.03563975542783737, -0.08733513951301575, 0.036872535943984985, 0.012043734081089497, 0.003376932116225362, -0.09300041198730469, -0.004659601487219334, 0.014535298570990562, 0.34062910079956055, 0.06723365187644958, -0.20540842413902283, -0.08571278303861618, 0.0058568064123392105, -0.06862778216600418, -0.08661134541034698, 0.05300210416316986, 0.10309300571680069, -0.15058927237987518, -0.0021240869536995888, -0.043729186058044434, 0.10160961002111435, -0.08618314564228058, -0.05257118493318558, 0.05054382607340813, 0.046707794070243835, -0.00807944219559431, 0.09760269522666931, -0.27204757928848267, 0.20536062121391296, -0.015992367640137672, 0.13174638152122498, -0.02570226415991783, 0.025266390293836594, -0.056022047996520996, 0.019023243337869644, 0.18259970843791962, 0.01715361326932907, -0.019913794472813606, -0.06483788043260574, -0.10332406312227249, 0.021329151466488838, 0.03189970552921295, -0.06976448744535446, 0.0877966582775116, 0.027833132073283195, 0.03990859538316727, -0.028414972126483917, -0.11228954046964645, -0.11395363509654999, -0.09320946782827377, -0.008968536742031574, -0.09405803680419922, 0.04846157506108284, -0.036673787981271744, -0.06227603182196617, -0.04249728471040726, 0.1635197103023529, -0.1221204325556755, -0.12008447200059891, -0.10433384031057358, 0.025260761380195618, 0.08740346133708954, -0.04739468917250633, -0.0072893560864031315, 0.022226963192224503, 0.019770773127675056, -0.023279713466763496, 0.027834484353661537, 0.08680624514818192, -0.06450960040092468, -0.12142543494701385, -0.012554915621876717, 0.1375429779291153, 0.13073986768722534, 0.05422656238079071, -0.020513053983449936, 0.028089342638850212, -0.008213209919631481, -0.08588455617427826, 0.006319732405245304, 0.005654452368617058, 0.05714954808354378, 0.014798382297158241, -0.055256519466638565, -0.009382413700222969, -0.0841856449842453, -0.055623628199100494, 0.1186523586511612, 0.14765843749046326, -0.05498164892196655, 0.09349072724580765, 0.17832675576210022, -0.1033540591597557, -0.19939514994621277, 0.014819951727986336, 0.11680010706186295, 0.08928490430116653, -0.056708965450525284, -0.19878140091896057, 0.03607822209596634, 0.08303193747997284, 0.004432925954461098, -0.04192057251930237, -0.3847748637199402, -0.1395338624715805, 0.09193301200866699, 0.08852217346429825, -0.03568541258573532, -0.05367427319288254, -0.01873955689370632, 0.04280367121100426, -0.029561948031187057, 0.06499820947647095, -0.0196318831294775, 0.10139772295951843, 0.03271033242344856, -0.037548404186964035, 0.05219029262661934, -0.06582903116941452, 0.11889886111021042, 0.062675841152668, 0.033332642167806625, -0.06719831377267838, 0.06888674199581146, -0.0001515199546702206, -0.022671956568956375, 0.18328320980072021, 0.01310641784220934, 0.023931678384542465, -0.19784808158874512, -0.06510194391012192, -0.09035474807024002, 0.01771431788802147, -0.070035919547081, -0.061505869030952454, 
-0.05261726304888725, 0.09702213853597641, 0.0678381621837616, -0.014563404954969883, 0.011274504475295544, -0.10118371248245239, -0.02157854475080967, 0.09221930056810379, 0.11572357267141342, 0.06318175792694092, -0.07974584400653839, 0.025370243936777115, 0.03764256089925766, 0.09286890923976898, -0.1379583179950714, -0.02633119747042656, 0.12913180887699127, -0.020818116143345833, 0.12276938557624817, -0.030019395053386688, -0.13994842767715454, 0.0017264048801735044, 0.03196503967046738, -0.1080528199672699, -0.11441768705844879, -0.01185469701886177, -0.10273095965385437, -0.0353766493499279, -0.05229061841964722, 0.08049231767654419, -0.1302727311849594, -0.007749895565211773, -0.01907307468354702, 0.04778381437063217, -0.07571981847286224, 0.22197875380516052, 0.047957152128219604, 0.06603626161813736, -0.07325004786252975, 0.13999196887016296, 0.080534927546978, -0.10664220154285431, 0.03823097422719002, 0.16923072934150696, -0.1047574058175087, -0.05289093032479286, 0.03222595155239105, 0.14510084688663483, -0.030407268553972244, -0.07457702606916428, -0.05489659681916237, -0.058302707970142365, 0.05433458834886551, -0.005194460973143578, 0.05276840180158615, 0.009124177508056164, -0.02658735401928425, -0.011142643168568611, -0.11794069409370422, 0.07784180343151093, 0.09452811628580093, -0.010982298292219639, -0.02018033154308796, 0.1858491599559784, 0.03784023970365524, 0.0538855642080307, -0.01703726314008236, -0.018487369641661644, -0.03880074992775917, 0.061188843101263046, -0.0025382195599377155, -0.027417533099651337, -0.05470497906208038, -0.0010772725800052285, -0.03494136035442352, -0.010008834302425385, -0.0006356663652695715, 0.02035701647400856, -0.0728212296962738, -0.023241156712174416, -0.05553746968507767, 0.029459919780492783, -0.07146942615509033, -0.015662502497434616, -0.014446970075368881, -0.04692798852920532, 0.06706002354621887, 0.020092692226171494, -0.007937170565128326, 0.03503362089395523, -0.007391230668872595, 0.08646102994680405, -0.02747388556599617, -0.005633004475384951, -0.008082535117864609, -0.05940116569399834, 0.013159262016415596, -0.0034974494483321905, -0.012074394151568413, -0.004846802446991205, 0.046797122806310654, -0.1282859891653061, 0.0534118190407753, -0.01647273078560829, -0.00827713217586279, -0.07526624947786331, 0.1360563188791275, 0.02132938802242279, 0.09398417919874191, 0.12837523221969604, -0.08258618414402008, 0.07291574031114578, -0.13724549114704132, -0.039816904813051224, 0.03593045845627785, 0.0013019090984016657, -0.039744533598423004, -0.05944954231381416, 0.05298605561256409, -0.0628584697842598, 0.06675181537866592, 0.08083698153495789, 0.04652637243270874, 0.03604314476251602, -0.11523027718067169, 0.016558215022087097, 0.04779853671789169, 0.06696231663227081, -0.01374843344092369, 0.018155477941036224, -0.0016859631286934018, 0.05228590965270996, -0.025100775063037872, 0.06004616618156433, 0.14761337637901306, 0.1963089406490326, 0.08607813715934753, 0.11437279731035233, -0.042553603649139404, -0.09272783994674683, -0.10671775043010712, 0.11168690770864487, -0.0026359804905951023, 0.03301097825169563, -0.015034651383757591, 0.134970560669899, 0.10067297518253326, -0.17310841381549835, 0.0775466114282608, 0.011951981112360954, -0.1157325729727745, -0.09880761057138443, -0.10260117053985596, -0.047603756189346313, -0.050910308957099915, 0.001097586820833385, -0.10183744132518768, 0.028973164036870003, 0.07366440445184708, 0.08222056180238724, -0.03040912188589573, 0.14956705272197723, -0.011229784227907658, 
-0.06888604164123535, 0.09618785232305527, -0.0141829252243042, 0.09516390413045883, -0.08966023474931717, 0.0012268981663510203, 0.027559872716665268, 0.0012540101306512952, 0.04384562000632286, 0.002518980298191309, -0.028581924736499786, 0.015120577067136765, 0.04475253447890282, -0.05169001966714859, 0.001173609052784741, 0.04996266961097717, 0.1372106820344925, 0.07688022404909134, 0.07435233145952225, -0.04776831716299057, -0.038332924246788025, 0.21337716281414032, -0.03470728546380997, -0.10247528553009033, -0.17835794389247894, 0.151495561003685, 0.05542716756463051, 0.038401566445827484, 0.03029906004667282, -0.0923149585723877, 0.00859699584543705, 0.2278139442205429, 0.13709233701229095, -0.032740335911512375, -0.0389392226934433, 0.006225842982530594, -0.010016017593443394, 0.011722957715392113, 0.11715399473905563, 0.01987682282924652, 0.17505358159542084, -0.10862331092357635, 0.043914880603551865, -0.08412227034568787, -0.06349653750658035, -0.02136516571044922, 0.15819182991981506, 0.013430900871753693, -0.015444482676684856, -0.06266114860773087, 0.12161451578140259, -0.00938509963452816, -0.18473049998283386, 0.06005506217479706, -0.04616996645927429, -0.15064014494419098, -0.02492898516356945, 0.02071823552250862, 0.010262991301715374, 0.04658642038702965, 0.002695054979994893, 0.012317229993641376, 0.10778172314167023, 0.03869892656803131, -0.04829324036836624, -0.14674250781536102, 0.06259604543447495, 0.0009513598051853478, 0.15952041745185852, 0.009880020283162594, 0.07573849707841873, 0.06661675125360489, 0.019963547587394714, -0.0944349467754364, 0.082295723259449, 0.05936597287654877, 0.013877335004508495, 0.0555243119597435, 0.1346723884344101, -0.03422962874174118, 0.09552152454853058, 0.023960113525390625, -0.13556526601314545, 0.04891595244407654, -0.11710453778505325, -0.05253703519701958, -0.1371544748544693, 0.051172833889722824, -0.05109694227576256, 0.12784934043884277, 0.22797514498233795, -0.015522651374340057, 0.0033298267517238855, -0.08339380472898483, 0.05111777037382126, -0.036541737616062164, 0.13414116203784943, 0.02249341644346714, -0.18528050184249878, -0.0011502285487949848, -0.061537694185972214, 0.017480211332440376, -0.18007947504520416, -0.02031385526061058, 0.0044886707328259945, -0.0759936049580574, -0.0153579693287611, 0.12248284369707108, 0.036379653960466385, 0.04985376447439194, -0.022415753453969955, -0.0035171445924788713, 0.007660095114260912, 0.137178435921669, -0.1257261484861374, -0.09604669362306595 ]
null
null
transformers
# legal_t5_small_trans_cs_sv_small_finetuned model

Model for translating legal text from Czech to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is first pretrained on all of the translation data with an unsupervised task. The model is then trained on three parallel corpora from JRC-ACQUIS, EUROPARL and DCEP.

## Model description

legal_t5_small_trans_cs_sv_small_finetuned is initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. It is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline T5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from Czech to Swedish.

### How to use

Here is how to use this model to translate legal text from Czech to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# AutoModelWithLMHead is deprecated in recent transformers releases;
# AutoModelForSeq2SeqLM is the modern equivalent for T5 models.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_cs_sv_small_finetuned"),
    # the fine-tuned checkpoint reuses the tokenizer of the base cs->sv model
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_cs_sv", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

cs_text = "10 Ukončení denního zasedání"

pipeline([cs_text], max_length=512)
```

## Training data

The legal_t5_small_trans_cs_sv_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60M parameters (as noted above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding) used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_cs_sv_small_finetuned | 48.159|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
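The preprocessing paragraph says a unigram vocabulary model was trained on 88M lines of the parallel corpus but does not name the tool; a plausible reconstruction with SentencePiece might look like the sketch below, where the input path and vocabulary size are assumptions.

```python
# Hypothetical reconstruction of the vocabulary-building step; SentencePiece,
# the corpus path and vocab_size are assumptions -- the card only states that
# a unigram model was trained on 88M lines of the parallel corpus.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="all_pairs_parallel_corpus.txt",  # placeholder path for the 88M-line corpus
    model_prefix="legal_t5_unigram",
    model_type="unigram",
    vocab_size=32000,  # assumed; T5-style models commonly use a 32k vocabulary
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("10 Ukončení denního zasedání", out_type=str))
```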
{"language": "Cszech Swedish", "tags": ["translation Cszech Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "10 Ukon\u010den\u00ed denn\u00edho zased\u00e1n\u00ed"}]}
text2text-generation
SEBIS/legal_t5_small_trans_cs_sv_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Cszech Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Cszech Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_cs\_sv\_small\_finetuned model ======================================================= Model for translating legal text from Czech to Swedish. It was first released in this repository. This model is first pretrained on all of the translation data with an unsupervised task. Then the model is trained on three parallel corpora from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_cs\_sv\_small\_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_cs\_sv\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from Czech to Swedish. ### How to use Here is how to use this model to translate legal text from Czech to Swedish in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_cs\_sv\_small\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
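The pretraining description above says the model had to predict randomly masked portions of a sentence. A toy illustration of that T5-style denoising objective (the 15% masking rate and the `<extra_id_N>` sentinel format are assumptions borrowed from the T5 setup, not details given in this card):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=42):
    """Toy denoising objective: drop random tokens from the input,
    replace them with sentinels, and collect them as the target."""
    rng = random.Random(seed)
    inputs, targets, sentinel = [], [], 0
    for tok in tokens:
        if rng.random() < mask_rate:
            inputs.append(f"<extra_id_{sentinel}>")
            targets.extend([f"<extra_id_{sentinel}>", tok])
            sentinel += 1
        else:
            inputs.append(tok)
    return " ".join(inputs), " ".join(targets)

src, tgt = mask_tokens("10 Ukončení denního zasedání".split())
print(src)  # sentence with some tokens replaced by <extra_id_N> sentinels
print(tgt)  # the dropped tokens, each prefixed by its sentinel
```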
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Cszech Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Cszech to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_cs\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06462854146957397, 0.06685435771942139, -0.0026233552489429712, 0.08714288473129272, 0.046055298298597336, -0.004766710568219423, 0.02767620049417019, 0.08746808022260666, -0.023780345916748047, 0.08636043965816498, 0.05098120495676994, -0.026913601905107498, 0.07749883085489273, 0.02759966067969799, 0.06321682780981064, -0.23456428945064545, 0.010882915928959846, -0.04944619536399841, -0.002929412992671132, 0.09872380644083023, 0.09667739272117615, -0.05856475979089737, 0.038741324096918106, -0.03842083364725113, -0.009079531766474247, 0.01729803904891014, -0.09367866069078445, -0.03567969426512718, 0.0795312151312828, 0.0881478562951088, 0.08335549384355545, -0.009389882907271385, 0.0746646374464035, -0.1823134571313858, -0.005114923231303692, 0.05442076921463013, -0.0021621915511786938, 0.03289676457643509, 0.13064180314540863, 0.02311413735151291, 0.1775340586900711, -0.05001389980316162, 0.02314707264304161, 0.0317535437643528, -0.0971289873123169, -0.1181725412607193, -0.0480593740940094, 0.022046344354748726, 0.08166363835334778, 0.13322967290878296, -0.06191391125321388, 0.06103004887700081, -0.06267447769641876, 0.08662444353103638, 0.0729781910777092, -0.21810556948184967, -0.03809307515621185, 0.060950685292482376, 0.05806982144713402, 0.09182780236005783, -0.04983735829591751, -0.0031038071028888226, 0.053758393973112106, 0.07481315732002258, 0.037940941751003265, -0.067030169069767, -0.08182980865240097, -0.055518507957458496, -0.13614799082279205, -0.04408080503344536, 0.16728778183460236, 0.021039975807070732, -0.043979618698358536, -0.08716440200805664, -0.048410046845674515, -0.06176752224564552, -0.0001885719393612817, -0.03635428100824356, 0.027214735746383667, -0.012883524410426617, 0.07354803383350372, -0.03128843009471893, -0.1263313591480255, -0.04782753437757492, -0.05114905163645744, 0.1150583028793335, 0.030323095619678497, 0.013928852044045925, 0.046979330480098724, 0.0674683079123497, -0.13123248517513275, -0.08697313070297241, -0.008134031668305397, 0.01016215793788433, -0.11766047030687332, -0.008025424554944038, -0.016904639080166817, -0.20672813057899475, -0.012029267847537994, 0.029043681919574738, -0.049514953047037125, 0.0488409698009491, 0.08324088901281357, 0.0471031628549099, 0.061435919255018234, 0.12155579775571823, -0.12388616055250168, -0.12961390614509583, -0.0377984344959259, -0.015544303692877293, 0.01701800711452961, 0.005104188807308674, -0.0649636760354042, -0.0398700088262558, 0.02089952491223812, 0.018912216648459435, 0.0020509110763669014, 0.027826407924294472, -0.011658966541290283, -0.02798500843346119, 0.09603077173233032, -0.11117430776357651, -0.013028097338974476, -0.004899631254374981, -0.09090009331703186, -0.03502245619893074, 0.0764511451125145, -0.006837666500359774, -0.12112435698509216, 0.09411865472793579, -0.02198697254061699, -0.01690870150923729, -0.09358544647693634, -0.17818604409694672, 0.0019873459823429585, -0.007970507256686687, -0.05614287033677101, -0.09533043205738068, -0.11132565140724182, -0.10668007284402847, 0.04497053846716881, -0.059345588088035583, -0.007252220064401627, -0.07377547025680542, -0.01511832419782877, -0.00014329238911159337, -0.025198090821504593, 0.11635226011276245, -0.04487760365009308, 0.020001525059342384, -0.007444292306900024, 0.0734560489654541, 0.020131269469857216, 0.023154769092798233, -0.13376696407794952, 0.03781725838780403, -0.16400869190692902, 0.14272721111774445, -0.060497183352708817, 0.0057968925684690475, -0.1185314953327179, -0.05612794682383537, 
-0.06523095816373825, 0.06613482534885406, 0.057764653116464615, 0.11941380053758621, -0.20463527739048004, -0.008474153466522694, 0.18721279501914978, -0.10079611837863922, -0.07264284044504166, 0.122789666056633, -0.008809765800833702, 0.03329567238688469, 0.08961271494626999, 0.13687798380851746, 0.06839492917060852, -0.013694243505597115, -0.05263916403055191, 0.03023868054151535, 0.016254199668765068, 0.061535678803920746, 0.09218297153711319, -0.0742054134607315, 0.04596804827451706, 0.02837638556957245, 0.006046830210834742, -0.012255232781171799, -0.007561323698610067, -0.03368747606873512, 0.01214857492595911, -0.0437653511762619, -0.05646063759922981, 0.03935464099049568, 0.0039780111983418465, -0.06757675856351852, -0.07341044396162033, -0.03417138755321503, 0.08903025835752487, -0.05663558095693588, 0.02840609848499298, 0.014166769571602345, -0.05655506253242493, -0.1132860779762268, 0.01023014821112156, -0.1598629355430603, -0.03143399581313133, 0.022289201617240906, -0.05720988288521767, 0.10651493817567825, 0.09113588184118271, 0.06114023178815842, 0.09359215945005417, -0.060732390731573105, -0.024235734716057777, 0.012806066311895847, -0.023476876318454742, -0.10444316267967224, -0.1300632506608963, -0.041682545095682144, -0.026076210662722588, -0.006355908699333668, -0.11520789563655853, -0.014006339013576508, -0.05858948826789856, 0.08415000885725021, 0.0050675575621426105, -0.022457702085375786, 0.044499050825834274, 0.06631666421890259, -0.02142115868628025, -0.03472709655761719, 0.027157090604305267, -0.011745686642825603, -0.06654736399650574, 0.11164238303899765, -0.1391860842704773, -0.12122009694576263, 0.07313324511051178, 0.008528003469109535, -0.1261792778968811, -0.013845556415617466, -0.013658713549375534, -0.06353797763586044, -0.05965321511030197, -0.050286054611206055, 0.22697101533412933, 0.041896115988492966, 0.12675917148590088, -0.11243342608213425, -0.02214799076318741, 0.011219608597457409, -0.04574280604720116, 0.0030310885049402714, 0.18719908595085144, 0.07157309353351593, -0.16097360849380493, 0.0857972577214241, -0.024417322129011154, -0.037924282252788544, 0.1461792141199112, 0.06710268557071686, -0.10272413492202759, 0.0015928697539493442, 0.03820915147662163, -0.0021719385404139757, 0.07616861909627914, -0.07921461015939713, 0.0021703005768358707, 0.032792385667562485, 0.04866839572787285, 0.05788162350654602, -0.07642728090286255, 0.05268207937479019, 0.055716246366500854, -0.03513664752244949, 0.05198182165622711, -0.045086007565259933, -0.04385519400238991, 0.08680304139852524, 0.016133738681674004, -0.04175737127661705, -0.04841264709830284, -0.05417970195412636, -0.11693030595779419, 0.20197099447250366, -0.062092650681734085, -0.218292236328125, -0.12472033500671387, 0.0889194905757904, -0.05073772370815277, 0.040654901415109634, 0.03532763943076134, -0.03972059115767479, -0.0818149670958519, -0.1353447586297989, 0.09269770234823227, -0.05495045334100723, -0.053956057876348495, -0.14262044429779053, 0.02559010498225689, 0.010413440875709057, -0.11407097429037094, 0.023445047438144684, -0.01559449639171362, -0.014032258652150631, 0.003225402906537056, -0.028545118868350983, 0.11618784815073013, 0.11979459971189499, -0.006732090376317501, -0.03300253301858902, 0.005007868632674217, 0.12903724610805511, -0.09042154997587204, 0.07303200662136078, 0.04961131140589714, 0.01888657920062542, 0.028399404138326645, 0.16246449947357178, 0.024356579408049583, -0.050727393478155136, 0.03593003377318382, 0.06234125792980194, -0.033995307981967926, 
-0.22106647491455078, -0.11290039867162704, -0.0683603510260582, 0.034234315156936646, 0.11262078583240509, 0.04068155586719513, -0.07159753888845444, 0.03884965926408768, -0.059027232229709625, 0.02200022153556347, -0.00006010041033732705, 0.05030417814850807, 0.004274530336260796, -0.014645656570792198, 0.08672022074460983, -0.05614413693547249, -0.036882515996694565, 0.09898136556148529, 0.02339768037199974, 0.18072256445884705, -0.06335673481225967, 0.18441978096961975, 0.05247769504785538, 0.031176971271634102, 0.00942260306328535, 0.05861020088195801, -0.038691598922014236, 0.012875396758317947, -0.014903317205607891, -0.06791427731513977, -0.046831704676151276, 0.06200466677546501, 0.01497121062129736, 0.008643877692520618, -0.040206778794527054, -0.048797816038131714, 0.04912577196955681, 0.21364007890224457, 0.06200169771909714, -0.17414027452468872, -0.07380132377147675, 0.013359659351408482, -0.06934066861867905, -0.058589231222867966, 0.009436465799808502, 0.165814608335495, -0.10371425747871399, 0.03449703007936478, 0.006079624406993389, 0.11968807131052017, -0.11826079338788986, -0.012664197944104671, 0.0351417176425457, 0.04514175280928612, -0.014839484356343746, 0.13404659926891327, -0.22563976049423218, 0.15008775889873505, 0.011647474952042103, 0.05696156248450279, -0.04992006719112396, 0.014204232022166252, -0.04174121841788292, 0.01593356765806675, 0.13791315257549286, 0.030607223510742188, -0.046911709010601044, -0.11290772259235382, -0.096966452896595, 0.004137516487389803, 0.0570613369345665, -0.07136569172143936, 0.09549141675233841, 0.054926976561546326, 0.022957367822527885, -0.03355944901704788, 0.009480857290327549, -0.04565783962607384, -0.14489802718162537, 0.00785916205495596, -0.03601326420903206, -0.02805742435157299, -0.008929836563766003, -0.031105415895581245, -0.08877698332071304, 0.20437553524971008, -0.10544625669717789, -0.09807134419679642, -0.08160044252872467, 0.00374776404350996, 0.13076868653297424, -0.07511062920093536, -0.005568657536059618, 0.003978084307163954, 0.054352086037397385, -0.04478279501199722, -0.015894150361418724, 0.08214283734560013, -0.08241330087184906, -0.09916793555021286, -0.04539070278406143, 0.11510942876338959, 0.08109075576066971, 0.04449579492211342, -0.008967353031039238, 0.04551132395863533, -0.013330372981727123, -0.12463131546974182, -0.023750662803649902, 0.02949349395930767, 0.11636039614677429, 0.0599568635225296, -0.034532543271780014, -0.032136302441358566, -0.05541807785630226, -0.05797133967280388, 0.09146682173013687, 0.1622641682624817, -0.045038316398859024, 0.05060865730047226, 0.20156528055667877, -0.09674198925495148, -0.21629948914051056, -0.05937567353248596, 0.09029847383499146, 0.08224944770336151, -0.00009508070797892287, -0.15221090614795685, 0.06055179610848427, 0.09520222991704941, 0.0016516140894964337, 0.03845910727977753, -0.3524623215198517, -0.1440853476524353, 0.06558697670698166, 0.03346603736281395, -0.03753088414669037, -0.09704453498125076, -0.057465340942144394, -0.05847332626581192, -0.01345104444772005, 0.0806916132569313, -0.04320898652076721, 0.10658053308725357, 0.004049206152558327, 0.02339934930205345, 0.05490315705537796, -0.04619060829281807, 0.13616929948329926, 0.03206479549407959, 0.03087545372545719, -0.08410322666168213, 0.09076864272356033, 0.009045198559761047, -0.018251866102218628, 0.1771697700023651, -0.0622355192899704, 0.04316967353224754, -0.13558737933635712, -0.05956206098198891, -0.07669147849082947, 0.045411452651023865, -0.04050566628575325, 
-0.07835093140602112, -0.06108585745096207, 0.05553199350833893, 0.0627145990729332, -0.007370125036686659, 0.020723264664411545, -0.08759135007858276, 0.002310040406882763, 0.17861750721931458, 0.1310282051563263, 0.020056458190083504, -0.10857928544282913, 0.03675594553351402, 0.002319987863302231, 0.07884266972541809, -0.060826171189546585, 0.006266914773732424, 0.13950690627098083, 0.015002219006419182, 0.10390286892652512, -0.030658889561891556, -0.1380372941493988, -0.016498643904924393, 0.04956468567252159, -0.10720116645097733, -0.11674078553915024, -0.006896851118654013, 0.0015019811689853668, -0.07177385687828064, -0.039704978466033936, 0.10614501684904099, -0.10733379423618317, -0.005106430966407061, 0.004796306602656841, 0.04088619351387024, -0.03462611883878708, 0.20145368576049805, 0.04322607442736626, 0.05033395439386368, -0.07027725130319595, 0.12410175800323486, 0.11225397884845734, -0.14370328187942505, 0.02550329454243183, 0.18936428427696228, -0.09236650913953781, -0.06630600988864899, 0.02869728021323681, 0.1351492702960968, -0.04051036015152931, -0.06425631791353226, -0.025854088366031647, -0.06413140892982483, 0.021485568955540657, -0.004235086962580681, 0.05104069411754608, 0.027966413646936417, -0.009228918701410294, -0.02437243051826954, -0.08925827592611313, 0.10270900279283524, 0.07016181200742722, 0.029345961287617683, -0.02360088750720024, 0.13620871305465698, 0.015354093164205551, -0.018848400563001633, -0.020866280421614647, 0.022357221692800522, -0.06036532297730446, 0.006049259100109339, -0.06653817743062973, 0.014582309871912003, -0.052851445972919464, -0.0044968221336603165, -0.02721503935754299, 0.0024980211164802313, -0.016755564138293266, -0.000570422678720206, -0.03503908962011337, -0.04253855720162392, -0.05274387449026108, 0.012124965898692608, -0.09096760302782059, -0.04411739483475685, 0.005772524047642946, -0.009513435885310173, 0.0438932441174984, -0.005532664246857166, 0.007071595638990402, 0.012028786353766918, -0.017489539459347725, 0.08046337217092514, 0.01652076095342636, 0.03955796733498573, -0.006834528408944607, -0.056858424097299576, -0.009031536057591438, 0.02445272170007229, -0.005225206259638071, -0.008897009305655956, 0.018385514616966248, -0.14701592922210693, -0.005084334872663021, -0.007187253329902887, -0.03499909117817879, -0.07928203046321869, 0.10182900726795197, 0.041175633668899536, 0.0688871294260025, 0.1263018399477005, -0.08231445401906967, 0.08416900038719177, -0.16250257194042206, -0.006649990566074848, 0.03249167278409004, -0.004055684432387352, -0.030137155205011368, -0.0017814520979300141, 0.04473470151424408, -0.07682718336582184, 0.13387618958950043, 0.04477948322892189, 0.06659410893917084, 0.02376149781048298, -0.05862559750676155, 0.014008511789143085, 0.027566496282815933, 0.07762283086776733, -0.037119586020708084, -0.019391672685742378, -0.06341617554426193, 0.08889185637235641, -0.005288993939757347, 0.05230676382780075, 0.06615815311670303, 0.10623978078365326, 0.11794795095920563, 0.05065815895795822, 0.0006866904441267252, -0.09047943353652954, -0.06909606605768204, 0.05907713621854782, -0.0006986945518292487, 0.0505380779504776, -0.0061541711911559105, 0.08805572986602783, 0.1287829875946045, -0.149290069937706, 0.12488702684640884, -0.0025868399534374475, -0.08814239501953125, -0.05222892761230469, -0.17459054291248322, -0.0583571195602417, 0.0005297246971167624, -0.03835029900074005, -0.12271978706121445, 0.01338195614516735, 0.08421362191438675, 0.051082756370306015, -0.023249268531799316, 
0.1427113562822342, -0.0692172423005104, -0.11262239515781403, 0.0485236719250679, 0.017462264746427536, 0.08530037105083466, 0.028847260400652885, 0.022889891639351845, 0.059179794043302536, 0.020014317706227303, 0.028699737042188644, 0.04643050208687782, -0.0038834926672279835, -0.01574794575572014, 0.012410599738359451, -0.062275294214487076, -0.036559101194143295, 0.014150721952319145, 0.09326861053705215, 0.1725122481584549, 0.05519363284111023, -0.06878460198640823, -0.03266331925988197, 0.19963988661766052, -0.0635092481970787, -0.08489687740802765, -0.10708724707365036, 0.20387810468673706, 0.01418338529765606, 0.0474151112139225, -0.007657238282263279, -0.09884214401245117, 0.003992358222603798, 0.14590197801589966, 0.1909697949886322, -0.04418237879872322, -0.022255152463912964, 0.00045871170004829764, -0.009940486401319504, 0.02949739806354046, 0.04807164892554283, 0.0010465418454259634, 0.25373324751853943, -0.0806821659207344, 0.10015010088682175, -0.048401955515146255, 0.006363791413605213, -0.013271719217300415, 0.16439276933670044, 0.010064506903290749, 0.02186398021876812, -0.07890939712524414, 0.08836839348077774, -0.03257603943347931, -0.17137287557125092, 0.018948853015899658, -0.07305008172988892, -0.13187763094902039, 0.007703264243900776, 0.006916130892932415, 0.061829544603824615, 0.06550639867782593, 0.013971489854156971, 0.04654610529541969, 0.044272053986787796, 0.017505519092082977, -0.10919616371393204, -0.11982997506856918, 0.0012007380137220025, -0.0053075868636369705, 0.12277106195688248, 0.012448525987565517, 0.12135625630617142, 0.0772671177983284, 0.001541921286843717, -0.09265798330307007, 0.09227917343378067, 0.03250819072127342, 0.010012486949563026, 0.08838856220245361, 0.12221650034189224, -0.0036597049329429865, 0.08693858981132507, 0.04245389252901077, -0.10407215356826782, 0.012344851158559322, -0.025638466700911522, -0.017128055915236473, -0.11020709574222565, 0.09599262475967407, -0.045821208506822586, 0.149923175573349, 0.1969207525253296, -0.010525868274271488, -0.030244963243603706, -0.06010760739445686, 0.021590018644928932, -0.022961607202887535, 0.08249219506978989, -0.0025333312805742025, -0.15743261575698853, 0.011517255567014217, -0.07182960957288742, 0.02386287972331047, -0.23456573486328125, -0.021976323798298836, 0.013758671469986439, -0.052168767899274826, 0.0042104413732886314, 0.0731780007481575, 0.034714099019765854, 0.038336288183927536, -0.04432746022939682, -0.06285998970270157, 0.025533121079206467, 0.09737137705087662, -0.10665839165449142, -0.12162406742572784 ]
null
null
transformers
# legal_t5_small_trans_de_cs model Model for translating legal text from German to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis, europarl and dcep. ## Model description legal_t5_small_trans_de_cs is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from German to Czech. ### How to use Here is how to use this model to translate legal text from German to Czech in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_cs"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_de_cs", do_lower_case=False, skip_special_tokens=True), device=0 ) de_text = "17. empfiehlt die Einführung einer spezifischen Strategie zur Unterstützung neuer und demokratisch gewählter Parlamente im Hinblick auf eine dauerhafte Verankerung von Demokratie, Rechtsstaatlichkeit und guter Staatsführung;" pipeline([de_text], max_length=512) ``` ## Training data The legal_t5_small_trans_de_cs model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results : | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_de_cs | 44.07| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
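The BLEU figure in the table could in principle be checked with a corpus-level BLEU tool. A hedged sketch using the sacrebleu library (the file names are placeholders; the card does not state which BLEU implementation or which test split produced the 44.07 score):

```python
import sacrebleu

# Hypothetical files: one model translation / one reference per line.
with open("predictions.cs", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("references.cs", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# corpus_bleu takes the hypotheses plus a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")
```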
{"language": "Deustch Cszech", "tags": ["translation Deustch Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "17. empfiehlt die Einf\u00fchrung einer spezifischen Strategie zur Unterst\u00fctzung neuer und demokratisch gew\u00e4hlter Parlamente im Hinblick auf eine dauerhafte Verankerung von Demokratie, Rechtsstaatlichkeit und guter Staatsf\u00fchrung;"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_cs model ===================================== Model for translating legal text from German to Czech. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_de\_cs is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from German to Czech. ### How to use Here is how to use this model to translate legal text from German to Czech in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_de\_cs model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 61, 166, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.10036938637495041, 0.03841870650649071, -0.0035349498502910137, 0.08008217811584473, 0.08029849082231522, 0.04121009632945061, 0.057553086429834366, 0.09471429884433746, -0.05307912826538086, 0.0806499496102333, 0.05774074047803879, 0.02184371091425419, 0.10077448934316635, 0.04574374109506607, 0.024447603151202202, -0.20577192306518555, 0.01377074420452118, -0.01584082841873169, -0.03460339829325676, 0.13128945231437683, 0.11310165375471115, -0.07626688480377197, 0.04911639168858528, -0.011434497311711311, -0.09906813502311707, -0.016994472593069077, -0.04842812567949295, -0.05880242958664894, 0.06695055961608887, 0.05895347148180008, 0.14340539276599884, 0.012251835316419601, 0.10881941020488739, -0.15907905995845795, -0.0014342787908390164, 0.08074918389320374, 0.03592550382018089, 0.044917572289705276, 0.10073154419660568, 0.027435529977083206, 0.14872533082962036, -0.04037758708000183, 0.06187775731086731, 0.03423021733760834, -0.11565499007701874, -0.11702848970890045, -0.05761627480387688, 0.005939696915447712, 0.14468318223953247, 0.14019300043582916, -0.04276474192738533, 0.08189980685710907, -0.11648167669773102, 0.0663742870092392, 0.03577128425240517, -0.25574007630348206, -0.06774724274873734, 0.061876289546489716, 0.06779959052801132, 0.09278202056884766, -0.03364261984825134, -0.04030026122927666, 0.04095935449004173, 0.024229921400547028, 0.030124971643090248, -0.05144690349698067, 0.046720851212739944, -0.012219464406371117, -0.18136191368103027, -0.091495580971241, 0.1334700882434845, -0.008673428557813168, -0.05224505439400673, -0.08186738938093185, -0.05528858304023743, -0.09939367324113846, -0.009967442601919174, -0.07309570908546448, 0.04201451689004898, 0.012509421445429325, 0.08323050290346146, -0.03263024985790253, -0.11261361092329025, -0.11111793667078018, 0.016088375821709633, 0.07780606299638748, 0.09015541523694992, 0.004701112862676382, -0.03705068677663803, 0.13913241028785706, 0.005263992119580507, -0.09556394815444946, -0.021378112956881523, -0.025267401710152626, -0.1251697689294815, -0.017362911254167557, -0.005430716089904308, -0.17124643921852112, -0.05131252855062485, 0.03971589729189873, -0.014871993102133274, 0.07723182439804077, 0.016091691330075264, 0.027962705120444298, 0.0353664793074131, 0.20352938771247864, -0.0895768254995346, -0.0774804875254631, -0.04806492105126381, 0.022543607279658318, -0.05839215964078903, 0.009700826369225979, -0.03911469131708145, -0.030872412025928497, 0.029375679790973663, 0.07595404237508774, -0.03564360737800598, 0.027353990823030472, -0.030678771436214447, -0.05278345197439194, 0.08627329766750336, -0.11035007238388062, -0.009751048870384693, 0.03446036949753761, -0.09056270122528076, 0.012784378603100777, 0.07819315046072006, -0.03513563051819801, -0.13730287551879883, 0.03946413844823837, -0.03188107907772064, -0.015129346400499344, -0.14872191846370697, -0.12670031189918518, -0.03426449000835419, -0.023390430957078934, -0.03073960170149803, -0.07498414814472198, -0.13343791663646698, -0.10694556683301926, 0.05410342663526535, -0.05936483293771744, -0.018847281113266945, -0.0874805748462677, -0.016656475141644478, 0.010695048607885838, -0.020666910335421562, 0.07799077033996582, -0.039756543934345245, 0.08688917011022568, 0.012639162130653858, 0.026328125968575478, 0.13142278790473938, 0.07197249680757523, -0.09066256880760193, 0.012695010751485825, -0.08763648569583893, 0.19104540348052979, -0.050289254635572433, -0.021115368232131004, -0.14030198752880096, -0.10149787366390228, -0.04094284400343895, 
0.0605270080268383, 0.08095091581344604, 0.13342736661434174, -0.17482055723667145, 0.01215366180986166, 0.2251153588294983, -0.07834016531705856, -0.02950616367161274, 0.11126840859651566, -0.06061111018061638, 0.11984484642744064, 0.08950717747211456, 0.15665970742702484, 0.07028746604919434, -0.11244579404592514, 0.007303175050765276, -0.046801045536994934, -0.01863258332014084, 0.02718728594481945, 0.08018971234560013, -0.04841609299182892, -0.05536293610930443, -0.006880226079374552, -0.14091584086418152, 0.038135141134262085, -0.03745654597878456, -0.05086993798613548, 0.012499000877141953, -0.019824393093585968, -0.014872299507260323, 0.042619239538908005, 0.02803037315607071, -0.030926520004868507, -0.10484734922647476, 0.000816859130281955, 0.08899948000907898, -0.06341978907585144, 0.01945086382329464, -0.0311790332198143, -0.04379572346806526, -0.048165373504161835, 0.003744486952200532, -0.13050490617752075, 0.02146575041115284, 0.0539243184030056, -0.011129176244139671, 0.0460902564227581, 0.0358741320669651, 0.03927500545978546, 0.03606954962015152, -0.001591083942912519, -0.04824358597397804, -0.039685118943452835, -0.03347168117761612, -0.11369839310646057, -0.08303473889827728, -0.021513815969228745, -0.021879892796278, -0.00009815119847189635, -0.1961842179298401, 0.030760101974010468, -0.10431890189647675, 0.04230000823736191, -0.00875451136380434, -0.025789953768253326, 0.03901917487382889, 0.03692473843693733, 0.026756929233670235, -0.07586236298084259, 0.0329788438975811, 0.019827838987112045, 0.023298414424061775, 0.12522491812705994, -0.12083849310874939, -0.15250122547149658, 0.08166851103305817, 0.01575983129441738, -0.12904256582260132, 0.014603578485548496, -0.013317780569195747, -0.06131300702691078, -0.04301547259092331, -0.009850122034549713, 0.2568720877170563, 0.01516333781182766, 0.1361645758152008, -0.12148721516132355, -0.0268037598580122, -0.01826648786664009, -0.013465496711432934, 0.01884865202009678, 0.1343507021665573, 0.06590911746025085, -0.13264517486095428, 0.045890796929597855, 0.03211771696805954, -0.01011982373893261, 0.16216573119163513, 0.0005208257352933288, -0.14574693143367767, -0.006324500776827335, 0.07268042117357254, -0.021284181624650955, 0.08940617740154266, -0.11158984899520874, 0.005035967566072941, 0.02947782725095749, 0.06936580687761307, 0.06126183271408081, -0.12212395668029785, 0.04564615711569786, 0.05515746772289276, -0.0723668709397316, -0.014845945872366428, -0.018510665744543076, -0.020829487591981888, 0.05529342219233513, 0.0043319216929376125, 0.0024056192487478256, -0.01654043234884739, -0.03879208862781525, -0.12568598985671997, 0.20540516078472137, -0.07437945902347565, -0.13759870827198029, -0.11784443259239197, 0.07622338831424713, 0.05870326980948448, -0.0016832920955494046, 0.0351836122572422, -0.056707948446273804, -0.06635814905166626, -0.06030602380633354, 0.13574476540088654, -0.07386138290166855, -0.0659918561577797, -0.1231156513094902, -0.019732359796762466, -0.014310719445347786, -0.15821290016174316, 0.03811061009764671, -0.02621089480817318, -0.0745326355099678, 0.015522993169724941, -0.0426255464553833, 0.07923296093940735, 0.13792815804481506, -0.005146566312760115, -0.002544228918850422, -0.010882820934057236, 0.16821430623531342, -0.139377161860466, 0.05214531719684601, 0.0625452920794487, 0.011154194362461567, 0.014347207732498646, 0.10267357528209686, -0.007104991003870964, -0.0849977508187294, 0.02414001151919365, 0.04088035598397255, -0.020316362380981445, -0.31641483306884766, 
-0.03720106929540634, -0.0429452620446682, -0.030924873426556587, 0.09736590832471848, 0.02697906643152237, 0.007993169128894806, 0.03992058336734772, -0.009480289183557034, 0.01170431263744831, 0.034251146018505096, 0.042921558022499084, 0.041566841304302216, 0.01701701432466507, 0.0694553554058075, -0.03311854600906372, -0.0152220968157053, 0.06041695177555084, 0.008611414581537247, 0.25007346272468567, -0.0541585311293602, 0.14402751624584198, 0.0666474848985672, 0.1247805804014206, 0.018757887184619904, 0.07457107305526733, 0.0020446025300771, 0.0322352834045887, -0.017835112288594246, -0.026696808636188507, -0.04843702167272568, 0.032899290323257446, 0.03408132866024971, -0.022181855514645576, -0.04420309513807297, -0.00455025490373373, 0.01794600486755371, 0.30401620268821716, 0.026131609454751015, -0.19041961431503296, -0.05992182344198227, -0.0060511124320328236, -0.04948772117495537, -0.11665906012058258, 0.08591023832559586, 0.08764398097991943, -0.13622617721557617, -0.027736123651266098, -0.03662111237645149, 0.11569724231958389, -0.11268564313650131, -0.04416302219033241, 0.039427537471055984, 0.066743865609169, 0.0021510757505893707, 0.07940935343503952, -0.3010712265968323, 0.17614784836769104, 0.006688473280519247, 0.10578963160514832, -0.01390522625297308, 0.04651406407356262, -0.030794797465205193, 0.019560901448130608, 0.137964129447937, 0.011581444181501865, -0.03686121106147766, -0.07553700357675552, -0.12195328623056412, 0.016920942813158035, 0.03094147890806198, -0.0283447727560997, 0.08248459547758102, 0.018677838146686554, 0.038496438413858414, -0.002973836148157716, -0.1173415556550026, -0.12388091534376144, -0.10001903772354126, -0.01560933981090784, -0.11503306031227112, 0.03773975372314453, -0.0197363942861557, -0.04135345667600632, -0.029667465016245842, 0.1291210651397705, -0.12122700363397598, -0.1027093157172203, -0.10336107760667801, -0.006366285029798746, 0.12873293459415436, -0.06586679071187973, 0.0031391826923936605, 0.030383644625544548, -0.022222012281417847, 0.019136616960167885, -0.006686042062938213, 0.12028924375772476, -0.0677480697631836, -0.10014907270669937, -0.018224326893687248, 0.09511186182498932, 0.11085563898086548, 0.05150599777698517, -0.02033613622188568, 0.022993238642811775, -0.010536234825849533, -0.10523834824562073, -0.0006331508047878742, 0.013150736689567566, 0.04552309960126877, 0.031176626682281494, -0.04882482439279556, -0.050090935081243515, -0.09375391900539398, -0.026284294202923775, 0.07341181486845016, 0.12597426772117615, -0.052662186324596405, 0.0515909381210804, 0.15734215080738068, -0.12626706063747406, -0.17933930456638336, 0.0013882757630199194, 0.09497155249118805, 0.0821131095290184, -0.024979759007692337, -0.20554548501968384, -0.0067016915418207645, 0.08066734671592712, 0.010824247263371944, 0.0485946387052536, -0.3886130750179291, -0.12732025980949402, 0.11093495041131973, 0.06712939590215683, -0.04102722555398941, -0.08599969744682312, -0.03493447229266167, 0.06864933669567108, -0.06814757734537125, 0.05278846248984337, -0.02237939089536667, 0.08389806747436523, 0.012035222724080086, -0.013059488497674465, 0.04758515581488609, -0.05096400901675224, 0.10756342113018036, 0.024184515699744225, 0.044663283973932266, -0.05616061016917229, 0.03755800053477287, 0.004565355833619833, -0.01397364679723978, 0.1570119708776474, 0.024143917486071587, 0.05610547214746475, -0.1882186233997345, -0.06232254207134247, -0.0712893083691597, -0.01264582946896553, -0.07030968368053436, -0.04755137488245964, 
-0.06612268090248108, 0.06510592252016068, 0.06428152322769165, -0.02400282956659794, -0.006215368397533894, -0.05878996104001999, -0.06847312301397324, 0.11666218936443329, 0.0901975929737091, 0.08448003232479095, -0.08946076035499573, -0.008451745845377445, 0.04541514813899994, 0.10782039165496826, -0.15619125962257385, 0.005629172548651695, 0.11627873033285141, -0.019053921103477478, 0.11825235188007355, -0.020030857995152473, -0.1331496238708496, 0.01928691379725933, 0.004199462942779064, -0.0570870079100132, -0.15501530468463898, -0.005096353590488434, -0.12562891840934753, -0.03507821634411812, -0.0653296709060669, 0.09948869049549103, -0.09423364698886871, -0.029398003593087196, -0.0014168040361255407, 0.04414822906255722, -0.04538100212812424, 0.2051895260810852, 0.033639609813690186, 0.041097771376371384, -0.06372419744729996, 0.10654087364673615, 0.11490073800086975, -0.09232354164123535, 0.033229123800992966, 0.15773512423038483, -0.09527784585952759, -0.04942750930786133, 0.0425017811357975, 0.143708273768425, -0.02013644203543663, -0.05522585287690163, -0.04129050299525261, -0.040845178067684174, 0.03985297679901123, -0.009279453195631504, 0.02705950103700161, 0.008842794224619865, -0.04130679368972778, 0.00468312157317996, -0.10006546974182129, 0.06815257668495178, 0.08989762514829636, 0.011805991642177105, -0.026779206469655037, 0.1284293234348297, 0.03962717950344086, 0.026856033131480217, -0.022390328347682953, -0.011555624194443226, -0.06504400819540024, 0.046332407742738724, -0.06052495911717415, 0.013890771195292473, -0.048410218209028244, 0.0007314730901271105, -0.029835812747478485, -0.022124281153082848, -0.012335655279457569, 0.030413012951612473, -0.057401977479457855, -0.03810088336467743, -0.03778219223022461, 0.05808240547776222, -0.0640292540192604, -0.02142392285168171, 0.011899922974407673, -0.04982782155275345, 0.06801019608974457, 0.03496302291750908, -0.010119647718966007, 0.02859671786427498, -0.000843455025460571, 0.06352756172418594, -0.024549445137381554, 0.024979515001177788, -0.004503079690039158, -0.08794742077589035, 0.026564886793494225, 0.007373530883342028, -0.018344612792134285, -0.0088624507188797, 0.04934810847043991, -0.114788718521595, 0.05746695399284363, -0.04830574244260788, -0.007024724967777729, -0.05087979510426521, 0.11095379292964935, 0.022910885512828827, 0.08264461159706116, 0.10954184085130692, -0.0838918685913086, 0.08507449179887772, -0.14935828745365143, -0.03930514678359032, 0.031170014292001724, 0.02773384004831314, -0.0014865562552586198, -0.06612864136695862, 0.06435541808605194, -0.05659976974129677, 0.09984183311462402, 0.05241391807794571, 0.06114362180233002, 0.02432757429778576, -0.08230596780776978, -0.006830251310020685, 0.03320262208580971, 0.06246630474925041, -0.024607496336102486, 0.00952428113669157, -0.004780091345310211, 0.06805308163166046, -0.03633379563689232, 0.06017325818538666, 0.12669189274311066, 0.22029420733451843, 0.08220929652452469, 0.08966615796089172, -0.047551967203617096, -0.0963272824883461, -0.09531751275062561, 0.08437793701887131, -0.020009605213999748, 0.01476388517767191, -0.0013697489630430937, 0.1457294374704361, 0.09909091144800186, -0.1490417867898941, 0.09660214930772781, -0.010787115432322025, -0.1177748292684555, -0.07841848582029343, -0.05428486317396164, -0.03705183044075966, -0.10545991361141205, -0.011311115697026253, -0.09491622447967529, 0.03497455269098282, 0.04300198704004288, 0.07356519252061844, -0.05017251521348953, 0.14379464089870453, 0.037637874484062195, 
-0.09953896701335907, 0.08067141473293304, 0.004586868453770876, 0.0948374792933464, -0.05616702511906624, -0.0014714866410940886, 0.020537756383419037, 0.006858803331851959, 0.06354734301567078, 0.000491261132992804, -0.02798362821340561, 0.025610191747546196, -0.0011549489572644234, -0.029660429805517197, -0.0009214053279720247, 0.07637990266084671, 0.08504179865121841, 0.15319345891475677, 0.06873687356710434, -0.04570883885025978, -0.024571342393755913, 0.17065346240997314, -0.03259691968560219, -0.10249283909797668, -0.17787590622901917, 0.0984356701374054, 0.03915273770689964, 0.02129855751991272, 0.03313359245657921, -0.08306614309549332, -0.014295591972768307, 0.23280245065689087, 0.13151098787784576, -0.049391135573387146, -0.04532672464847565, 0.026580139994621277, -0.006888777017593384, 0.01595565862953663, 0.10877814143896103, 0.03066592663526535, 0.18343313038349152, -0.06836944073438644, 0.0041521163657307625, -0.07756772637367249, -0.04480483755469322, -0.018366200849413872, 0.14461420476436615, -0.002419795375317335, -0.03434271737933159, -0.04149391129612923, 0.11520832031965256, 0.0008460068493150175, -0.18284183740615845, 0.027477046474814415, -0.053675565868616104, -0.11073853820562363, -0.02060023508965969, -0.016928810626268387, 0.011989022605121136, 0.029434874653816223, 0.007628357037901878, 0.003899421775713563, 0.166958287358284, 0.024054205045104027, -0.06666073948144913, -0.12919844686985016, 0.05334446206688881, -0.04677887633442879, 0.173017680644989, 0.026726579293608665, 0.08220537006855011, 0.07174192368984222, 0.03662139177322388, -0.09742957353591919, 0.06464636325836182, 0.05358158424496651, 0.026585279032588005, 0.041069965809583664, 0.0757601335644722, -0.03111695498228073, 0.033172015100717545, 0.03319651633501053, -0.09135705232620239, 0.04460753872990608, -0.13247574865818024, -0.04654184728860855, -0.15896543860435486, 0.04093129187822342, -0.04698365554213524, 0.11063119024038315, 0.21925710141658783, 0.0008492757915519178, 0.01701425015926361, -0.0539894662797451, 0.054993174970149994, -0.04412810876965523, 0.16129106283187866, -0.01520305685698986, -0.1498326063156128, 0.0026607501786202192, -0.010936046950519085, 0.04469556733965874, -0.1572180688381195, -0.011158080771565437, 0.026598989963531494, -0.08025795221328735, -0.014930248260498047, 0.12367565929889679, 0.037711821496486664, 0.05108574777841568, -0.012789640575647354, -0.0676952451467514, -0.012369244359433651, 0.12980274856090546, -0.12217366695404053, -0.07357630133628845 ]
null
null
transformers
# legal_t5_small_trans_de_cs_small_finetuned model Model for translating legal text from German to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is first pretrained on all of the translation data with an unsupervised task. Then the model is trained on three parallel corpora from jrc-acquis, europarl and dcep. ## Model description legal_t5_small_trans_de_cs_small_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_de_cs_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from German to Czech. ### How to use Here is how to use this model to translate legal text from German to Czech in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_cs_small_finetuned"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_de_cs", do_lower_case=False, skip_special_tokens=True), device=0 ) de_text = "Der Rahmenbeschluss sieht ein beschleunigtes Verfahren für die Anerkennung und Vollstreckung von freiheitsentziehenden Maßnahmen oder Maßnahmen der Sicherung (bei Unzurechnungsfähigkeit oder verminderter Schuldfähigkeit), die von einem Gericht eines anderen Mitgliedstaats gegen eine Person verhängt wurden, durch einen Mitgliedstaat vor, dessen Staatsangehörigkeit die Person besitzt, in dem sie ihren rechtmäßigen Aufenthalt hat oder zu dem sie enge Verbindungen hat." pipeline([de_text], max_length=512) ``` ## Training data The legal_t5_small_trans_de_cs_small_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results : | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_de_cs_small_finetuned | 43.750| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
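The preprocessing step above describes a unigram vocabulary model trained on 88M lines of the parallel corpus. A minimal sketch of how such a vocabulary could be built with the SentencePiece library (the corpus file name and the 32,000-entry vocabulary size are assumptions; the card does not state them):

```python
import sentencepiece as spm

# Hypothetical corpus: one sentence per line, all language pairs mixed.
spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",
    model_prefix="legal_t5_vocab",
    model_type="unigram",  # unigram segmentation, as described above
    vocab_size=32000,      # assumed value, not given in the card
)

# The trained model can then tokenize new text:
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Der Rahmenbeschluss sieht ein beschleunigtes Verfahren vor.", out_type=str))
```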
{"language": "Deustch Cszech", "tags": ["translation Deustch Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Der Rahmenbeschluss sieht ein beschleunigtes Verfahren f\u00fcr die Anerkennung und Vollstreckung von freiheitsentziehenden Ma\u00dfnahmen oder Ma\u00dfnahmen der Sicherung (bei Unzurechnungsf\u00e4higkeit oder verminderter Schuldf\u00e4higkeit), die von einem Gericht eines anderen Mitgliedstaats gegen eine Person verh\u00e4ngt wurden, durch einen Mitgliedstaat vor, dessen Staatsangeh\u00f6rigkeit die Person besitzt, in dem sie ihren rechtm\u00e4\u00dfigen Aufenthalt hat oder zu dem sie enge Verbindungen hat."}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_cs_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_cs\_small\_finetuned model ======================================================= Model for translating legal text from German to Czech. It was first released in this repository. This model is first pretrained on all of the translation data with an unsupervised task. Then the model is trained on three parallel corpora from jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_de\_cs\_small\_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_de\_cs\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from German to Czech. ### How to use Here is how to use this model to translate legal text from German to Czech in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_de\_cs\_small\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results : ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
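The "How to use" snippet in this card drives the model through a `TranslationPipeline`. An equivalent lower-level sketch that calls `generate` directly (the beam-search settings are illustrative choices, not values prescribed by the card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/legal_t5_small_trans_de_cs_small_finetuned"
# The card pairs this checkpoint with the base model's tokenizer.
tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_de_cs")
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

de_text = "Der Rahmenbeschluss sieht ein beschleunigtes Verfahren vor."
inputs = tokenizer(de_text, return_tensors="pt")
# num_beams=4 is an illustrative decoding choice, not taken from the card.
outputs = model.generate(**inputs, max_length=512, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```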
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 61, 212, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06938215345144272, 0.08309108763933182, -0.00357769220136106, 0.09248826652765274, 0.05449874699115753, 0.02344897948205471, 0.043610457330942154, 0.09789291024208069, -0.05634845048189163, 0.09116927534341812, 0.041623763740062714, -0.025517866015434265, 0.0905989557504654, 0.03144977241754532, 0.06298941373825073, -0.23130212724208832, 0.004475150723010302, -0.038045190274715424, -0.011816059239208698, 0.08428259938955307, 0.10073316097259521, -0.06593073904514313, 0.04567483067512512, -0.05128903314471245, -0.06009247899055481, 0.013620292767882347, -0.08708901703357697, -0.031210914254188538, 0.08104200661182404, 0.0685257688164711, 0.09118133038282394, -0.003410243196412921, 0.08662775158882141, -0.17522211372852325, -0.0010203449055552483, 0.07221914827823639, 0.006805863231420517, 0.0524316132068634, 0.1145826205611229, 0.02191023714840412, 0.14358824491500854, -0.08178162574768066, 0.03298209607601166, 0.039668384939432144, -0.12015443295240402, -0.128241166472435, -0.06307729333639145, 0.036167263984680176, 0.09062901139259338, 0.13754242658615112, -0.04374437779188156, 0.041501399129629135, -0.039893265813589096, 0.06119555979967117, 0.05408438667654991, -0.23503796756267548, -0.03520273044705391, 0.05480620265007019, 0.05266857519745827, 0.08929834514856339, -0.05005551129579544, -0.033813271671533585, 0.044544972479343414, 0.07287745177745819, 0.028879929333925247, -0.060865115374326706, -0.03398659825325012, -0.04073623567819595, -0.12883974611759186, -0.04747896268963814, 0.14336007833480835, 0.021316416561603546, -0.05094393715262413, -0.09525047987699509, -0.07852364331483841, -0.08119011670351028, -0.017032261937856674, -0.042713697999715805, 0.041705574840307236, 0.006589419208467007, 0.08213541656732559, -0.02235669642686844, -0.11503235995769501, -0.06530473381280899, -0.06125863268971443, 0.06976847350597382, 0.0608946718275547, 0.0017865188419818878, 0.022651050239801407, 0.09603175520896912, -0.07664980739355087, -0.09350533038377762, 0.006875734310597181, 0.011478097178041935, -0.11540346592664719, -0.004617398604750633, -0.004449296742677689, -0.20592309534549713, -0.023870699107646942, 0.04483823850750923, -0.03306207433342934, 0.06496777385473251, 0.06528478115797043, 0.035589464008808136, 0.04269754886627197, 0.15408359467983246, -0.11524949967861176, -0.10871962457895279, -0.025328654795885086, 0.005190032534301281, 0.01358239445835352, 0.009188597090542316, -0.06166883930563927, -0.031497422605752945, 0.0002838662767317146, 0.035527147352695465, -0.0015059239231050014, 0.028252914547920227, -0.025935210287570953, -0.028027940541505814, 0.12481237947940826, -0.10544014722108841, 0.011931190267205238, 0.006256377324461937, -0.08555402606725693, 0.006552194710820913, 0.06505938619375229, -0.029881995171308517, -0.12700502574443817, 0.08163601160049438, -0.029167592525482178, -0.027675015851855278, -0.11885276436805725, -0.15713684260845184, -0.00982776191085577, 0.0329003743827343, -0.05083543434739113, -0.09847024828195572, -0.1234046071767807, -0.09530576318502426, 0.0356101393699646, -0.06460916996002197, -0.005229875911027193, -0.07006815820932388, -0.017300747334957123, -0.00802708137780428, -0.00822325050830841, 0.10352839529514313, -0.040760528296232224, 0.03318654000759125, 0.012887025251984596, 0.06179804354906082, 0.015668749809265137, 0.030596666038036346, -0.10959941893815994, 0.0423123836517334, -0.12251202017068863, 0.12020700424909592, -0.0217448640614748, -0.0008857832872308791, -0.10692331939935684, -0.05470317229628563, 
-0.07769469171762466, 0.048634111881256104, 0.06658948212862015, 0.11479969322681427, -0.22280104458332062, 0.01278556790202856, 0.20140579342842102, -0.07436753064393997, -0.06721083074808121, 0.13003844022750854, -0.014052695594727993, 0.028876282274723053, 0.06957421451807022, 0.12616123259067535, 0.06954753398895264, -0.02552817203104496, -0.030364040285348892, 0.0025333112571388483, 0.004185905680060387, 0.07307975739240646, 0.08077274262905121, -0.06753133237361908, 0.06123388931155205, 0.008531865663826466, 0.0396132655441761, -0.001450410927645862, -0.020905381068587303, -0.026874782517552376, 0.007898335345089436, -0.038856346160173416, -0.05068967863917351, 0.024014515802264214, 0.012306112796068192, -0.06219872832298279, -0.0790092721581459, -0.08770948648452759, 0.10480613261461258, -0.05449768155813217, 0.03106168657541275, -0.003308034734800458, -0.052546240389347076, -0.06725549697875977, 0.017581963911652565, -0.1514606922864914, -0.01812618598341942, 0.06507077813148499, -0.0867719054222107, 0.09096108376979828, 0.053991612046957016, 0.04312054440379143, 0.09581591933965683, -0.0523885041475296, -0.04051535949110985, -0.018132010474801064, -0.023629631847143173, -0.11133713275194168, -0.10421942919492722, -0.04342779517173767, -0.005270213820040226, 0.007239753846079111, -0.1208299994468689, 0.011407201178371906, -0.07168404757976532, 0.08847901225090027, 0.0036206089425832033, -0.018064433708786964, 0.023655973374843597, 0.0634152963757515, -0.016883330419659615, -0.05402977764606476, 0.017385685816407204, 0.007019283715635538, -0.003258961020037532, 0.08591115474700928, -0.14372634887695312, -0.13720647990703583, 0.07789598405361176, 0.013522527180612087, -0.12246786803007126, 0.039290301501750946, -0.01143055222928524, -0.07331036031246185, -0.03786180540919304, -0.06310126185417175, 0.2375873625278473, 0.02586439996957779, 0.14318539202213287, -0.08950957655906677, -0.027456339448690414, -0.0025002520997077227, -0.02237795479595661, 0.006487579550594091, 0.15171000361442566, 0.07684372365474701, -0.1272161453962326, 0.08504185080528259, 0.00491619436070323, -0.011698933318257332, 0.09203004837036133, 0.03983728960156441, -0.10832197964191437, -0.00813773088157177, 0.034824565052986145, -0.0013608051231130958, 0.08533592522144318, -0.07844960689544678, -0.0038142898119986057, 0.025450116023421288, 0.06741520762443542, 0.07133632898330688, -0.09111012518405914, 0.07170572131872177, 0.06946047395467758, -0.031377170234918594, 0.03836292773485184, -0.04769430309534073, -0.03189878165721893, 0.0927264615893364, 0.027124114334583282, -0.006324208341538906, -0.04074538126587868, -0.05490408092737198, -0.10645227879285812, 0.18399538099765778, -0.07850154489278793, -0.17656807601451874, -0.13404116034507751, 0.05879539996385574, -0.011601874604821205, 0.0348527729511261, 0.026585981249809265, -0.022406868636608124, -0.06982725113630295, -0.12006776034832001, 0.09787526726722717, -0.07230616360902786, -0.0491085946559906, -0.09863466769456863, 0.016207948327064514, -0.004075768403708935, -0.118545763194561, 0.02436850778758526, -0.0011800964130088687, -0.0012801667908206582, -0.004052919335663319, -0.028002221137285233, 0.12265574932098389, 0.13603714108467102, -0.013717967085540295, -0.047619134187698364, 0.0032099459785968065, 0.11435481160879135, -0.08793427050113678, 0.07000192254781723, 0.0784049928188324, 0.02921900898218155, 0.03131415322422981, 0.12455296516418457, 0.015386254526674747, -0.056482065469026566, 0.04026202857494354, 0.04013998061418533, 
-0.006600451190024614, -0.23299020528793335, -0.0927569791674614, -0.06716041266918182, 0.04759962856769562, 0.08822482824325562, 0.04398885369300842, -0.06750301271677017, 0.019164908677339554, -0.04332217201590538, 0.053629543632268906, -0.0006978781893849373, 0.0614490769803524, 0.01800406537950039, -0.011443297378718853, 0.05650187283754349, -0.0622856430709362, -0.03952751308679581, 0.09739387035369873, 0.021536661311984062, 0.1606276035308838, -0.042743369936943054, 0.22894005477428436, 0.03291445970535278, 0.055726565420627594, 0.010968229733407497, 0.06974242627620697, -0.02500639297068119, 0.025403376668691635, -0.026219310238957405, -0.04511984437704086, -0.0167600829154253, 0.04153519496321678, 0.014625182375311852, 0.02861642837524414, -0.06765355914831161, -0.046675633639097214, 0.05753738433122635, 0.2310815304517746, 0.059434909373521805, -0.18904586136341095, -0.06836136430501938, 0.00450961384922266, -0.06977765262126923, -0.08470705151557922, 0.02174648828804493, 0.14709186553955078, -0.09822510182857513, -0.012133779935538769, 0.012266415171325207, 0.11096329241991043, -0.1230231299996376, -0.014163203537464142, 0.048154428601264954, 0.06369443237781525, -0.012085670605301857, 0.11780198663473129, -0.2631761133670807, 0.10276112705469131, 0.009706451557576656, 0.06208381801843643, -0.034057728946208954, 0.028415754437446594, -0.05725453421473503, 0.00919048860669136, 0.11058478057384491, 0.010783117264509201, -0.045994359999895096, -0.09231047332286835, -0.10744480043649673, 0.008663520216941833, 0.08620316535234451, -0.0580483004450798, 0.09740903228521347, 0.042670246213674545, 0.013345363549888134, -0.008137295953929424, -0.0008344989619217813, -0.04406581073999405, -0.15923738479614258, 0.017553215846419334, -0.04448669031262398, -0.025985300540924072, -0.020410284399986267, -0.010167688131332397, -0.06784476339817047, 0.2016567587852478, -0.13531513512134552, -0.0747380405664444, -0.07882237434387207, -0.009687410667538643, 0.12245609611272812, -0.07559114694595337, 0.005477217026054859, 0.0027297288179397583, 0.042930372059345245, -0.02148343250155449, -0.03575296327471733, 0.09010082483291626, -0.06746614724397659, -0.10819154232740402, -0.07564704120159149, 0.1285448968410492, 0.06106976792216301, 0.04680372029542923, -0.031808704137802124, 0.026348326355218887, -0.02600759267807007, -0.1068829819560051, -0.013418112881481647, 0.01602165773510933, 0.13938556611537933, 0.04594816267490387, -0.06243717297911644, -0.03075304441154003, -0.06340914964675903, -0.054237574338912964, 0.09065259248018265, 0.16761042177677155, -0.04753541573882103, 0.015110958367586136, 0.170842707157135, -0.09593795239925385, -0.20251084864139557, -0.04314452037215233, 0.04851020500063896, 0.08701615780591965, -0.03806968405842781, -0.17415443062782288, 0.023902807384729385, 0.09485258907079697, 0.01146841049194336, 0.05830153077840805, -0.38227784633636475, -0.13594657182693481, 0.05640047416090965, 0.034740496426820755, -0.05058561637997627, -0.11442798376083374, -0.04244377091526985, -0.04618839547038078, -0.07585518062114716, 0.07026300579309464, -0.037297409027814865, 0.09428171068429947, -0.002364575397223234, 0.004031550604850054, 0.043514154851436615, -0.04492807015776634, 0.1378140151500702, 0.031808625906705856, 0.059308454394340515, -0.06191505119204521, 0.042907267808914185, 0.01645360328257084, -0.011774163693189621, 0.15740415453910828, -0.028869442641735077, 0.06049522012472153, -0.12411057204008102, -0.04671544209122658, -0.06063386797904968, 0.0038607947062700987, 
-0.04663480073213577, -0.07073816657066345, -0.065163753926754, 0.04358066990971565, 0.061034854501485825, -0.012683087028563023, -0.0021800801623612642, -0.04134906083345413, -0.03784516081213951, 0.14126376807689667, 0.09295015037059784, 0.031316425651311874, -0.07470334321260452, 0.022959664463996887, 0.0011729539837688208, 0.07233759760856628, -0.07331421971321106, 0.007800355553627014, 0.14141620695590973, 0.0013044606894254684, 0.1095660924911499, -0.005028683692216873, -0.14785638451576233, -0.01533401757478714, 0.043926309794187546, -0.10445108264684677, -0.14310799539089203, -0.003571359673514962, -0.016475871205329895, -0.06207617372274399, -0.03629158064723015, 0.08192567527294159, -0.08708767592906952, -0.020511828362941742, 0.011458808556199074, 0.03643018379807472, -0.043642133474349976, 0.19540980458259583, 0.023541642352938652, 0.04095481336116791, -0.05418524146080017, 0.13939690589904785, 0.13705044984817505, -0.13782291114330292, -0.0036899198312312365, 0.2204444408416748, -0.08555811643600464, -0.05764477699995041, 0.03193140774965286, 0.13919682800769806, 0.01032985094934702, -0.05251970887184143, -0.04108670726418495, -0.06836774945259094, 0.02015063539147377, -0.023516390472650528, 0.03268564119935036, 0.04508693516254425, -0.014282924123108387, -0.0018301944946870208, -0.12439461797475815, 0.08771763741970062, 0.08115963637828827, 0.052499786019325256, -0.025067651644349098, 0.14923280477523804, 0.00411510793492198, 0.004835331812500954, -0.012365376576781273, 0.03312433511018753, -0.06322253495454788, 0.0036138915456831455, -0.07223378121852875, -0.015068401582539082, -0.033588211983442307, -0.02888178452849388, -0.02273978665471077, 0.011079310439527035, 0.001821820973418653, 0.008894610218703747, -0.02413838729262352, -0.06482606381177902, -0.052376821637153625, 0.021502776071429253, -0.08175788074731827, -0.035145048052072525, 0.01374665554612875, -0.03847121447324753, 0.06549131125211716, 0.018539831042289734, 0.014391344040632248, 0.007563382387161255, -0.027087358757853508, 0.0631384551525116, -0.002123394748196006, 0.055489178746938705, -0.00007632080814801157, -0.06727428734302521, 0.001709958421997726, 0.0014782892540097237, -0.028408417478203773, -0.017178311944007874, 0.025640446692705154, -0.13787797093391418, 0.01250576414167881, -0.0424196794629097, -0.02022516168653965, -0.0722912922501564, 0.07433705031871796, 0.024255886673927307, 0.07144977152347565, 0.09977802634239197, -0.07448014616966248, 0.0843743160367012, -0.15624092519283295, -0.012867587618529797, 0.03126101568341255, 0.00410276185721159, 0.006583862006664276, -0.02072795480489731, 0.060315825045108795, -0.06643444299697876, 0.11792720854282379, 0.023187296465039253, 0.028398782014846802, 0.02184373140335083, -0.0820435956120491, 0.0030520258005708456, 0.04591163992881775, 0.07998592406511307, -0.03515031933784485, -0.03291509300470352, -0.054015353322029114, 0.0781141147017479, 0.0019373768009245396, 0.07870307564735413, 0.07258855551481247, 0.1520005315542221, 0.1092212125658989, 0.05105932801961899, -0.005126830190420151, -0.10573925077915192, -0.07578334957361221, 0.06737138330936432, -0.018877215683460236, 0.027047179639339447, -0.028560103848576546, 0.11313226073980331, 0.09437916427850723, -0.14491872489452362, 0.10402000695466995, -0.015253045596182346, -0.09227535873651505, -0.02853655256330967, -0.1071818545460701, -0.042099278420209885, -0.008920992724597454, -0.050664421170949936, -0.10386813431978226, 0.011394505389034748, 0.08104453235864639, 0.0304397139698267, 
-0.0427730530500412, 0.153298482298851, -0.04339075833559036, -0.10047262161970139, 0.05447358265519142, 0.026510851457715034, 0.07520586252212524, 0.025751328095793724, 0.010777070187032223, 0.037118203938007355, 0.012064408510923386, 0.04992198571562767, 0.0424773246049881, 0.0003115588042419404, 0.004222271963953972, 0.006484361831098795, -0.05073736235499382, -0.03209180384874344, 0.018875136971473694, 0.06004534661769867, 0.1725906878709793, 0.05892108753323555, -0.05718742683529854, -0.017444351688027382, 0.18981260061264038, -0.061736200004816055, -0.08905478566884995, -0.10522539913654327, 0.19241422414779663, 0.03460242599248886, 0.025904322043061256, 0.013130133040249348, -0.10807187855243683, -0.018307993188500404, 0.15280425548553467, 0.1713709980249405, -0.0807596817612648, -0.03192221000790596, 0.03814055398106575, -0.001775536686182022, 0.010190332308411598, 0.06556948274374008, 0.06018174812197685, 0.21480631828308105, -0.06523451209068298, 0.08105055242776871, -0.04072197899222374, 0.0036018535029143095, -0.03094306029379368, 0.1983664184808731, -0.0066332439891994, 0.022282369434833527, -0.06445043534040451, 0.060166239738464355, 0.001147473114542663, -0.18616905808448792, 0.012286568991839886, -0.09316777437925339, -0.11812970042228699, 0.022319165989756584, -0.017101461067795753, 0.02525552548468113, 0.07270774245262146, 0.008219792507588863, 0.043435513973236084, 0.10501354932785034, 0.011225130409002304, -0.07980720698833466, -0.07363872230052948, -0.002584202913567424, -0.07487086206674576, 0.1471061408519745, 0.016449453309178352, 0.10989420861005783, 0.08201386779546738, 0.02143253944814205, -0.09770891815423965, 0.10095374286174774, 0.026944583281874657, 0.027389347553253174, 0.07338394224643707, 0.10478720813989639, -0.00765366991981864, 0.0475500151515007, 0.03785383328795433, -0.08438730984926224, 0.025185266509652138, -0.05284571647644043, -0.037017498165369034, -0.12945342063903809, 0.06242924928665161, -0.0371909961104393, 0.14704622328281403, 0.19537511467933655, -0.02069994807243347, -0.004071609582751989, -0.03711356595158577, 0.0025221279356628656, -0.009629619307816029, 0.1300649791955948, -0.011567272245883942, -0.14926263689994812, 0.024803990498185158, -0.07967573404312134, 0.041831281036138535, -0.22052381932735443, -0.029086999595165253, 0.01084069162607193, -0.05532464757561684, -0.016796983778476715, 0.08870229870080948, 0.02773287147283554, 0.03591449186205864, -0.05062685161828995, -0.06456262618303299, -0.002594437450170517, 0.11063039302825928, -0.11052729189395905, -0.12065751105546951 ]
null
null
transformers
# legal_t5_small_trans_de_en model

Model for translating legal text from German to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.


## Model description

legal_t5_small_trans_de_en is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from German to English.

### How to use

Here is how to use this model to translate legal text from German to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_en"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_en", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "Eisenbahnunternehmen müssen Fahrkarten über mindestens einen der folgenden Vertriebswege anbieten: an Fahrkartenschaltern oder Fahrkartenautomaten, per Telefon, Internet oder jede andere in weitem Umfang verfügbare Informationstechnik oder in den Zügen."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_en model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_trans_de_en | 49.1|


### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
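The pipeline call above is the card's own recipe. For finer control over decoding, here is a sketch using the seq2seq API directly; `AutoModelForSeq2SeqLM` is the current replacement for the deprecated `AutoModelWithLMHead`, and the beam-search setting is an illustrative assumption rather than a value from the card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "SEBIS/legal_t5_small_trans_de_en"
tokenizer = AutoTokenizer.from_pretrained(model_id, do_lower_case=False)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)  # T5 is an encoder-decoder (seq2seq) model

de_text = (
    "Eisenbahnunternehmen müssen Fahrkarten über mindestens einen der folgenden "
    "Vertriebswege anbieten: an Fahrkartenschaltern oder Fahrkartenautomaten, per Telefon, "
    "Internet oder jede andere in weitem Umfang verfügbare Informationstechnik oder in den Zügen."
)
inputs = tokenizer(de_text, return_tensors="pt", truncation=True, max_length=512)

# num_beams=4 is an illustrative choice; the card itself only fixes max_length=512
outputs = model.generate(**inputs, max_length=512, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```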
{"language": "Deustch English", "tags": ["translation Deustch English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "(2) Die Richtlinie 80/987/EWG des Rates(4) soll den Arbeitnehmern im Fall der Zahlungsunf\u00e4higkeit ihres Arbeitgebers einen Mindestschutz gew\u00e4hren. Deshalb verpflichtet sie die Mitgliedstaaten zur Schaffung einer Einrichtung, die die Befriedigung der nicht erfuellten Arbeitnehmeranspr\u00fcche garantiert."}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_en
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_en model
=====================================


Model for translating legal text from German to English. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.


Model description
-----------------


legal\_t5\_small\_trans\_de\_en is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.


Intended uses & limitations
---------------------------


The model can be used for translation of legal texts from German to English.


### How to use


Here is how to use this model to translate legal text from German to English in PyTorch:


Training data
-------------


The legal\_t5\_small\_trans\_de\_en model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training (a setup sketch is given below).


### Preprocessing


A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.


### Pretraining


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results :


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
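The training procedure above names AdaFactor with an inverse-square-root learning-rate schedule. Below is a minimal sketch of how that optimizer is set up with the Hugging Face implementation; the base checkpoint `t5-small` stands in for the authors' actual training setup, which the card does not spell out:

```python
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# With relative_step=True and lr=None, Adafactor derives an inverse-square-root
# style learning-rate schedule internally, matching the schedule named in the card.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the internal rate, e.g. for logging
```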
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12128479033708572, 0.07445929199457169, -0.002558467211201787, 0.07193563878536224, 0.09489201009273529, 0.019439758732914925, 0.07520447671413422, 0.09494846314191818, -0.08002821356058121, 0.06719966232776642, 0.06600908935070038, 0.038878872990608215, 0.08966393023729324, 0.11295094341039658, 0.04740649461746216, -0.2275572419166565, 0.02348187007009983, -0.01181849092245102, -0.004882992710918188, 0.1343267410993576, 0.12031075358390808, -0.09040981531143188, 0.0223903339356184, -0.026471421122550964, -0.1100781261920929, -0.0033290095161646605, -0.06455589830875397, -0.0695195198059082, 0.0776236355304718, 0.04216277226805687, 0.11440170556306839, 0.0178641639649868, 0.08219008892774582, -0.17522327601909637, -0.000786326767411083, 0.08664556592702866, 0.054124340415000916, 0.041137199848890305, 0.06686872243881226, -0.009880829602479935, 0.14681176841259003, -0.02085038274526596, 0.06446752697229385, 0.022357670590281487, -0.12122728675603867, -0.1378481388092041, -0.06314381211996078, 0.03698915243148804, 0.1520577222108841, 0.14972199499607086, -0.0605308972299099, 0.06207845360040665, -0.1150200366973877, 0.061623021960258484, 0.06124740466475487, -0.2667628824710846, -0.06266584247350693, 0.025192102417349815, 0.04329131171107292, 0.07869483530521393, -0.04989563301205635, -0.048494428396224976, 0.038418032228946686, 0.0333440825343132, 0.009446697309613228, -0.021351786330342293, 0.012351550161838531, -0.008057782426476479, -0.167277529835701, -0.09577412158250809, 0.1583678424358368, 0.0007519883802160621, -0.0765499621629715, -0.09536971896886826, -0.035667575895786285, -0.15449462831020355, 0.009680533781647682, -0.05892563983798027, 0.04317891225218773, 0.00012811656051781029, 0.03153441101312637, -0.02003183774650097, -0.11741437017917633, -0.1130608543753624, 0.026166189461946487, 0.08327069878578186, 0.09778669476509094, -0.01269031036645174, -0.00440213643014431, 0.16588585078716278, 0.019401652738451958, -0.09249015152454376, -0.020920105278491974, -0.0012388542527332902, -0.11268001049757004, -0.02006184495985508, -0.03208862245082855, -0.14096730947494507, -0.06417594850063324, 0.10047157853841782, -0.025714170187711716, 0.06068645045161247, 0.035005275160074234, 0.03847876563668251, 0.017053795978426933, 0.16712555289268494, -0.08033672720193863, -0.061545707285404205, -0.05749141052365303, 0.06153532862663269, -0.058719322085380554, 0.02243582159280777, -0.015397584065794945, -0.0026218954008072615, 0.06582838296890259, 0.07557142525911331, -0.08215506374835968, 0.006488711107522249, -0.04667690396308899, -0.021459953859448433, 0.051982324570417404, -0.11946094781160355, -0.038754262030124664, 0.001417549909092486, -0.10580745339393616, -0.03088671714067459, 0.08140096813440323, -0.009016632102429867, -0.13076359033584595, 0.05524744093418121, -0.029539478942751884, -0.02168159745633602, -0.13250304758548737, -0.10330165922641754, -0.031844593584537506, -0.06505001336336136, -0.036901865154504776, -0.06802543252706528, -0.1697237491607666, -0.10491608083248138, 0.06595144420862198, -0.0505107082426548, -0.047646187245845795, -0.09304926544427872, -0.014141883701086044, -0.004463502671569586, -0.03437143936753273, 0.12028080224990845, -0.023510972037911415, 0.09388122707605362, 0.021485572680830956, 0.04478896036744118, 0.1463860422372818, 0.08076238632202148, -0.10257888585329056, 0.010414846241474152, -0.09602103382349014, 0.178109809756279, -0.016664743423461914, 0.0026762019842863083, -0.14919038116931915, -0.07736171036958694, -0.06092451885342598, 
0.07003806531429291, 0.10296827554702759, 0.12665635347366333, -0.15529754757881165, -0.01166271697729826, 0.2085755169391632, -0.08729664981365204, -0.03619906306266785, 0.104769267141819, -0.04728393256664276, 0.14092713594436646, 0.09530024230480194, 0.16912133991718292, 0.0555269829928875, -0.07646597921848297, 0.001513542840257287, -0.038166217505931854, -0.0067397672683000565, 0.00889856182038784, 0.07980695366859436, -0.05093684419989586, -0.06743550300598145, -0.00698190787807107, -0.08847195655107498, 0.035325728356838226, -0.06031472608447075, -0.0610380582511425, 0.022440381348133087, -0.059679578989744186, -0.03989192470908165, 0.060464709997177124, 0.049487773329019547, -0.05170450732111931, -0.12591342628002167, 0.0134396031498909, 0.09837459027767181, -0.06693584471940994, 0.02476760745048523, -0.057682104408741, -0.05764661729335785, -0.07355573773384094, -0.009523747488856316, -0.16545386612415314, 0.04151757434010506, 0.04540492966771126, -0.02585616149008274, 0.047840554267168045, 0.05025395750999451, 0.03654687479138374, 0.055597733706235886, -0.00411992659792304, -0.06037953123450279, -0.04733448848128319, -0.03610539808869362, -0.1267237514257431, -0.10413575917482376, -0.0255372766405344, -0.025740541517734528, 0.08710508048534393, -0.17904594540596008, 0.03525961935520172, -0.10423041135072708, 0.04483218863606453, -0.013301393948495388, -0.04629897326231003, 0.035468053072690964, 0.04033065587282181, 0.024776244536042213, -0.06490755826234818, 0.04410584643483162, 0.028331978246569633, 0.03179749846458435, 0.08657195419073105, -0.1079375296831131, -0.15091469883918762, 0.08893982321023941, 0.053023189306259155, -0.14878962934017181, 0.02140842005610466, -0.04088306054472923, -0.06699523329734802, -0.05435611307621002, -0.0005286150262691081, 0.2548218369483948, 0.011739294044673443, 0.14674127101898193, -0.10252354294061661, -0.045540403574705124, -0.012519601732492447, -0.014875355176627636, 0.013130401261150837, 0.14272502064704895, 0.057863570749759674, -0.09002795070409775, 0.048470549285411835, 0.022768115624785423, -0.02281271666288376, 0.14635232090950012, 0.0007051379652693868, -0.12871229648590088, 0.014197438955307007, 0.06262148171663284, -0.03434383124113083, 0.08752414584159851, -0.14423681795597076, 0.0023434299509972334, 0.014603286981582642, 0.055669236928224564, 0.05817362293601036, -0.15591192245483398, 0.02589474245905876, 0.06269458681344986, -0.05181725323200226, 0.010346298106014729, -0.014517631381750107, -0.04897160083055496, 0.07197767496109009, 0.012342984788119793, -0.021774930879473686, -0.013386018574237823, -0.036676470190286636, -0.13441058993339539, 0.2045326977968216, -0.07038205116987228, -0.1447049081325531, -0.10133154690265656, 0.09356284141540527, 0.06833839416503906, -0.005505950190126896, 0.04825300723314285, -0.08783907443284988, -0.05029827356338501, -0.09307698160409927, 0.11538916081190109, -0.06071213632822037, -0.05345746874809265, -0.09088248759508133, -0.010788148269057274, -0.0047958167269825935, -0.13756413757801056, 0.03318554535508156, -0.04130004718899727, -0.0790649950504303, 0.004899770487099886, -0.04671972244977951, 0.06544855237007141, 0.14853782951831818, -0.0033022575080394745, 0.01793213002383709, -0.01348910667002201, 0.19250527024269104, -0.1305375099182129, 0.013341009616851807, 0.07854922115802765, 0.037588585168123245, 0.006427951157093048, 0.10407678037881851, -0.009414251893758774, -0.09258420765399933, 0.051433712244033813, 0.05380280688405037, -0.02022332325577736, -0.28603804111480713, 
-0.017638007178902626, -0.020129140466451645, -0.03670988976955414, 0.10815298557281494, 0.04173770546913147, 0.015991732478141785, 0.039995767176151276, -0.020326096564531326, 0.004559037741273642, 0.02985473722219467, 0.0496002621948719, -0.004598462488502264, 0.0014989913906902075, 0.07658129930496216, -0.04885770380496979, -0.019504651427268982, 0.044068459421396255, 0.014085137285292149, 0.24518093466758728, -0.05237394943833351, 0.11046901345252991, 0.08149062842130661, 0.09754475206136703, 0.004340937361121178, 0.07734205573797226, -0.030411208048462868, 0.019820351153612137, -0.00801212526857853, -0.022825492545962334, -0.0654124915599823, 0.038972847163677216, 0.013398733921349049, 0.01198853924870491, -0.10261736810207367, -0.015278421342372894, 0.01272044237703085, 0.3261539340019226, 0.06967059522867203, -0.23203647136688232, -0.06394589692354202, 0.004225194454193115, -0.07609353214502335, -0.09627940505743027, 0.0688263326883316, 0.07225596159696579, -0.12869198620319366, -0.017459683120250702, -0.0239139124751091, 0.09906131774187088, -0.1050373911857605, -0.04896373301744461, 0.050852321088314056, 0.05218115821480751, -0.00799897313117981, 0.0863444060087204, -0.30869579315185547, 0.1868826150894165, -0.015163373202085495, 0.13209150731563568, -0.008200207725167274, 0.027230752632021904, -0.048962175846099854, -0.000028756676329066977, 0.1534547060728073, -0.004306372720748186, 0.026632489636540413, -0.0653826966881752, -0.10031367093324661, 0.021775981411337852, 0.034328218549489975, -0.06279873847961426, 0.08606413006782532, 0.02622629888355732, 0.0356847420334816, -0.007767362520098686, -0.09898234903812408, -0.1438208818435669, -0.10428620874881744, -0.011914430186152458, -0.08827860653400421, 0.06607931107282639, -0.0404561422765255, -0.05476028472185135, -0.013761874288320541, 0.14050114154815674, -0.11858520656824112, -0.09949632734060287, -0.09199211746454239, 0.02659938670694828, 0.0941794291138649, -0.04586757346987724, -0.002241102047264576, 0.018364764750003815, 0.006134092807769775, 0.0004035848251078278, 0.011681963689625263, 0.09531816095113754, -0.0608784519135952, -0.12295648455619812, -0.04423906281590462, 0.12556828558444977, 0.12840324640274048, 0.05826329067349434, -0.032051194459199905, 0.010234677232801914, -0.01718774251639843, -0.07236684858798981, 0.009295664727687836, 0.011833411641418934, 0.04920683801174164, 0.01797405816614628, -0.06177885830402374, -0.010594690218567848, -0.09818900376558304, -0.05531726032495499, 0.10074085742235184, 0.14174334704875946, -0.04430871084332466, 0.06171324849128723, 0.1743929088115692, -0.11036919802427292, -0.16288810968399048, 0.01650088094174862, 0.09650510549545288, 0.08210291713476181, -0.07274765521287918, -0.2237129509449005, 0.02588302083313465, 0.0870540514588356, 0.010681810788810253, -0.010267974808812141, -0.40882715582847595, -0.13525690138339996, 0.117347851395607, 0.09007676690816879, -0.038594938814640045, -0.08715423196554184, -0.015006396919488907, 0.03950538486242294, -0.042910169810056686, 0.07309293001890182, -0.01575198583304882, 0.08856911212205887, 0.02182454615831375, -0.03766651451587677, 0.03864526376128197, -0.05506082996726036, 0.11318586766719818, 0.07060762494802475, 0.05146113038063049, -0.04527803510427475, 0.03855135291814804, -0.0063810632564127445, -0.020649511367082596, 0.16155847907066345, 0.0296026561409235, 0.03877119719982147, -0.20298035442829132, -0.06584414094686508, -0.08054644614458084, -0.008852546103298664, -0.06895235925912857, -0.05678552761673927, 
-0.04506118968129158, 0.0860457718372345, 0.04428132623434067, -0.009845961816608906, -0.02154940366744995, -0.0685180276632309, -0.02520904131233692, 0.0726684182882309, 0.10262089967727661, 0.0721176341176033, -0.08444612473249435, 0.018759584054350853, 0.03539926931262016, 0.09890531003475189, -0.15663346648216248, -0.022714227437973022, 0.12523852288722992, -0.021418817341327667, 0.14067788422107697, -0.008364163339138031, -0.14940817654132843, 0.010348094627261162, 0.0316559374332428, -0.08394831418991089, -0.1281745582818985, -0.008504345081746578, -0.0679701715707779, -0.04115481674671173, -0.04703936725854874, 0.0687435045838356, -0.11595452576875687, -0.02326761744916439, -0.016292540356516838, 0.03941534832119942, -0.07189435511827469, 0.22176997363567352, 0.04368695616722107, 0.056398194283246994, -0.06348049640655518, 0.13475938141345978, 0.10737074166536331, -0.12388211488723755, 0.02688394859433174, 0.17049120366573334, -0.0943809226155281, -0.05506986379623413, 0.005129426252096891, 0.1359308809041977, -0.011088436469435692, -0.07257184386253357, -0.053424984216690063, -0.05102530121803284, 0.06573616713285446, -0.008911993354558945, 0.039798181504011154, 0.0250160600990057, -0.03519228845834732, 0.0024158903397619724, -0.12593260407447815, 0.06757432967424393, 0.10076884180307388, 0.0031413778197020292, -0.023299232125282288, 0.17807328701019287, 0.05564111843705177, 0.0528968907892704, -0.005380505695939064, -0.037011317908763885, -0.044463369995355606, 0.07070955634117126, -0.028391947969794273, -0.028413021937012672, -0.05340281128883362, -0.010814364068210125, -0.023398734629154205, -0.008625661954283714, 0.0023955143988132477, 0.019501708447933197, -0.06384200602769852, -0.03399102762341499, -0.03518090024590492, 0.0438392236828804, -0.07125361263751984, -0.010629202239215374, -0.004263621289283037, -0.06242725998163223, 0.07659249007701874, 0.02216244861483574, -0.0038619169499725103, 0.01031928788870573, 0.0029699895530939102, 0.07467357069253922, -0.026918208226561546, 0.001757283927872777, -0.016854621469974518, -0.09094051271677017, 0.035574641078710556, -0.009778792038559914, -0.01416797935962677, -0.014291627332568169, 0.04833120107650757, -0.13676375150680542, 0.06552775204181671, -0.0213627852499485, -0.005313402507454157, -0.07482472062110901, 0.10486336797475815, 0.022688115015625954, 0.08645899593830109, 0.10959387570619583, -0.0710504874587059, 0.06711231172084808, -0.14364776015281677, -0.03987472504377365, 0.024716362357139587, 0.020198902115225792, -0.031326379626989365, -0.06236046925187111, 0.059131741523742676, -0.049010954797267914, 0.07518526911735535, 0.06188330799341202, 0.03658095374703407, 0.03631804883480072, -0.13044168055057526, -0.004374784417450428, 0.04731283709406853, 0.06344078481197357, -0.003938945010304451, 0.010742527432739735, -0.011043848469853401, 0.048366110771894455, -0.014552116394042969, 0.07799874246120453, 0.13186167180538177, 0.22187381982803345, 0.08514480292797089, 0.10106661170721054, -0.05212628096342087, -0.11101832240819931, -0.10482742637395859, 0.09948804974555969, -0.005916002206504345, 0.02203967049717903, -0.021415170282125473, 0.1433790922164917, 0.09564542025327682, -0.15422651171684265, 0.07317075878381729, 0.002797778695821762, -0.1044655293226242, -0.08726850152015686, -0.05280890315771103, -0.030175024643540382, -0.06634610891342163, -0.007505963556468487, -0.08874674141407013, 0.02402009628713131, 0.06308585405349731, 0.07518496364355087, -0.03929021954536438, 0.1442243754863739, 0.018177630379796028, 
-0.06825453042984009, 0.09107157588005066, -0.013190923258662224, 0.09753681719303131, -0.08381032198667526, 0.011740585789084435, 0.018575480207800865, -0.009669778868556023, 0.060658544301986694, 0.009648379869759083, -0.05249038338661194, 0.0348176583647728, 0.041799187660217285, -0.045492835342884064, -0.001919784233905375, 0.04758968949317932, 0.11153289675712585, 0.10958461463451385, 0.07555018365383148, -0.042750388383865356, -0.02329503372311592, 0.19589325785636902, -0.03419481962919235, -0.1025051400065422, -0.16792802512645721, 0.17206351459026337, 0.08317277580499649, 0.02037705108523369, 0.03640526905655861, -0.10059042274951935, -0.006640846841037273, 0.23175446689128876, 0.1391991823911667, -0.048855897039175034, -0.047356486320495605, 0.0346052311360836, -0.007272551767528057, 0.010325239971280098, 0.10875976830720901, 0.05568304657936096, 0.1970788538455963, -0.10590101033449173, 0.03784498572349548, -0.07753178477287292, -0.061270035803318024, -0.010536730289459229, 0.15758226811885834, 0.01063905656337738, -0.010666009038686752, -0.053802698850631714, 0.10740213096141815, -0.0028618418145924807, -0.188789963722229, 0.05503467097878456, -0.05841513350605965, -0.12832137942314148, -0.022002780809998512, -0.0034796285908669233, 0.0033656053710728884, 0.04695931822061539, -0.0015180822229012847, 0.001208204310387373, 0.14145387709140778, 0.02907733805477619, -0.05819827690720558, -0.14048705995082855, 0.07314033806324005, -0.0023735433351248503, 0.16003966331481934, -0.001645103096961975, 0.08449593186378479, 0.07197511196136475, 0.028980892151594162, -0.1077900305390358, 0.08630578219890594, 0.045757513493299484, 0.042374297976493835, 0.04756232351064682, 0.11043144762516022, -0.02376413717865944, 0.07519259303808212, 0.028237152844667435, -0.10623810440301895, 0.05609961599111557, -0.13094140589237213, -0.05476055666804314, -0.16571886837482452, 0.039273373782634735, -0.03955911099910736, 0.1257590651512146, 0.21232225000858307, -0.024340668693184853, 0.012070282362401485, -0.06615535914897919, 0.035710107535123825, -0.024748994037508965, 0.15155169367790222, 0.009079751558601856, -0.17829346656799316, 0.012957370840013027, -0.06351620703935623, 0.030611326918005943, -0.22216451168060303, -0.02361113578081131, 0.01421800535172224, -0.0825560912489891, -0.027056243270635605, 0.12472547590732574, 0.04350423440337181, 0.06543330103158951, -0.03413185104727745, -0.018155537545681, -0.018109314143657684, 0.14504724740982056, -0.12959913909435272, -0.09777216613292694 ]
null
null
transformers
# legal_t5_small_trans_de_en_small_finetuned model

Model for translating legal text from German to English. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.


## Model description

legal_t5_small_trans_de_en_small_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_de_en_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from German to English.

### How to use

Here is how to use this model to translate legal text from German to English in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_en_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_en", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "In welchen anderen EU-Ländern ist von ähnlichen Listen mit Parteikadern und Regierungsmitgliedern berichtet worden, die die „Schirmherrschaft“ über Vorschläge für private von der Europäischen Union kofinanzierte Investitionen übernommen haben?"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_en_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.


## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
|   legal_t5_small_trans_de_en_small_finetuned | 48.674|


### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
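The table above reports a corpus-level BLEU score on the held-out test set. Here is a sketch of how such a score could be computed with sacrebleu; the file names are hypothetical, and the card does not state which scorer the authors actually used:

```python
import sacrebleu

# Hypothetical files: one sentence per line, model outputs and reference translations
with open("hypotheses.en", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("references.en", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# corpus_bleu takes the hypotheses and a list of reference streams
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.3f}")
```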
{"language": "Deustch English", "tags": ["translation Deustch English model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "In welchen anderen EU-L\u00e4ndern ist von \u00e4hnlichen Listen mit Parteikadern und Regierungsmitgliedern berichtet worden, die die \u201eSchirmherrschaft\u201c \u00fcber Vorschl\u00e4ge f\u00fcr private von der Europ\u00e4ischen Union kofinanzierte Investitionen \u00fcbernommen haben?"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_en_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch English model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch English" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_en\_small\_finetuned model
=======================================================


Model for translating legal text from German to English. It was first released in this repository. The model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.


Model description
-----------------


legal\_t5\_small\_trans\_de\_en\_small\_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_de\_en\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.


Intended uses & limitations
---------------------------


The model can be used for translation of legal texts from German to English.


### How to use


Here is how to use this model to translate legal text from German to English in PyTorch:


Training data
-------------


The legal\_t5\_small\_trans\_de\_en\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, for which the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.


Training procedure
------------------


The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.


### Preprocessing


A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) used with this model.


### Pretraining


The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly (a toy sketch of this objective is given below).


Evaluation results
------------------


When the model is used on the translation test dataset, it achieves the following results:


Test results :


### BibTeX entry and citation info


> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
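The pretraining objective described above, predicting the randomly masked portions of a sentence, corresponds to T5-style span corruption with sentinel tokens. Below is a toy sketch of what one input/target pair could look like; the authors' exact masking scheme is not given, and this simplified version masks single tokens rather than contiguous spans:

```python
import random

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Toy masked-LM corruption: drop ~mask_prob of the tokens, replacing each
    with a T5 sentinel in the input and collecting it in the target."""
    rng = random.Random(seed)
    inputs, targets, sentinel = [], [], 0
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(f"<extra_id_{sentinel}>")
            targets.append(f"<extra_id_{sentinel}>")
            targets.append(tok)
            sentinel += 1
        else:
            inputs.append(tok)
    return " ".join(inputs), " ".join(targets)

src, tgt = mask_tokens("Der Ausschuss billigt den Vorschlag der Kommission .".split(), mask_prob=0.3)
print(src)  # input with sentinel tokens in place of the dropped words
print(tgt)  # target listing each sentinel followed by the word it replaced
```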
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch English model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to English in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_en\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 9 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06815037131309509, 0.0861968994140625, -0.003414592007175088, 0.08080258965492249, 0.053536854684352875, 0.01624855399131775, 0.03477976843714714, 0.09432367235422134, -0.03304044529795647, 0.08005276322364807, 0.0419362410902977, -0.02150493487715721, 0.06872718781232834, 0.03052225150167942, 0.06880360841751099, -0.2186434119939804, 0.0031331926584243774, -0.036481548100709915, -0.006493287160992622, 0.093968465924263, 0.09464426338672638, -0.06486446410417557, 0.04576713964343071, -0.036866456270217896, -0.022734835743904114, 0.026829060167074203, -0.0996195375919342, -0.04457760602235794, 0.08452894538640976, 0.0899755209684372, 0.08473283052444458, -0.01712931878864765, 0.06687286496162415, -0.19541391730308533, -0.0008465337450616062, 0.0830019935965538, 0.0025042318738996983, 0.04047359898686409, 0.13029631972312927, 0.001432732678949833, 0.15884879231452942, -0.05113432556390762, 0.03380568325519562, 0.041260797530412674, -0.12643319368362427, -0.12384235858917236, -0.04947848618030548, 0.02350304089486599, 0.08942015469074249, 0.1360289454460144, -0.05625108629465103, 0.05766258016228676, -0.048056915402412415, 0.07305862009525299, 0.06512051075696945, -0.2099199891090393, -0.0289109256118536, 0.029749879613518715, 0.048944227397441864, 0.07599630951881409, -0.04618474468588829, -0.02527647092938423, 0.05231700465083122, 0.07273123413324356, 0.02264799177646637, -0.06042443588376045, -0.046909529715776443, -0.050659406930208206, -0.12902964651584625, -0.05851379409432411, 0.15082421898841858, 0.02371395193040371, -0.0501539520919323, -0.08301189541816711, -0.06164027005434036, -0.06956134736537933, -0.0051872581243515015, -0.042537871748209, 0.026984261348843575, 0.0008986445609480143, 0.07559742778539658, -0.020851144567131996, -0.11332771182060242, -0.058087896555662155, -0.07294126600027084, 0.12501536309719086, 0.050060439854860306, 0.011974292807281017, 0.015077733434736729, 0.08712172508239746, -0.11111006885766983, -0.0857948586344719, 0.0010269865160807967, 0.012605836614966393, -0.10528355091810226, -0.0020324671640992165, -0.012932399287819862, -0.1681436449289322, -0.014593292027711868, 0.03926866501569748, -0.06524726003408432, 0.055414773523807526, 0.07328784465789795, 0.03696693852543831, 0.07008633762598038, 0.11423830687999725, -0.1240648627281189, -0.12115102261304855, -0.02564222738146782, 0.005546899978071451, 0.0079737463966012, 0.019106389954686165, -0.05421557277441025, -0.028123358264565468, 0.011600101366639137, 0.031259361654520035, 0.013630586676299572, 0.022014429792761803, -0.02354419231414795, -0.03425143286585808, 0.12678714096546173, -0.10963214188814163, -0.0057440283708274364, 0.003924754913896322, -0.09489010274410248, -0.025835378095507622, 0.07499046623706818, -0.013110599480569363, -0.11724772304296494, 0.07045344263315201, -0.03300177678465843, -0.03193838894367218, -0.10633037984371185, -0.1710975617170334, -0.010079738683998585, 0.006939583923667669, -0.05435122922062874, -0.10379073768854141, -0.13816560804843903, -0.09676395356655121, 0.031865864992141724, -0.05862557515501976, -0.0011187432100996375, -0.06498846411705017, -0.0005404502735473216, -0.007850640453398228, -0.010251333005726337, 0.10534492880105972, -0.04063275083899498, 0.032680265605449677, 0.03602241352200508, 0.0676761195063591, 0.024168966338038445, 0.03634316846728325, -0.12375134974718094, 0.0432780496776104, -0.14321397244930267, 0.15135054290294647, -0.025640826672315598, 0.010623177513480186, -0.1287834495306015, -0.05107482522726059, -0.08214015513658524, 
0.06585313379764557, 0.06286181509494781, 0.11752239614725113, -0.20279602706432343, -0.007502844091504812, 0.18831685185432434, -0.08526739478111267, -0.07580159604549408, 0.12221360206604004, -0.025813672691583633, 0.037723056972026825, 0.08698925375938416, 0.11115740239620209, 0.07382109016180038, -0.008039780892431736, -0.04579365998506546, 0.003987620584666729, 0.01944126933813095, 0.08726242929697037, 0.08184542506933212, -0.07383264601230621, 0.036081425845623016, 0.016995474696159363, 0.015251356177031994, 0.003994208760559559, -0.01733986847102642, -0.03003579191863537, 0.01317355502396822, -0.04832148551940918, -0.04239485040307045, 0.030322423204779625, 0.004703442566096783, -0.05748741701245308, -0.08719611167907715, -0.02759823575615883, 0.10662680119276047, -0.049930326640605927, 0.01576615869998932, -0.0096550602465868, -0.06426060199737549, -0.10527920722961426, 0.016048211604356766, -0.16681721806526184, -0.013147760182619095, 0.03267819434404373, -0.07844043523073196, 0.113924540579319, 0.06607865542173386, 0.050100646913051605, 0.08985960483551025, -0.05936175957322121, -0.035287242382764816, 0.0021363995037972927, -0.01610754430294037, -0.10245358198881149, -0.11891268193721771, -0.03778015449643135, -0.025651123374700546, -0.018842622637748718, -0.11571745574474335, -0.0014829190913587809, -0.07034735381603241, 0.08739718794822693, 0.007191984914243221, -0.021723154932260513, 0.03462272509932518, 0.06657590717077255, -0.028498174622654915, -0.041207849979400635, 0.01792622171342373, -0.011814513243734837, -0.042813222855329514, 0.08970005810260773, -0.16807030141353607, -0.11557698994874954, 0.08379896730184555, 0.007149540353566408, -0.12558753788471222, 0.006541363429278135, -0.014688728377223015, -0.06798727810382843, -0.055772848427295685, -0.06511571258306503, 0.2519197165966034, 0.03225202113389969, 0.13705678284168243, -0.10464639961719513, -0.029717544093728065, 0.0028789814095944166, -0.018524307757616043, 0.0021340639796108007, 0.1683627963066101, 0.07581491023302078, -0.1476397067308426, 0.08324733376502991, 0.010267847217619419, -0.02610090933740139, 0.09343890100717545, 0.06138274446129799, -0.10153450816869736, -0.0071580796502530575, 0.03199825808405876, -0.004362279083579779, 0.057679034769535065, -0.09646522998809814, -0.0048483856953680515, 0.02526889182627201, 0.05887317284941673, 0.06173297390341759, -0.08198072016239166, 0.06563010066747665, 0.06876835972070694, -0.02505888231098652, 0.038157910108566284, -0.05147159844636917, -0.03083406761288643, 0.0956837460398674, 0.01930864155292511, -0.026220669969916344, -0.043663423508405685, -0.04956220090389252, -0.10962086170911789, 0.195028156042099, -0.06733452528715134, -0.2121763527393341, -0.1119992733001709, 0.07543016970157623, -0.03113168105483055, 0.0362887866795063, 0.030753672122955322, -0.03877092897891998, -0.07366863638162613, -0.12576331198215485, 0.0980217233300209, -0.08561301231384277, -0.050894614309072495, -0.12700679898262024, 0.02658742479979992, 0.012756707146763802, -0.1261192262172699, 0.025803128257393837, 0.003528464352712035, -0.016987819224596024, -0.0006597962928935885, -0.02569286711513996, 0.10663366317749023, 0.12964703142642975, -0.01140114851295948, -0.03787688538432121, -0.005183670669794083, 0.13718187808990479, -0.09835341572761536, 0.06726954132318497, 0.06793241947889328, 0.03326694667339325, 0.026710230857133865, 0.1503847986459732, 0.02260192297399044, -0.0670107901096344, 0.049408238381147385, 0.05871546268463135, -0.01839238964021206, -0.2285371720790863, 
-0.10111896693706512, -0.06640425324440002, 0.022181637585163116, 0.11680123955011368, 0.042227596044540405, -0.0477268248796463, 0.019610682502388954, -0.061224315315485, 0.06480719894170761, -0.005223320331424475, 0.05397080257534981, 0.04114795848727226, -0.013289138674736023, 0.08204755932092667, -0.05982229486107826, -0.05095288157463074, 0.09894917905330658, 0.030675413087010384, 0.18526212871074677, -0.053166329860687256, 0.1951860934495926, 0.0547700859606266, 0.0205569788813591, 0.01086388248950243, 0.05399307236075401, -0.041074175387620926, 0.028586408123373985, -0.03239712119102478, -0.058896973729133606, -0.027753956615924835, 0.0631873831152916, 0.01713373325765133, 0.016078505665063858, -0.04711918905377388, -0.06128539890050888, 0.04533631354570389, 0.1993703842163086, 0.06389112770557404, -0.19853346049785614, -0.05538978427648544, 0.013549069873988628, -0.07431928813457489, -0.06757017225027084, 0.02235044166445732, 0.1360296607017517, -0.08668750524520874, 0.019269775599241257, 0.023729735985398293, 0.11960648000240326, -0.13387152552604675, -0.01128578931093216, 0.035814959555864334, 0.048169903457164764, -0.014835161156952381, 0.1251833587884903, -0.25369367003440857, 0.13513167202472687, 0.01305440440773964, 0.05654881149530411, -0.03437488153576851, 0.01681153103709221, -0.033831626176834106, 0.00044540181988850236, 0.11241498589515686, 0.013125264085829258, -0.006015648599714041, -0.11261393874883652, -0.09761760383844376, 0.00395979592576623, 0.05906610190868378, -0.06636194884777069, 0.0933600589632988, 0.053423523902893066, 0.017085444182157516, -0.016385801136493683, 0.018000314012169838, -0.07140710204839706, -0.15518656373023987, 0.004845225252211094, -0.032880015671253204, -0.011567962355911732, -0.014425423927605152, -0.023240724578499794, -0.06574101001024246, 0.18558649718761444, -0.1032065823674202, -0.08183840662240982, -0.07401702553033829, 0.006060904357582331, 0.13893795013427734, -0.07328186929225922, 0.0006355444784276187, 0.0007755759288556874, 0.04241744428873062, -0.024050364270806313, -0.03280036523938179, 0.08709204941987991, -0.08083541691303253, -0.10156602412462234, -0.07422003149986267, 0.10956908017396927, 0.07760373502969742, 0.047645773738622665, -0.01852862723171711, 0.032398905605077744, -0.018936648964881897, -0.1113799586892128, -0.01853560283780098, 0.0355512797832489, 0.10990555584430695, 0.06260011345148087, -0.03910700976848602, -0.036340005695819855, -0.07260780781507492, -0.05903872102499008, 0.07382240146398544, 0.15746638178825378, -0.03794538602232933, 0.024844205006957054, 0.19412918388843536, -0.10483215004205704, -0.18633541464805603, -0.05762103945016861, 0.07238160818815231, 0.07551847398281097, -0.011888953857123852, -0.1732235550880432, 0.04982380568981171, 0.09644272178411484, 0.006993168964982033, 0.06750606000423431, -0.3738352656364441, -0.14132153987884521, 0.08650447428226471, 0.03200044855475426, -0.04953561723232269, -0.12668955326080322, -0.055132005363702774, -0.06082870066165924, -0.02184506505727768, 0.0863250344991684, -0.03591473400592804, 0.09209646284580231, -0.004233637824654579, 0.024530719965696335, 0.04444935917854309, -0.03632231056690216, 0.12980519235134125, 0.03585266321897507, 0.04670341685414314, -0.06472066044807434, 0.061884406954050064, 0.006330172065645456, -0.017240282148122787, 0.15678007900714874, -0.04333258420228958, 0.05748936906456947, -0.13869665563106537, -0.05823928490281105, -0.06701157242059708, 0.021443812176585197, -0.03991462662816048, -0.07286880910396576, 
-0.0572219043970108, 0.04313872009515762, 0.04348326846957207, -0.004109715577214956, -0.011428180150687695, -0.05620744824409485, -0.0031187478452920914, 0.1608288735151291, 0.11858901381492615, 0.029359355568885803, -0.11175066977739334, 0.030255787074565887, 0.0010219714604318142, 0.08153394609689713, -0.07793211191892624, 0.008319683372974396, 0.13669434189796448, 0.017197616398334503, 0.11967886239290237, -0.01174641028046608, -0.14805226027965546, -0.008328227326273918, 0.04785187914967537, -0.08716452866792679, -0.1338346004486084, -0.0042031677439808846, 0.03110966458916664, -0.07676100730895996, -0.03508688881993294, 0.09725982695817947, -0.09380096197128296, -0.019274231046438217, 0.009299430064857006, 0.035388700664043427, -0.031091760843992233, 0.1999943107366562, 0.03859055042266846, 0.04040863737463951, -0.061056651175022125, 0.11736063659191132, 0.13692738115787506, -0.15954749286174774, 0.018732162192463875, 0.18894261121749878, -0.08295774459838867, -0.06869253516197205, 0.0038564903661608696, 0.12518596649169922, -0.022685525938868523, -0.06164874881505966, -0.023024583235383034, -0.05691003426909447, 0.03311823308467865, -0.006806927267462015, 0.038455065339803696, 0.04423738270998001, -0.01825687289237976, -0.010921533219516277, -0.09614083915948868, 0.09563317894935608, 0.07610515505075455, 0.039977725595235825, -0.024982821196317673, 0.13135023415088654, 0.030570758506655693, -0.018434951081871986, -0.011680110357701778, 0.007020534481853247, -0.06409835070371628, 0.013897065073251724, -0.09068314731121063, 0.012964794412255287, -0.05022541433572769, -0.012120181694626808, -0.017717132344841957, 0.0035915556363761425, -0.013091613538563251, -0.0009157810127362609, -0.02573946863412857, -0.05400785431265831, -0.034713104367256165, 0.024716893211007118, -0.09337475150823593, -0.040134456008672714, 0.01487854402512312, -0.023037083446979523, 0.05262225866317749, -0.001380361383780837, 0.011781003326177597, -0.010602477937936783, -0.004461787175387144, 0.06931648403406143, 0.016669917851686478, 0.04624610021710396, -0.01219241414219141, -0.0857311263680458, 0.008764408528804779, 0.017855141311883926, -0.009103423915803432, -0.01693660393357277, 0.01867612451314926, -0.15289486944675446, 0.00489193806424737, -0.012072371318936348, -0.02984284609556198, -0.08014696091413498, 0.07635018974542618, 0.04173656553030014, 0.06282742321491241, 0.10998933017253876, -0.0723176822066307, 0.08203333616256714, -0.16793625056743622, -0.008281311020255089, 0.01976158283650875, 0.013620402663946152, -0.023782072588801384, -0.002052192809060216, 0.05033329874277115, -0.06430558860301971, 0.1414489597082138, 0.026495590806007385, 0.05830938369035721, 0.024831483140587807, -0.07210320234298706, -0.006738243158906698, 0.02774672955274582, 0.07650782912969589, -0.029242465272545815, -0.02919958531856537, -0.07233578711748123, 0.0861567035317421, 0.002394641749560833, 0.06504064798355103, 0.049057938158512115, 0.13018321990966797, 0.11833664774894714, 0.0365595743060112, -0.007994286715984344, -0.10647519677877426, -0.06680960208177567, 0.04552901163697243, -0.00136660342104733, 0.03830171376466751, -0.01242868322879076, 0.10181718319654465, 0.12173643708229065, -0.13444365561008453, 0.12292260676622391, -0.008809822611510754, -0.0789838507771492, -0.039460115134716034, -0.13075034320354462, -0.0419313870370388, -0.011281639337539673, -0.04762737825512886, -0.11030204594135284, 0.007872291840612888, 0.07132746279239655, 0.04342564567923546, -0.03324020653963089, 0.13894660770893097, 
-0.046702370047569275, -0.11254159361124039, 0.04278544709086418, 0.01906411536037922, 0.08358743041753769, 0.03325045481324196, 0.03366389870643616, 0.053714968264102936, 0.010866164229810238, 0.043738577514886856, 0.054248910397291183, -0.02247540093958378, 0.004057223908603191, 0.010129574686288834, -0.05442507192492485, -0.03945817053318024, 0.014371083118021488, 0.06582867354154587, 0.20156188309192657, 0.05910822004079819, -0.06246600300073624, -0.019794799387454987, 0.1843477189540863, -0.061893921345472336, -0.08528479933738708, -0.09701261669397354, 0.22285865247249603, 0.03954903781414032, 0.03161300718784332, -0.0018497486598789692, -0.1079627126455307, -0.009166290052235126, 0.14817772805690765, 0.19654083251953125, -0.05748546123504639, -0.02968696877360344, 0.02373741939663887, -0.005376503802835941, 0.02771092765033245, 0.039591867476701736, 0.03344140946865082, 0.2704254388809204, -0.0801805779337883, 0.0931108146905899, -0.04272150993347168, 0.009956223890185356, -0.005722932983189821, 0.1662634015083313, 0.005243572406470776, 0.024191979318857193, -0.06888333708047867, 0.07523003965616226, -0.02445804700255394, -0.184157595038414, 0.013885391876101494, -0.08413037657737732, -0.11160499602556229, 0.011417020112276077, -0.017721740528941154, 0.056707389652729034, 0.06460478156805038, 0.012285105884075165, 0.03549744561314583, 0.07733318209648132, 0.009420191869139671, -0.11529935151338577, -0.11290694773197174, 0.01017717458307743, -0.011811834760010242, 0.12468621879816055, 0.0027717307675629854, 0.1292220503091812, 0.08108777552843094, 0.00976752582937479, -0.10763542354106903, 0.09374074637889862, 0.02251621149480343, 0.03845314681529999, 0.08270315080881119, 0.10019727051258087, 0.007792027201503515, 0.0678795874118805, 0.04656670242547989, -0.07818187773227692, 0.020443061366677284, -0.035946037620306015, -0.02123289741575718, -0.13658438622951508, 0.08559499680995941, -0.037497274577617645, 0.14685717225074768, 0.17898650467395782, -0.01867591217160225, -0.020914407446980476, -0.044640474021434784, 0.007441181223839521, -0.009458047337830067, 0.0971100777387619, -0.012040886096656322, -0.14995644986629486, 0.023449115455150604, -0.07668399065732956, 0.0346798449754715, -0.26891350746154785, -0.02595745585858822, 0.02164613828063011, -0.05695967748761177, -0.002074703574180603, 0.07670018076896667, 0.03908814862370491, 0.05237064138054848, -0.05553147941827774, -0.07435078918933868, 0.0012263137614354491, 0.10464083403348923, -0.10935146361589432, -0.11918134242296219 ]
null
null
transformers
# legal_t5_small_trans_de_es model

Model for translating legal text from German (Deutsch) to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_es is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translating legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Build a translation pipeline from the pretrained checkpoint and its tokenizer.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_es"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_es",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

de_text = "7. betont, dass die Kommission und die Mitgliedstaaten die Rolle der Frauen in der Sozialwirtschaft aufgrund der hohen Frauenerwerbstätigkeit in dem Sektor und der Bedeutung der Dienstleistungen, die er für die Förderung der Vereinbarkeit von Beruf und Privatleben bietet, aufwerten, unterstützen und verstärken müssen;"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_es model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_es | 47.24 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
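The Preprocessing section above describes training a unigram vocabulary model over the parallel corpus. A minimal sketch of how such a vocabulary could be built with the `sentencepiece` library follows; the input file name, vocabulary size, and character coverage are illustrative assumptions, since the card does not state them.

```python
# Illustrative sketch of training a unigram subword vocabulary with
# SentencePiece, as described in the Preprocessing section. The input file,
# vocab_size, and character_coverage values are assumptions, not taken
# from the card.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # hypothetical dump of the 88M lines
    model_prefix="legal_t5_spm",            # writes legal_t5_spm.model / legal_t5_spm.vocab
    model_type="unigram",                   # unigram segmentation, as the card describes
    vocab_size=32000,                       # assumed size
    character_coverage=1.0,
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_spm.model")
print(sp.encode("Die Kommission prüft den Vorschlag.", out_type=str))
```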
{"language": "Deustch Spanish", "tags": ["translation Deustch Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "7. betont, dass die Kommission und die Mitgliedstaaten die Rolle der Frauen in der Sozialwirtschaft aufgrund der hohen Frauenerwerbst\u00e4tigkeit in dem Sektor und der Bedeutung der Dienstleistungen, die er f\u00fcr die F\u00f6rderung der Vereinbarkeit von Beruf und Privatleben bietet, aufwerten, unterst\u00fctzen und verst\u00e4rken m\u00fcssen;"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_es
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_es model
=====================================

Model for translating legal text from German (Deutsch) to Spanish. It was first released in this repository. The model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_es is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_es model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
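The training procedure above pairs AdaFactor with an inverse square root learning-rate schedule. A small, self-contained sketch of that schedule in its standard Transformer form is shown below; the base learning rate and warmup length are illustrative assumptions, as the card does not specify them.

```python
# Sketch of the inverse square root learning-rate schedule named in the
# Training procedure section. base_lr and warmup_steps are assumptions;
# the card only names the schedule, not its hyperparameters.
def inverse_sqrt_lr(step: int, base_lr: float = 1e-2, warmup_steps: int = 10_000) -> float:
    """Linear warmup to base_lr, then decay proportional to 1/sqrt(step)."""
    step = max(step, 1)
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (warmup_steps ** 0.5) / (step ** 0.5)

# Learning rate at a few points over the 250K-step run described above.
for s in (1, 5_000, 10_000, 100_000, 250_000):
    print(f"step {s:>7}: lr = {inverse_sqrt_lr(s):.6f}")
```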
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.13381285965442657, 0.07981884479522705, -0.002927522175014019, 0.07861074060201645, 0.09905951470136642, 0.013387870974838734, 0.06271225959062576, 0.09368661046028137, -0.0827169194817543, 0.0747959241271019, 0.057586781680583954, 0.05499066784977913, 0.08636070787906647, 0.0984562411904335, 0.034462571144104004, -0.2234477549791336, 0.023277945816516876, -0.017568156123161316, -0.010190489701926708, 0.12841427326202393, 0.1132325530052185, -0.0839766338467598, 0.018161876127123833, -0.032223671674728394, -0.10874143987894058, 0.007096050772815943, -0.05611056834459305, -0.08756975829601288, 0.0822441503405571, 0.046532683074474335, 0.12032542377710342, 0.024567725136876106, 0.07898911833763123, -0.1559602916240692, -0.003995854407548904, 0.08145521581172943, 0.04754810035228729, 0.04329048469662666, 0.0815148651599884, -0.018298504874110222, 0.14843259751796722, -0.028352905064821243, 0.06230894848704338, 0.02170460857450962, -0.13981880247592926, -0.14365747570991516, -0.0702132061123848, 0.017034579068422318, 0.14712558686733246, 0.14369510114192963, -0.054252855479717255, 0.05600310489535332, -0.11518058180809021, 0.046133141964673996, 0.06165210157632828, -0.260981947183609, -0.06646427512168884, 0.018998973071575165, 0.044789914041757584, 0.08464851975440979, -0.034906353801488876, -0.039605386555194855, 0.047202836722135544, 0.02490081638097763, 0.0034470974933356047, -0.029609015211462975, -0.0048199910670518875, -0.009811840020120144, -0.16122505068778992, -0.10718332976102829, 0.14686933159828186, -0.007821490988135338, -0.07151655107736588, -0.09137244522571564, -0.03962882235646248, -0.1435229629278183, 0.01359543390572071, -0.060754310339689255, 0.048349227756261826, -0.0062896194867789745, 0.04014776647090912, -0.010667966678738594, -0.11297472566366196, -0.11295482516288757, 0.02459687367081642, 0.08394304662942886, 0.09180574119091034, -0.022273331880569458, 0.009052405133843422, 0.17097032070159912, 0.031420569866895676, -0.09835516661405563, -0.01187230460345745, 0.009586341679096222, -0.10884547233581543, -0.01512419618666172, -0.03374103829264641, -0.14623796939849854, -0.0565185472369194, 0.08422873169183731, -0.047265276312828064, 0.05529483035206795, 0.035275429487228394, 0.03784537315368652, 0.01547545101493597, 0.16212603449821472, -0.07321911305189133, -0.05892156437039375, -0.0619858019053936, 0.05059969425201416, -0.05682964622974396, 0.02021041326224804, -0.027888428419828415, -0.012577489949762821, 0.05951833724975586, 0.08791152387857437, -0.06821267306804657, -0.0031337542459368706, -0.051201239228248596, -0.0308915376663208, 0.050778549164533615, -0.11555814743041992, -0.031769927591085434, 0.002184966579079628, -0.10702347010374069, -0.025562772527337074, 0.06779684126377106, -0.015040040016174316, -0.13073380291461945, 0.04406532645225525, -0.0278030876070261, -0.029515577480196953, -0.14257800579071045, -0.10469406843185425, -0.02109519951045513, -0.07279203087091446, -0.03520471975207329, -0.0725928395986557, -0.1564026176929474, -0.11956135928630829, 0.06670769304037094, -0.07097269594669342, -0.0392695851624012, -0.10244540125131607, -0.004100864753127098, -0.002402112353593111, -0.03976915404200554, 0.1204095259308815, -0.02358671836555004, 0.09935244172811508, 0.031747836619615555, 0.048367418348789215, 0.144674152135849, 0.07876399159431458, -0.09894382953643799, 0.018001869320869446, -0.11073146015405655, 0.19149598479270935, -0.017008556053042412, 0.0020978455431759357, -0.1574108749628067, -0.07517373561859131, -0.06181822344660759, 
0.07722290605306625, 0.10620753467082977, 0.14068497717380524, -0.15212030708789825, -0.01775050163269043, 0.21594718098640442, -0.07851105183362961, -0.030201014131307602, 0.09604736417531967, -0.04384179040789604, 0.15005731582641602, 0.0959613099694252, 0.17635247111320496, 0.04920549690723419, -0.0793989822268486, 0.01217163447290659, -0.04298262298107147, 0.0008541345596313477, -0.0034366941545158625, 0.08789964765310287, -0.06233225390315056, -0.06519345194101334, -0.0054883696138858795, -0.1075541079044342, 0.031743019819259644, -0.053922273218631744, -0.05999867990612984, 0.0327657088637352, -0.045508693903684616, -0.04035625234246254, 0.06000211462378502, 0.05530698224902153, -0.04716434329748154, -0.11956990510225296, 0.015632912516593933, 0.08552435040473938, -0.061699990183115005, 0.02078453078866005, -0.05710907280445099, -0.04473790526390076, -0.08997044712305069, -0.01206672191619873, -0.16723927855491638, 0.04323045164346695, 0.04284297674894333, -0.013525193557143211, 0.043455854058265686, 0.04559146985411644, 0.033509235829114914, 0.051393378525972366, -0.0020622003357857466, -0.05752215534448624, -0.052678920328617096, -0.038454651832580566, -0.11871512234210968, -0.10712449997663498, -0.024154003709554672, -0.02450023777782917, 0.0818653553724289, -0.17949330806732178, 0.030276071280241013, -0.0996331125497818, 0.03164847567677498, -0.02100616693496704, -0.04010884091258049, 0.02737753838300705, 0.051702987402677536, 0.02431519702076912, -0.07243920862674713, 0.05341324955224991, 0.0342794768512249, 0.04314248636364937, 0.08356090635061264, -0.11170327663421631, -0.1487763375043869, 0.08393710106611252, 0.05259539186954498, -0.14949968457221985, 0.006049484945833683, -0.03376857191324234, -0.05853045731782913, -0.05703895539045334, 0.002646509325131774, 0.26149171590805054, 0.009466095827519894, 0.1547861248254776, -0.12167245149612427, -0.0392354279756546, -0.008190941996872425, -0.01875990256667137, 0.005297055467963219, 0.14913450181484222, 0.06467507779598236, -0.08727628737688065, 0.0557703860104084, 0.010282967239618301, -0.018681775778532028, 0.15126487612724304, 0.006295470986515284, -0.12881392240524292, 0.014147200621664524, 0.07335031032562256, -0.018599800765514374, 0.0889560803771019, -0.14303068816661835, 0.002340401755645871, 0.012168102897703648, 0.05594677850604057, 0.06657978147268295, -0.15802393853664398, 0.022908205166459084, 0.06206387281417847, -0.05163436010479927, 0.007797723636031151, -0.012148743495345116, -0.051475152373313904, 0.07556643337011337, 0.02532312646508217, -0.02833707444369793, -0.016496721655130386, -0.034785233438014984, -0.1355370134115219, 0.2064545899629593, -0.06803474575281143, -0.1578325778245926, -0.11185300350189209, 0.09681374579668045, 0.07159001380205154, 0.012568436563014984, 0.052369147539138794, -0.09091375768184662, -0.036212459206581116, -0.06973031163215637, 0.12397127598524094, -0.04787779971957207, -0.063752681016922, -0.09187092632055283, -0.0025433246046304703, -0.013049948960542679, -0.1348777562379837, 0.03288990631699562, -0.035050950944423676, -0.08328938484191895, -0.0037645664997398853, -0.05906205251812935, 0.0819302424788475, 0.15878063440322876, 0.005076216999441385, 0.016058223322033882, -0.013157365843653679, 0.18300645053386688, -0.1342581808567047, 0.004689536988735199, 0.09102980047464371, 0.048746850341558456, 0.000824448186904192, 0.1014227494597435, -0.009120725095272064, -0.09602680057287216, 0.03559695556759834, 0.048141807317733765, -0.025945018976926804, -0.28759506344795227, 
-0.025584178045392036, -0.02108684554696083, -0.05257398262619972, 0.1153046116232872, 0.03757861256599426, 0.021083639934659004, 0.05611150339245796, -0.016169773414731026, 0.0042831553146243095, 0.02318120002746582, 0.051127079874277115, 0.0015431639039888978, 0.0005057186936028302, 0.06543964147567749, -0.05007510259747505, -0.03118230402469635, 0.050069790333509445, 0.032310932874679565, 0.24050672352313995, -0.05208529531955719, 0.11693786829710007, 0.08623264729976654, 0.09726495295763016, -0.006620433181524277, 0.07603631913661957, -0.026450583711266518, 0.019264526665210724, -0.018006881698966026, -0.028070509433746338, -0.07268553972244263, 0.025393513962626457, 0.00515673728659749, 0.009936158545315266, -0.12140566855669022, -0.03335157409310341, 0.011408410966396332, 0.3182235062122345, 0.05820463225245476, -0.2404365837574005, -0.058875054121017456, 0.002813671249896288, -0.06040431186556816, -0.09925702214241028, 0.06029343977570534, 0.07840180397033691, -0.13598524034023285, -0.013538654893636703, -0.028172042220830917, 0.10290578007698059, -0.10672123730182648, -0.04831138253211975, 0.0462103933095932, 0.05431951582431793, -0.001674334635026753, 0.09341114014387131, -0.29861095547676086, 0.1994308978319168, -0.012375800870358944, 0.13001380860805511, -0.009924291633069515, 0.029794177040457726, -0.06183251738548279, -0.006549533922225237, 0.15722279250621796, -0.005769823212176561, 0.03120666928589344, -0.062244340777397156, -0.09559736400842667, 0.019945316016674042, 0.022402048110961914, -0.07227900624275208, 0.08954395353794098, 0.029143942520022392, 0.0338684618473053, -0.013286571949720383, -0.10330583155155182, -0.1273866593837738, -0.1176968589425087, -0.014262128621339798, -0.10049738734960556, 0.07097763568162918, -0.03814944997429848, -0.05174819007515907, -0.013235331512987614, 0.13678127527236938, -0.11472204327583313, -0.09885338693857193, -0.09647931158542633, 0.025122864171862602, 0.09580255299806595, -0.04457951337099075, 0.0005715892766602337, 0.020037364214658737, 0.00044463781523518264, -0.00016673223581165075, 0.01603374071419239, 0.10985926538705826, -0.07103558629751205, -0.1147211343050003, -0.04887157678604126, 0.12409214675426483, 0.12449724972248077, 0.05467572063207626, -0.02025502175092697, 0.006602148525416851, -0.007514742203056812, -0.0741695761680603, -0.0016096943290904164, 0.000996710848994553, 0.05376357212662697, 0.013741242699325085, -0.07044026255607605, -0.018896058201789856, -0.09331170469522476, -0.05685710534453392, 0.09724694490432739, 0.1337679922580719, -0.04566933959722519, 0.06674084812402725, 0.176254540681839, -0.12012047320604324, -0.15548649430274963, 0.018206115812063217, 0.11045011878013611, 0.08037164062261581, -0.0716707706451416, -0.2315942496061325, 0.0007959893555380404, 0.09999679028987885, 0.011481226421892643, -0.017497099936008453, -0.4410743713378906, -0.12584543228149414, 0.10005351155996323, 0.08945632725954056, -0.03027188591659069, -0.08504584431648254, -0.030263379216194153, 0.029550908133387566, -0.03127986192703247, 0.07576099783182144, -0.013819837011396885, 0.08506865054368973, 0.027555199339985847, -0.04422885179519653, 0.045625727623701096, -0.04976291209459305, 0.13019739091396332, 0.05914288014173508, 0.04708097502589226, -0.0406181663274765, 0.036907851696014404, -0.0014553444925695658, -0.01836201548576355, 0.14632238447666168, 0.04410485923290253, 0.031153259798884392, -0.20646703243255615, -0.06975209712982178, -0.0819772332906723, -0.006367616355419159, -0.07345221191644669, 
-0.04663538560271263, -0.035035140812397, 0.07699074596166611, 0.05192691460251808, -0.01133258081972599, -0.02217746153473854, -0.06925825029611588, -0.017883533611893654, 0.07807885110378265, 0.10252390801906586, 0.07746339589357376, -0.09099864959716797, 0.016691604629158974, 0.046618230640888214, 0.1006493866443634, -0.1409345418214798, -0.02562580071389675, 0.12463761866092682, -0.025091296061873436, 0.12496224790811539, -0.008984911255538464, -0.14810262620449066, 0.017840195447206497, 0.04694876819849014, -0.06868899613618851, -0.12314295768737793, -0.011435439810156822, -0.06288150697946548, -0.03009011037647724, -0.05239778757095337, 0.07013502717018127, -0.1001943051815033, -0.036101438105106354, -0.018619239330291748, 0.03029656410217285, -0.06833327561616898, 0.22842402756214142, 0.04220091551542282, 0.052562057971954346, -0.06460124254226685, 0.13021476566791534, 0.1106000691652298, -0.13860470056533813, 0.023420218378305435, 0.16840218007564545, -0.08985771238803864, -0.05054648220539093, 0.01767401583492756, 0.14111435413360596, -0.05041512846946716, -0.08264170587062836, -0.06993500143289566, -0.050279900431632996, 0.06258413940668106, -0.0031329537741839886, 0.0364232063293457, 0.02018115296959877, -0.030433131381869316, 0.0014333545695990324, -0.11418230831623077, 0.050597283989191055, 0.10086816549301147, -0.0007024184451438487, -0.024067116901278496, 0.17876404523849487, 0.05494356155395508, 0.04278426244854927, -0.0066065010614693165, -0.03462176024913788, -0.059906333684921265, 0.07502572983503342, -0.023899955675005913, -0.015575519762933254, -0.050777215510606766, -0.012027928605675697, -0.03024166077375412, -0.0020470828749239445, 0.006961698178201914, 0.013966108672320843, -0.06445986777544022, -0.027822697535157204, -0.04075229912996292, 0.04780995100736618, -0.06322117149829865, -0.0028010683599859476, -0.012942940928041935, -0.06156158447265625, 0.06456267088651657, 0.0111681018024683, -0.001958901062607765, 0.015428990125656128, -0.01957845501601696, 0.08270572125911713, -0.019700776785612106, 0.003777459729462862, -0.005743488669395447, -0.08873439580202103, 0.04760514199733734, 0.00008725890074856579, -0.014789077453315258, -0.01543144416064024, 0.04674991965293884, -0.14001518487930298, 0.07073383033275604, -0.013551570475101471, -0.02029890939593315, -0.06818518787622452, 0.1238764151930809, 0.029443543404340744, 0.07513082772493362, 0.10243766009807587, -0.0809321328997612, 0.06479714065790176, -0.13860493898391724, -0.04637455567717552, 0.02052018977701664, 0.02898821234703064, -0.03113587573170662, -0.05732210353016853, 0.06248913332819939, -0.034515224397182465, 0.06917119026184082, 0.08251366019248962, 0.0647611990571022, 0.026000654324889183, -0.1252800077199936, 0.00272655813023448, 0.04461316764354706, 0.06506361067295074, 0.0005123993614688516, 0.021894488483667374, -0.018122423440217972, 0.06955842673778534, -0.012978935614228249, 0.08060560375452042, 0.11783022433519363, 0.22750554978847504, 0.09298188984394073, 0.10380588471889496, -0.04226211458444595, -0.11219074577093124, -0.09818188846111298, 0.10355806350708008, 0.006809200160205364, 0.016406843438744545, -0.01999492570757866, 0.11789770424365997, 0.08964012563228607, -0.1576870232820511, 0.08085913956165314, 0.010307406075298786, -0.09724494069814682, -0.08596017956733704, -0.060983289033174515, -0.022794637829065323, -0.07585747539997101, -0.014796785078942776, -0.08616157621145248, 0.027906101197004318, 0.053229279816150665, 0.06774294376373291, -0.04509587958455086, 
0.14229291677474976, 0.014041903428733349, -0.08643337339162827, 0.0910378098487854, -0.003180808387696743, 0.12353840470314026, -0.08996614068746567, 0.021665895357728004, 0.014459321275353432, 0.009331363253295422, 0.05759076029062271, 0.01792563684284687, -0.04151299595832825, 0.030396653339266777, 0.045494768768548965, -0.03690233826637268, -0.00021383863349910825, 0.04829256609082222, 0.11870457231998444, 0.12336955219507217, 0.06975676864385605, -0.04604298993945122, -0.022398674860596657, 0.2139436900615692, -0.03278487175703049, -0.09480641782283783, -0.16697683930397034, 0.17484493553638458, 0.08136739581823349, 0.03253662958741188, 0.03222620114684105, -0.10624708235263824, -0.007318580057471991, 0.2306792438030243, 0.13329428434371948, -0.04450622573494911, -0.052306052297353745, 0.02904173731803894, -0.0023736509028822184, 0.02403295412659645, 0.1165950819849968, 0.043775077909231186, 0.2253771424293518, -0.10993187874555588, 0.03211071342229843, -0.0760536715388298, -0.04393528774380684, -0.017242563888430595, 0.16796424984931946, 0.0010635365033522248, -0.014266288839280605, -0.04609484225511551, 0.11989924311637878, 0.0019708522595465183, -0.2018870711326599, 0.0680384561419487, -0.06301509588956833, -0.136224627494812, -0.03229624405503273, 0.004493870306760073, 0.005838486365973949, 0.057247284799814224, 0.0045241741463541985, -0.005406249314546585, 0.13638006150722504, 0.037175215780735016, -0.05674497038125992, -0.15360799431800842, 0.07152838259935379, 0.001899657305330038, 0.16853918135166168, -0.00849964376538992, 0.0774221196770668, 0.07485655695199966, 0.027030780911445618, -0.11000024527311325, 0.08329622447490692, 0.04581807553768158, 0.043821122497320175, 0.07147596776485443, 0.07893896847963333, -0.023197446018457413, 0.06717755645513535, 0.02361820638179779, -0.11506471782922745, 0.0615280382335186, -0.11474582552909851, -0.04319721832871437, -0.1666661649942398, 0.04402175918221474, -0.04515014588832855, 0.11559572070837021, 0.2108372300863266, -0.02495543844997883, 0.01919216476380825, -0.0680951401591301, 0.0518830344080925, -0.017032885923981667, 0.16973009705543518, 0.010418513789772987, -0.18967129290103912, 0.005093165207654238, -0.06477001309394836, 0.02526482194662094, -0.2196788340806961, -0.012342613190412521, 0.019508324563503265, -0.08567919582128525, -0.02645375393331051, 0.11371379345655441, 0.029591228812932968, 0.07331667095422745, -0.03043682500720024, -0.01986110210418701, -0.025608684867620468, 0.14541803300380707, -0.11905694752931595, -0.0890338197350502 ]
null
null
transformers
# legal_t5_small_trans_de_es_small_finetuned model

Model for translating legal text from German (Deutsch) to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). The model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_es_small_finetuned was initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_de_es_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translating legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

# Build a translation pipeline; note that the tokenizer is loaded from the
# base (non-finetuned) checkpoint, as in the original card.
pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_es_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(
        pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_es",
        do_lower_case=False,
        skip_special_tokens=True,
    ),
    device=0,
)

de_text = "Bei einer Kombination von Artikel 124 Absatz 14 mit Artikel 136 AEUV scheint die in den Artikeln 121 und 126 AEUV"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_es_small_finetuned model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence that were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_es_small_finetuned | 47.006 |

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
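The Pretraining section above says the model had to predict randomly masked portions of a sentence. The following is a simplified, self-contained sketch of that objective in the T5 style, where masked positions are replaced by sentinel tokens; single-token masking and the 15% rate are assumptions made for illustration.

```python
# Simplified sketch of the masked-language-modelling objective described in
# the Pretraining section: masked portions of the input are replaced by
# sentinel tokens, and the target lists what was masked. Single-token
# masking and the 15% rate are illustrative assumptions.
import random

def mask_tokens(tokens, mask_rate=0.15, seed=0):
    rng = random.Random(seed)
    inputs, targets, sentinel = [], [], 0
    for tok in tokens:
        if rng.random() < mask_rate:
            inputs.append(f"<extra_id_{sentinel}>")   # sentinel replaces the token
            targets.extend([f"<extra_id_{sentinel}>", tok])
            sentinel += 1
        else:
            inputs.append(tok)
    return " ".join(inputs), " ".join(targets)

src = "die Kommission prüft den Vorschlag der Mitgliedstaaten".split()
model_input, model_target = mask_tokens(src)
print(model_input)
print(model_target)
```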
{"language": "Deustch Spanish", "tags": ["translation Deustch Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Bei einer Kombination von Artikel 124 Absatz 14 mit Artikel 136 AEUV scheint die in den Artikeln 121 und 126 AEUV"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_es_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_es\_small\_finetuned model
=======================================================

Model for translating legal text from German (Deutsch) to Spanish. It was first released in this repository. The model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_es\_small\_finetuned was initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_de\_es\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translating legal texts from German to Spanish.

### How to use

Here is how to use this model to translate legal text from German to Spanish in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_es\_small\_finetuned model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 60 million parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding) that is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence that were masked randomly.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> 
>
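Test results for these models are reported as BLEU scores. A minimal sketch of computing a corpus-level BLEU score with the `sacrebleu` library is shown below; the hypothesis and reference sentences are placeholders, not data from the evaluation set.

```python
# Minimal sketch of scoring translations with corpus-level BLEU via
# sacrebleu, the metric reported in the Test results tables.
# The sentences below are placeholders, not evaluation data.
import sacrebleu

hypotheses = ["the commission examined the proposal"]
references = [["the commission has examined the proposal"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```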
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_es\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07810819894075394, 0.08982646465301514, -0.0036759059876203537, 0.08712786436080933, 0.0578426718711853, 0.011744852177798748, 0.024366019293665886, 0.09446462988853455, -0.03466985747218132, 0.08783165365457535, 0.03427926450967789, -0.006688620895147324, 0.06590284407138824, 0.016955936327576637, 0.0582607127726078, -0.2170662134885788, 0.00486500421538949, -0.03978864103555679, -0.01079324446618557, 0.08891087770462036, 0.08851274847984314, -0.05853652581572533, 0.04365454986691475, -0.041484732180833817, -0.022208258509635925, 0.03484365716576576, -0.09187070280313492, -0.060487646609544754, 0.08715607225894928, 0.09288620948791504, 0.09020768851041794, -0.011496071703732014, 0.0636914074420929, -0.17857573926448822, -0.00404355488717556, 0.07848353683948517, -0.0036360782105475664, 0.04285143315792084, 0.14276175200939178, -0.005410896148532629, 0.16189351677894592, -0.05730432644486427, 0.03213651850819588, 0.04097679629921913, -0.14234186708927155, -0.12650763988494873, -0.05348211154341698, 0.0037458331789821386, 0.08657030761241913, 0.13030779361724854, -0.05173184350132942, 0.0530887134373188, -0.04934138432145119, 0.05980003625154495, 0.0656566172838211, -0.2046874314546585, -0.034107763320207596, 0.02582898549735546, 0.05060257762670517, 0.0818362683057785, -0.032564543187618256, -0.01692083291709423, 0.05988297238945961, 0.06608980149030685, 0.016805727034807205, -0.06848683953285217, -0.06020474061369896, -0.0529283732175827, -0.12445462495088577, -0.06819906830787659, 0.14018882811069489, 0.01618126779794693, -0.044622112065553665, -0.08048874139785767, -0.06669294089078903, -0.06224779784679413, -0.0020593192894011736, -0.042713191360235214, 0.030818980187177658, -0.005536895710974932, 0.08253059536218643, -0.009850920177996159, -0.10956180840730667, -0.05786421522498131, -0.07293771952390671, 0.12453849613666534, 0.045215506106615067, 0.004770298488438129, 0.025420377030968666, 0.09207157790660858, -0.10149608552455902, -0.09092140942811966, 0.009369305334985256, 0.022052936255931854, -0.10249447077512741, 0.0023632340598851442, -0.014007787220180035, -0.1735372096300125, -0.007374851498752832, 0.024512864649295807, -0.0831490308046341, 0.050427600741386414, 0.07371155172586441, 0.03673544153571129, 0.06841683387756348, 0.10943818092346191, -0.1196523904800415, -0.1191067025065422, -0.03162115439772606, -0.004281677771359682, 0.009231457486748695, 0.017117811366915703, -0.06676295399665833, -0.03888171911239624, 0.0053696767427027225, 0.04191359132528305, 0.0251377671957016, 0.013813505880534649, -0.027700698003172874, -0.04254192113876343, 0.12507203221321106, -0.10555572807788849, -0.0003191525174770504, 0.003031652420759201, -0.09569035470485687, -0.02185738831758499, 0.06313712894916534, -0.019019654020667076, -0.11667933315038681, 0.06152206286787987, -0.031774915754795074, -0.039534639567136765, -0.11503483355045319, -0.17152465879917145, -0.001213578274473548, 0.0002782463561743498, -0.05343662202358246, -0.10660068690776825, -0.12455597519874573, -0.10917138308286667, 0.032844770699739456, -0.07619471102952957, 0.006085174158215523, -0.07170753180980682, 0.009237129241228104, -0.007166185881942511, -0.014174802228808403, 0.10568253695964813, -0.04203694686293602, 0.038341421633958817, 0.046500369906425476, 0.0717637687921524, 0.02243390865623951, 0.0341348871588707, -0.12004188448190689, 0.050263743847608566, -0.1538199782371521, 0.1616695672273636, -0.02468138001859188, 0.009792832657694817, -0.13459941744804382, -0.04922088608145714, -0.08441299200057983, 
0.0716424435377121, 0.0656169205904007, 0.12699095904827118, -0.19898055493831635, -0.011756119318306446, 0.1976955085992813, -0.07732115685939789, -0.07106183469295502, 0.11501612514257431, -0.022065555676817894, 0.04582656919956207, 0.08729482442140579, 0.11723072826862335, 0.06797323375940323, -0.010240288451313972, -0.034803181886672974, 0.0006522013572975993, 0.026946334168314934, 0.07495135068893433, 0.08823060989379883, -0.0821409523487091, 0.038166653364896774, 0.018406933173537254, -0.0003095848369412124, 0.0008059490937739611, -0.01381455734372139, -0.028781428933143616, 0.021522851660847664, -0.03750259056687355, -0.044464655220508575, 0.031185245141386986, 0.008976773358881474, -0.053010109812021255, -0.08050326257944107, -0.026126889511942863, 0.09487300366163254, -0.045700110495090485, 0.01223415695130825, -0.009483439847826958, -0.05152652785181999, -0.11938116699457169, 0.013792076148092747, -0.16818562150001526, -0.011391280218958855, 0.029824966564774513, -0.06910117715597153, 0.10797800868749619, 0.06305549293756485, 0.04784926772117615, 0.08792965859174728, -0.05750172585248947, -0.031807396560907364, -0.0012981309555470943, -0.01883680932223797, -0.09308378398418427, -0.12146750092506409, -0.03726625815033913, -0.024053500965237617, -0.020836403593420982, -0.11737333238124847, -0.0048983353190124035, -0.06691086292266846, 0.07636219263076782, 0.0005908770835958421, -0.014800171367824078, 0.02753831259906292, 0.07691311091184616, -0.028329022228717804, -0.047566771507263184, 0.026731187477707863, -0.006613181438297033, -0.03298001363873482, 0.08526692539453506, -0.17224855720996857, -0.11242502927780151, 0.07938476651906967, 0.005698479246348143, -0.1276520937681198, -0.008227254264056683, -0.009391088970005512, -0.06155850365757942, -0.058233994990587234, -0.0611063688993454, 0.254260778427124, 0.03046991676092148, 0.14390124380588531, -0.11956283450126648, -0.02315872721374035, 0.008367610163986683, -0.022698085755109787, -0.004392803646624088, 0.17385706305503845, 0.08272569626569748, -0.14541351795196533, 0.08962355554103851, -0.002262475900352001, -0.022788919508457184, 0.09784200042486191, 0.06528082489967346, -0.10272704064846039, -0.007162946276366711, 0.043377701193094254, 0.0102244196459651, 0.05882241576910019, -0.09566143155097961, -0.003009323263540864, 0.023451143875718117, 0.0580085925757885, 0.07023453712463379, -0.08520372956991196, 0.06202045828104019, 0.06836412847042084, -0.023342227563261986, 0.03502437472343445, -0.04912913963198662, -0.032675839960575104, 0.09784393757581711, 0.03019976243376732, -0.0328526608645916, -0.046636227518320084, -0.047852955758571625, -0.10972213745117188, 0.1960013210773468, -0.06392090767621994, -0.22365067899227142, -0.12038197368383408, 0.07791594415903091, -0.028461121022701263, 0.05171455070376396, 0.03364247828722, -0.04215557500720024, -0.062161728739738464, -0.10644479095935822, 0.10471010208129883, -0.07470934838056564, -0.059018537402153015, -0.12923870980739594, 0.034829989075660706, 0.005190242547541857, -0.1235116720199585, 0.025359531864523888, 0.006990527734160423, -0.022048061713576317, -0.006302626803517342, -0.038274772465229034, 0.12237536907196045, 0.13773930072784424, -0.004260988440364599, -0.039433740079402924, -0.004507455509155989, 0.12824666500091553, -0.10254759341478348, 0.05919409170746803, 0.07822871208190918, 0.04412396252155304, 0.021433832123875618, 0.14847879111766815, 0.023694084957242012, -0.07145725935697556, 0.03687085211277008, 0.05501430109143257, -0.02434309758245945, 
-0.2299858182668686, -0.10838419198989868, -0.06813012808561325, 0.008833573199808598, 0.1245410367846489, 0.037552837282419205, -0.040991514921188354, 0.03379841148853302, -0.058272834867239, 0.06415490061044693, -0.011583576910197735, 0.05530056357383728, 0.04669148474931717, -0.012831236235797405, 0.07278908044099808, -0.0604432076215744, -0.061499446630477905, 0.10485059022903442, 0.045801009982824326, 0.18184159696102142, -0.05269889906048775, 0.2031286209821701, 0.05772264301776886, 0.021797258406877518, 0.0017726607620716095, 0.05302973464131355, -0.03783150762319565, 0.02770625613629818, -0.0404125414788723, -0.06226819008588791, -0.03542657569050789, 0.0509071908891201, 0.008756288327276707, 0.0128489900380373, -0.06358586996793747, -0.07376617193222046, 0.044418878853321075, 0.19352208077907562, 0.053550612181425095, -0.20623941719532013, -0.05294879525899887, 0.011207425966858864, -0.06080657243728638, -0.07182565331459045, 0.014657820574939251, 0.14260469377040863, -0.09239137917757034, 0.021559828892350197, 0.017840318381786346, 0.12305741757154465, -0.1354992836713791, -0.009138652123510838, 0.03281252086162567, 0.04995466768741608, -0.009610620327293873, 0.1313565969467163, -0.24578744173049927, 0.1463858187198639, 0.01456680428236723, 0.05373956263065338, -0.03640983626246452, 0.018176646903157234, -0.04684321954846382, -0.0031635023187845945, 0.11469268798828125, 0.012333817780017853, -0.0018564603524282575, -0.11018885672092438, -0.09231961518526077, 0.0018504427280277014, 0.048925455659627914, -0.07332000136375427, 0.09534523636102676, 0.056317366659641266, 0.016388073563575745, -0.020236046984791756, 0.013105184771120548, -0.05573626980185509, -0.16737714409828186, 0.004002132918685675, -0.043834757059812546, -0.009305272251367569, -0.012722729705274105, -0.020240122452378273, -0.06337512284517288, 0.18335378170013428, -0.09861583262681961, -0.0794016420841217, -0.07670244574546814, 0.005516891833394766, 0.14099200069904327, -0.07129400968551636, 0.003412338439375162, 0.0015656605828553438, 0.03755750134587288, -0.023377545177936554, -0.027540577575564384, 0.10131832212209702, -0.08893810212612152, -0.0941501185297966, -0.07833202183246613, 0.10674279928207397, 0.07326851785182953, 0.04354437068104744, -0.008676647208631039, 0.028689859434962273, -0.011227916926145554, -0.11213117092847824, -0.029726142063736916, 0.02674758806824684, 0.11285018920898438, 0.05664525181055069, -0.04671033099293709, -0.040497422218322754, -0.06702215224504471, -0.06006588041782379, 0.07004823535680771, 0.14943280816078186, -0.038871318101882935, 0.028274081647396088, 0.19509395956993103, -0.10968245565891266, -0.17824332416057587, -0.05642428621649742, 0.08465898782014847, 0.07418044656515121, -0.012004135176539421, -0.17989473044872284, 0.027838408946990967, 0.10738816857337952, 0.00811653770506382, 0.06180417165160179, -0.4029639959335327, -0.13193254172801971, 0.07010240107774734, 0.03231697157025337, -0.040732935070991516, -0.12517841160297394, -0.06755789369344711, -0.06902439892292023, -0.0156528539955616, 0.08856762200593948, -0.03483392670750618, 0.09151208400726318, -0.00045053462963551283, 0.017992310225963593, 0.05149675905704498, -0.03271324560046196, 0.1449766606092453, 0.02595849335193634, 0.04093317687511444, -0.06212931498885155, 0.06073116138577461, 0.009797142818570137, -0.015205810777842999, 0.14306038618087769, -0.030530264601111412, 0.0500306598842144, -0.14198589324951172, -0.062077268958091736, -0.06694594025611877, 0.0228294488042593, -0.04311920329928398, 
-0.06541293859481812, -0.047626372426748276, 0.036400195211172104, 0.049945488572120667, -0.00469166086986661, -0.013494209386408329, -0.057436540722846985, 0.0033660901244729757, 0.16604913771152496, 0.11783196777105331, 0.034046538174152374, -0.1155928298830986, 0.02870967425405979, 0.011216721497476101, 0.08336114138364792, -0.06505189090967178, 0.005884523503482342, 0.1372222751379013, 0.013520421460270882, 0.10588538646697998, -0.012862900272011757, -0.14730340242385864, -0.003965293988585472, 0.06152798607945442, -0.07257634401321411, -0.1279337853193283, -0.006486725527793169, 0.036534663289785385, -0.06673284620046616, -0.039151694625616074, 0.09654409438371658, -0.08044968545436859, -0.028227046132087708, 0.007710787933319807, 0.026759453117847443, -0.028451938182115555, 0.2063785344362259, 0.03627024218440056, 0.037884555757045746, -0.06286060065031052, 0.11269690841436386, 0.13932225108146667, -0.16966867446899414, 0.0148016307502985, 0.1876743733882904, -0.07927924394607544, -0.06481819599866867, 0.015766598284244537, 0.1295376420021057, -0.0577063113451004, -0.06970357894897461, -0.038781385868787766, -0.05578883737325668, 0.02984517440199852, -0.002880553947761655, 0.036716777831315994, 0.03853904455900192, -0.013948667794466019, -0.012457020580768585, -0.08710932731628418, 0.07963584363460541, 0.0766616240143776, 0.038438621908426285, -0.025973405689001083, 0.13092140853405, 0.02993686869740486, -0.02866280823945999, -0.012332519516348839, 0.009343105368316174, -0.07757647335529327, 0.017852747812867165, -0.08765605092048645, 0.02306654304265976, -0.04876728355884552, -0.013752713799476624, -0.02283555455505848, 0.010888800024986267, -0.010720004327595234, -0.006445429287850857, -0.026927130296826363, -0.04786137863993645, -0.04085675999522209, 0.02856038324534893, -0.08491206914186478, -0.03177354112267494, 0.006479363888502121, -0.022578204050660133, 0.0430288165807724, -0.011785829439759254, 0.01320821326225996, -0.0074441321194171906, -0.02609342895448208, 0.07564596831798553, 0.02237461879849434, 0.04806003347039223, -0.004034171812236309, -0.0826745331287384, 0.020206745713949203, 0.025984425097703934, -0.009007950313389301, -0.017704160884022713, 0.01711096614599228, -0.156402587890625, 0.009428600780665874, -0.004725984297692776, -0.04349435120820999, -0.07400541752576828, 0.09349904209375381, 0.04687919840216637, 0.05336510017514229, 0.10265649855136871, -0.07970152795314789, 0.07879497110843658, -0.16351336240768433, -0.014682220295071602, 0.01680614985525608, 0.020504191517829895, -0.022021397948265076, 0.002695034258067608, 0.05248259752988815, -0.05098232626914978, 0.1376381516456604, 0.044213324785232544, 0.08386778831481934, 0.015915397554636, -0.06827440112829208, -0.00025319206179119647, 0.025478458032011986, 0.07896558940410614, -0.023462219163775444, -0.018411168828606606, -0.07873527705669403, 0.10383978486061096, 0.0027483939193189144, 0.06952738016843796, 0.03757914900779724, 0.13454537093639374, 0.12246797978878021, 0.037893299013376236, 0.00030263426015153527, -0.10652527958154678, -0.06081365421414375, 0.0506393201649189, 0.006887309718877077, 0.033871736377477646, -0.011225261725485325, 0.07902979850769043, 0.11654693633317947, -0.1369265913963318, 0.1291910707950592, -0.0039356607012450695, -0.07293207943439484, -0.039481326937675476, -0.13716916739940643, -0.03670249506831169, -0.01945120096206665, -0.05337667092680931, -0.10741742700338364, 0.013096326030790806, 0.06475405395030975, 0.03685484454035759, -0.03770926594734192, 0.13812483847141266, 
-0.04864463582634926, -0.12639600038528442, 0.04174257814884186, 0.028439350426197052, 0.10723002254962921, 0.026630528271198273, 0.04121381789445877, 0.04916169494390488, 0.028485620394349098, 0.04276007041335106, 0.0612930990755558, -0.012753437273204327, -0.000428137049311772, 0.013077056035399437, -0.04672669619321823, -0.03762846812605858, 0.014441112987697124, 0.07164768129587173, 0.2121027708053589, 0.05241267383098602, -0.06608805060386658, -0.01885717362165451, 0.1993379294872284, -0.06011386215686798, -0.0798250064253807, -0.09691374003887177, 0.22338874638080597, 0.03666079044342041, 0.04194021597504616, -0.006725947372615337, -0.11112586408853531, -0.009906762279570103, 0.14776214957237244, 0.19026616215705872, -0.05350619927048683, -0.03432941809296608, 0.02083231694996357, -0.0017432125750929117, 0.038698047399520874, 0.046911414712667465, 0.022276146337389946, 0.29324600100517273, -0.08267346024513245, 0.0876305028796196, -0.04171931743621826, 0.02414173260331154, -0.011599973775446415, 0.1746376007795334, -0.0010755903786048293, 0.02145390957593918, -0.06272675842046738, 0.08773037046194077, -0.01944049634039402, -0.19344526529312134, 0.02627975307404995, -0.08883285522460938, -0.11851031333208084, 0.0013018331956118345, -0.010844198986887932, 0.057994890958070755, 0.07373391836881638, 0.01601497083902359, 0.029729750007390976, 0.07087650895118713, 0.016779592260718346, -0.11370596289634705, -0.12399863451719284, 0.007883098907768726, -0.006316046230494976, 0.13196103274822235, -0.0029809752013534307, 0.1216677725315094, 0.08405500650405884, 0.00814717449247837, -0.10888036340475082, 0.09048980474472046, 0.022430280223488808, 0.03826814517378807, 0.10219297558069229, 0.0723215788602829, 0.006671376526355743, 0.05956786125898361, 0.04271109402179718, -0.08675554394721985, 0.02534535340964794, -0.02393931709229946, -0.008308453485369682, -0.1377800554037094, 0.08985164016485214, -0.04138440266251564, 0.13768847286701202, 0.1803404837846756, -0.019369255751371384, -0.01421177014708519, -0.04650199040770531, 0.0210666973143816, -0.004828456789255142, 0.11168581992387772, -0.01228468306362629, -0.15988610684871674, 0.018221884965896606, -0.07728678733110428, 0.030231166630983353, -0.26694151759147644, -0.015660399571061134, 0.025344280526041985, -0.05931902676820755, -0.0031101463828235865, 0.06782498955726624, 0.02816946431994438, 0.058370936661958694, -0.051686570048332214, -0.07793077826499939, -0.0031312641222029924, 0.10473768413066864, -0.0998472273349762, -0.11290087550878525 ]
null
null
transformers
# legal_t5_small_trans_de_fr model

Model for translating legal text from Deutsch to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_fr is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_fr"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "stellt fest, dass Leistung und Effizienz nicht in einer standardisierten Art und Weise gemessen werden; fordert die interinstitutionelle Arbeitsgruppe für die Agenturen auf, sich mit dieser Frage zu befassen;"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_fr model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_fr | 47.78|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
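The training-procedure paragraph above names AdaFactor with an inverse square root learning-rate schedule. As a hedged sketch (not the authors' actual training script), here is how that optimizer setup could look with the `Adafactor` implementation shipped in `transformers`; the `t5-small` checkpoint is only a stand-in of the same size class, and the flag choices are assumptions for illustration.

```python
import torch
from transformers import Adafactor, T5ForConditionalGeneration

# Stand-in model of the same size class as the card describes.
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# With relative_step=True, Adafactor derives its own step size that decays
# with the inverse square root of the step count, matching the schedule
# described in the card; warmup_init adds a warm-up phase before the decay.
optimizer = Adafactor(
    model.parameters(),
    lr=None,               # let Adafactor compute the relative step size
    relative_step=True,    # inverse square root decay
    warmup_init=True,      # assumption: warm-up before decay
    scale_parameter=True,
)

# One illustrative update step on a dummy batch (shapes only).
loss = model(input_ids=torch.tensor([[0]]), labels=torch.tensor([[0]])).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```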
{"language": "Deustch French", "tags": ["translation Deustch French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "stellt fest, dass Leistung und Effizienz nicht in einer standardisierten Art und Weise gemessen werden; fordert die interinstitutionelle Arbeitsgruppe f\u00fcr die Agenturen auf, sich mit dieser Frage zu befassen;"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_fr
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_fr model
=====================================

Model for translating legal text from Deutsch to French. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_fr is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_fr model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> >
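The evaluation section above reports a BLEU score on the held-out translation test set. As a minimal sketch of how such a score could be computed (the file names are placeholders, not artifacts of the original release), the `sacrebleu` library can score a list of model translations against references:

```python
import sacrebleu

# Placeholder files: one sentence per line, hypotheses aligned with references.
with open("hypotheses.fr", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("references.fr", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# corpus_bleu takes the hypothesis list and a list of reference streams
# (one inner list per reference set).
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.2f}")  # the card reports 47.78 for this model
```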
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12200912088155746, 0.08348623663187027, -0.0026780746411532164, 0.08037551492452621, 0.08333821594715118, 0.003156584920361638, 0.07882110029459, 0.08302754163742065, -0.07923481613397598, 0.06652433425188065, 0.07152310758829117, 0.022155918180942535, 0.07903017848730087, 0.11457498371601105, 0.05725158751010895, -0.21406781673431396, 0.027073362842202187, -0.01684270054101944, -0.01830393262207508, 0.1342780739068985, 0.12441212683916092, -0.07709881663322449, 0.028013361617922783, -0.013188441284000874, -0.106595478951931, 0.011928007937967777, -0.06151210144162178, -0.06787680834531784, 0.08577442914247513, 0.047387417405843735, 0.11111443489789963, 0.021323740482330322, 0.08238741010427475, -0.17182056605815887, -0.0028847467619925737, 0.08694490790367126, 0.04036325961351395, 0.038894299417734146, 0.07794895023107529, -0.019564392045140266, 0.14460448920726776, -0.02777804248034954, 0.06027376651763916, 0.02260899543762207, -0.1128712072968483, -0.1339830607175827, -0.06947365403175354, 0.032273899763822556, 0.1300942748785019, 0.15157760679721832, -0.05291697010397911, 0.05420517921447754, -0.1296522468328476, 0.05689619854092598, 0.06757964193820953, -0.26456794142723083, -0.06457259505987167, 0.013907238841056824, 0.047661732882261276, 0.0718708410859108, -0.05310976505279541, -0.039078034460544586, 0.032987695187330246, 0.024925783276557922, 0.013554833829402924, -0.028553830459713936, 0.012706974521279335, -0.013250176794826984, -0.16358989477157593, -0.09817273914813995, 0.15746936202049255, 0.0030060994904488325, -0.08350539207458496, -0.09266635030508041, -0.032931823283433914, -0.15302889049053192, -0.006239257752895355, -0.06878366321325302, 0.05832667648792267, -0.004002540837973356, 0.03533800691366196, -0.01578267477452755, -0.10668489336967468, -0.11348532140254974, 0.0264249537140131, 0.07240665704011917, 0.09290053695440292, -0.01593838445842266, 0.010546891950070858, 0.16685837507247925, -0.006493628025054932, -0.0781094878911972, -0.019552214071154594, 0.0003561756166163832, -0.11496178805828094, -0.01980345882475376, -0.0270902868360281, -0.1201980784535408, -0.05457298085093498, 0.11839540302753448, -0.04593343287706375, 0.0578334666788578, 0.027580946683883667, 0.03777362033724785, 0.008989929221570492, 0.16643060743808746, -0.08385177701711655, -0.07456959784030914, -0.06496740877628326, 0.0618802011013031, -0.07311348617076874, 0.03369787707924843, -0.011827532202005386, -0.014554322697222233, 0.059138067066669464, 0.07313434779644012, -0.07791236788034439, -0.010006910189986229, -0.03977038711309433, -0.019009873270988464, 0.06382288783788681, -0.11764296889305115, -0.02862975001335144, 0.00025003065820783377, -0.11292020976543427, -0.022277969866991043, 0.07561049610376358, -0.0057344259694218636, -0.13598787784576416, 0.05229170620441437, -0.028443410992622375, -0.02248680219054222, -0.1355883926153183, -0.10451814532279968, -0.03654668852686882, -0.04645967856049538, -0.04872353374958038, -0.06288690865039825, -0.15220996737480164, -0.10687792301177979, 0.08117873221635818, -0.06129331514239311, -0.041109468787908554, -0.10266312211751938, -0.020877523347735405, 0.004916797392070293, -0.0199156254529953, 0.11343802511692047, -0.02975626289844513, 0.0783640444278717, 0.0028239309322088957, 0.04266439005732536, 0.13889537751674652, 0.07893072068691254, -0.09146711230278015, 0.018713919445872307, -0.10858047008514404, 0.18600109219551086, -0.02275599166750908, -0.024199295789003372, -0.15943995118141174, -0.09013677388429642, -0.06912585347890854, 
0.06317984312772751, 0.11478304117918015, 0.14012521505355835, -0.1547924280166626, -0.020588884130120277, 0.2225940078496933, -0.0869891345500946, -0.03092154487967491, 0.13148759305477142, -0.04932233691215515, 0.12179454416036606, 0.08519723266363144, 0.16950128972530365, 0.06014095991849899, -0.08288031071424484, 0.000590487034060061, -0.023397788405418396, -0.000838186708278954, 0.007989752106368542, 0.09452908486127853, -0.053152795881032944, -0.07667209953069687, -0.019195327535271645, -0.09330616146326065, 0.0313570536673069, -0.06221006065607071, -0.0622907355427742, 0.02446300908923149, -0.04822450876235962, -0.018132314085960388, 0.059183575212955475, 0.047064680606126785, -0.052359312772750854, -0.13156050443649292, -0.005059964023530483, 0.09599927067756653, -0.06854023039340973, 0.020161420106887817, -0.04628913104534149, -0.05257205665111542, -0.05904287099838257, -0.012849906459450722, -0.1575886458158493, 0.03472938388586044, 0.04516144096851349, -0.03635295480489731, 0.04601883143186569, 0.04811470955610275, 0.04090914502739906, 0.06705602258443832, -0.008313172496855259, -0.057864412665367126, -0.06407789885997772, -0.040745895355939865, -0.11207079887390137, -0.1111067607998848, -0.017145290970802307, -0.024725865572690964, 0.09907174110412598, -0.18033277988433838, 0.031923260539770126, -0.09893934428691864, 0.042258135974407196, -0.01668190397322178, -0.039668720215559006, 0.011248771101236343, 0.03635663539171219, 0.025298232212662697, -0.05967867374420166, 0.0441771037876606, 0.021113041788339615, 0.0563395693898201, 0.08939191699028015, -0.084828682243824, -0.13778622448444366, 0.08948466926813126, 0.05480114743113518, -0.15783347189426422, 0.011837008409202099, -0.04609305411577225, -0.0598461888730526, -0.0476280115544796, 0.007177327293902636, 0.2522194981575012, 0.015884030610322952, 0.15890368819236755, -0.10744206607341766, -0.04882970452308655, -0.005293005611747503, -0.009447637014091015, 0.008210866712033749, 0.1488182246685028, 0.06798006594181061, -0.07206788659095764, 0.05164603143930435, 0.029924213886260986, -0.02172546461224556, 0.14358167350292206, 0.002801603637635708, -0.12628784775733948, 0.012308504432439804, 0.07830444723367691, -0.029165392741560936, 0.08991742134094238, -0.14167095720767975, -0.003502712119370699, 0.010516343638300896, 0.05018662288784981, 0.05721483752131462, -0.15750685334205627, 0.032781947404146194, 0.054196376353502274, -0.05971360206604004, 0.015811359509825706, -0.00769011490046978, -0.05380461364984512, 0.0833527222275734, 0.023896008729934692, -0.05103284865617752, -0.020901354029774666, -0.038702014833688736, -0.13072896003723145, 0.21542087197303772, -0.06967934966087341, -0.14416028559207916, -0.09290793538093567, 0.09814590960741043, 0.043582282960414886, -0.003612764412537217, 0.046544399112463, -0.08675117045640945, -0.04354904964566231, -0.09559262543916702, 0.09561488777399063, -0.060732126235961914, -0.04946880787611008, -0.0978492796421051, -0.007547551300376654, -0.011546011082828045, -0.14351704716682434, 0.028005000203847885, -0.04307207092642784, -0.08395300060510635, -0.011806120164692402, -0.04825452342629433, 0.0700797364115715, 0.14710251986980438, -0.017269087955355644, 0.016780836507678032, -0.005685481242835522, 0.19855934381484985, -0.13076964020729065, 0.010772565379738808, 0.07197905331850052, 0.06029438227415085, 0.015038657002151012, 0.10497011244297028, -0.004164901562035084, -0.08306115865707397, 0.045761313289403915, 0.06237272173166275, -0.017106568440794945, -0.2711167633533478, 
-0.03491056710481644, -0.023205209523439407, -0.03832141309976578, 0.11225270479917526, 0.04192471504211426, 0.027159184217453003, 0.037388384342193604, -0.02121185138821602, 0.0054086484014987946, 0.03787607699632645, 0.04817789047956467, 0.0012971789110451937, 0.005106154829263687, 0.07423792034387589, -0.05214235931634903, -0.03540455177426338, 0.05137757211923599, 0.015515223145484924, 0.23676523566246033, -0.05695498734712601, 0.10916484892368317, 0.08594601601362228, 0.09766233712434769, -0.006032749079167843, 0.07851184159517288, -0.023008162155747414, 0.018371785059571266, -0.01188120897859335, -0.02708522416651249, -0.05268469452857971, 0.0315006785094738, 0.006618070416152477, 0.0011480954708531499, -0.1176358014345169, -0.021028457209467888, 0.02098674140870571, 0.32271361351013184, 0.06384017318487167, -0.23532544076442719, -0.057528432458639145, -0.010920785367488861, -0.07180365920066833, -0.10014879703521729, 0.07314882427453995, 0.08162327110767365, -0.14031589031219482, -0.023786494508385658, -0.023899681866168976, 0.09693305194377899, -0.09091552346944809, -0.04651184007525444, 0.06800292432308197, 0.05989011377096176, -0.007500498555600643, 0.08431895077228546, -0.2943548858165741, 0.19395874440670013, -0.019466521218419075, 0.11733929812908173, -0.0050009917467832565, 0.03308524191379547, -0.04378310590982437, 0.02097688615322113, 0.17025025188922882, 0.0052293515764176846, 0.029804568737745285, -0.0444490872323513, -0.09195467084646225, 0.014254572801291943, 0.037145987153053284, -0.06695713102817535, 0.07474132627248764, 0.022973977029323578, 0.018683118745684624, -0.012242447584867477, -0.10138484090566635, -0.14941845834255219, -0.11638600379228592, -0.006170982960611582, -0.08940412104129791, 0.05893600732088089, -0.03736042231321335, -0.053671058267354965, -0.018830519169569016, 0.15587733685970306, -0.10706674307584763, -0.08391258865594864, -0.0895240381360054, 0.010669409297406673, 0.09671593457460403, -0.04503306746482849, 0.018437080085277557, 0.009884601458907127, 0.011868258938193321, -0.010081032291054726, 0.014666728675365448, 0.09592901915311813, -0.07835466414690018, -0.10077233612537384, -0.04492487013339996, 0.12853504717350006, 0.12688586115837097, 0.05500092729926109, -0.022099368274211884, 0.009880573488771915, -0.02301289327442646, -0.08612056076526642, -0.005510714370757341, -0.01614692248404026, 0.0472462959587574, 0.019639957696199417, -0.06957442313432693, -0.026153123006224632, -0.09149891883134842, -0.037052419036626816, 0.107203409075737, 0.14027327299118042, -0.05730767175555229, 0.0701877623796463, 0.17727605998516083, -0.10364532470703125, -0.1775117963552475, 0.02318563684821129, 0.11224118620157242, 0.06664972007274628, -0.086620032787323, -0.2261953204870224, 0.01846492476761341, 0.08149158954620361, 0.012486238963901997, -0.013643107376992702, -0.42196840047836304, -0.13339509069919586, 0.10162965953350067, 0.07946639508008957, -0.025153420865535736, -0.07160964608192444, -0.006005052477121353, 0.03296930715441704, -0.03426502272486687, 0.08082741498947144, -0.019039331004023552, 0.07752331346273422, 0.02544460818171501, -0.05777091905474663, 0.036044735461473465, -0.052438754588365555, 0.11893031001091003, 0.08379567414522171, 0.0546644888818264, -0.03195461630821228, 0.04599541053175926, 0.0008534235530532897, -0.015252567827701569, 0.15827468037605286, 0.04102065786719322, 0.038311220705509186, -0.20061174035072327, -0.05846957489848137, -0.08198654651641846, 0.002248343313112855, -0.07334858924150467, -0.04832155629992485, 
-0.04524477198719978, 0.0896984115242958, 0.05892104282975197, -0.005226908717304468, -0.05459222197532654, -0.05533035099506378, -0.03381725400686264, 0.08345035463571548, 0.09313879162073135, 0.07777410745620728, -0.09443483501672745, 0.04044639319181442, 0.044923894107341766, 0.09597878158092499, -0.14584361016750336, -0.01310097798705101, 0.12515302002429962, -0.024482956156134605, 0.13532060384750366, -0.005416986998170614, -0.14047737419605255, -0.006856980733573437, 0.0338820181787014, -0.09100858867168427, -0.13147662580013275, -0.017392994835972786, -0.06931635737419128, -0.030045434832572937, -0.053606875240802765, 0.06192634627223015, -0.11544645577669144, -0.02504497580230236, -0.020232923328876495, 0.03280993923544884, -0.08199630677700043, 0.21178454160690308, 0.030588161200284958, 0.05838993936777115, -0.06501927226781845, 0.11793997883796692, 0.10221098363399506, -0.1386040300130844, 0.01921902410686016, 0.17515212297439575, -0.09100015461444855, -0.05525923892855644, 0.013416985049843788, 0.1401887834072113, -0.02706984430551529, -0.07504994422197342, -0.06032668054103851, -0.04710449278354645, 0.06873932480812073, -0.004371167626231909, 0.04075177013874054, 0.015772445127367973, -0.023069478571414948, -0.012112821452319622, -0.12786738574504852, 0.07233663648366928, 0.10867072641849518, -0.004242123570293188, -0.00023745941871311516, 0.1585901826620102, 0.04918798431754112, 0.06117914244532585, -0.006030916702002287, -0.03485899046063423, -0.04677380993962288, 0.06007690355181694, -0.01876780018210411, -0.027324628084897995, -0.042762190103530884, -0.00018300306692253798, -0.029680972918868065, -0.006282344460487366, 0.005167332477867603, 0.01910404860973358, -0.0694107711315155, -0.034270402044057846, -0.02958158776164055, 0.04922763630747795, -0.07922688126564026, -0.0027447508182376623, 0.001108080381527543, -0.06859753280878067, 0.07621968537569046, 0.022311285138130188, 0.0020121673587709665, 0.002935105236247182, -0.009809168986976147, 0.06034315750002861, -0.05129598081111908, 0.007039566989988089, -0.017904886975884438, -0.09902341663837433, 0.05048273131251335, -0.006445492152124643, -0.008896767161786556, -0.017633995041251183, 0.04360181838274002, -0.1293231099843979, 0.06731041520833969, -0.029394885525107384, -0.019854122772812843, -0.07585296034812927, 0.1253664493560791, 0.0182605292648077, 0.09169087558984756, 0.10157176852226257, -0.07300373166799545, 0.05946911498904228, -0.1295689344406128, -0.04643745720386505, 0.025784296914935112, 0.018998099491000175, -0.0275675430893898, -0.0684308409690857, 0.05268194153904915, -0.03728805482387543, 0.08283697068691254, 0.0861908569931984, 0.05689453333616257, 0.03611287847161293, -0.14588043093681335, -0.015761910006403923, 0.051239948719739914, 0.04220515862107277, -0.00717655336484313, 0.013904496096074581, -0.020027359947562218, 0.05288131535053253, -0.008719142526388168, 0.09339512139558792, 0.12931077182292938, 0.22661198675632477, 0.08180840313434601, 0.09883688390254974, -0.06075229123234749, -0.11694752424955368, -0.10135544091463089, 0.10641121864318848, 0.0031034210696816444, 0.022825362160801888, -0.0401696152985096, 0.1375669687986374, 0.08051460981369019, -0.1589299440383911, 0.08588671684265137, 0.016359034925699234, -0.10615748167037964, -0.09137868881225586, -0.03854440152645111, -0.02738635055720806, -0.07349763810634613, -0.013520052656531334, -0.08339639008045197, 0.040996793657541275, 0.0540788508951664, 0.08897363394498825, -0.03263890743255615, 0.14521391689777374, -0.019187171012163162, 
-0.07048778235912323, 0.10057039558887482, -0.011037035845220089, 0.10684429109096527, -0.0861271470785141, 0.018112406134605408, 0.012337079271674156, -0.015440907329320908, 0.05689281225204468, 0.013893493451178074, -0.04124296456575394, 0.03407243266701698, 0.047062505036592484, -0.035589464008808136, -0.0013304114108905196, 0.041512131690979004, 0.11648598313331604, 0.1203819289803505, 0.0744176059961319, -0.058617208153009415, -0.017750807106494904, 0.19514983892440796, -0.035203926265239716, -0.10206475108861923, -0.17407333850860596, 0.17258186638355255, 0.08398216217756271, 0.030339956283569336, 0.029197925701737404, -0.08982256054878235, -0.009026159532368183, 0.22145073115825653, 0.14549124240875244, -0.0438680537045002, -0.04821421951055527, 0.038535427302122116, -0.0062371050007641315, 0.03162160888314247, 0.09753736853599548, 0.05766896903514862, 0.20881357789039612, -0.11762647330760956, 0.03441581875085831, -0.0798560380935669, -0.05355680361390114, -0.010488449595868587, 0.16800551116466522, 0.010328802280128002, -0.014957782812416553, -0.05574846640229225, 0.10861674696207047, 0.010693621821701527, -0.1945534497499466, 0.07015147060155869, -0.06134660914540291, -0.12537385523319244, -0.02108263224363327, -0.022972024977207184, 0.00734745291993022, 0.04991627112030983, 0.0001325054618064314, -0.011795171536505222, 0.14484767615795135, 0.03146502003073692, -0.05341833829879761, -0.15175345540046692, 0.06585078686475754, 0.0055009471252560616, 0.18255546689033508, -0.0028897570446133614, 0.09108662605285645, 0.07391605526208878, 0.017330603674054146, -0.10224910080432892, 0.06427937746047974, 0.054085683077573776, 0.0466424860060215, 0.04515146464109421, 0.10032165050506592, -0.02973165735602379, 0.06978757679462433, 0.00809235218912363, -0.09707611799240112, 0.0656798928976059, -0.13973690569400787, -0.06241616606712341, -0.16469094157218933, 0.05200355499982834, -0.04484712705016136, 0.12121037393808365, 0.21556979417800903, -0.016732044517993927, 0.026646340265870094, -0.0756177306175232, 0.037810683250427246, -0.02399340458214283, 0.14393943548202515, 0.01301022619009018, -0.17356380820274353, 0.02999182417988777, -0.06961432844400406, 0.03344631940126419, -0.21960031986236572, -0.018018465489149094, 0.013394137844443321, -0.0691409558057785, -0.020464347675442696, 0.12078024446964264, 0.03883141279220581, 0.059752460569143295, -0.03524801880121231, -0.02464379370212555, -0.0264979787170887, 0.1388716697692871, -0.10403923690319061, -0.08890115469694138 ]
null
null
transformers
# legal_t5_small_trans_de_fr_small_finetuned model

Model for translating legal text from Deutsch to French. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_fr_small_finetuned was initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. legal_t5_small_trans_de_fr_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_fr_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_fr", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "SCHRIFTLICHE ANFRAGE P-0029/06"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_fr_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence that were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results:

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_fr_small_finetuned | 47.461|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
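The pretraining section above describes a "masked language modelling" objective in which random portions of a sentence are masked and the model learns to predict them. The sketch below shows one way such input/target pairs could be built with T5-style sentinel tokens; the 15% masking rate and per-token masking are illustrative simplifications (T5 proper merges adjacent masked tokens into spans), not the exact recipe used for this model.

```python
import random

def mask_tokens(tokens, mask_ratio=0.15, seed=0):
    """Corrupt a token list with T5 sentinel tokens (<extra_id_N>).

    Returns (corrupted_input, target): the target lists each masked-out
    token after its sentinel, which is what the decoder learns to predict.
    """
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_ratio))
    masked = set(rng.sample(range(len(tokens)), n_mask))

    corrupted, target, sentinel = [], [], 0
    for i, tok in enumerate(tokens):
        if i in masked:
            corrupted.append(f"<extra_id_{sentinel}>")
            target.extend([f"<extra_id_{sentinel}>", tok])
            sentinel += 1
        else:
            corrupted.append(tok)
    target.append(f"<extra_id_{sentinel}>")  # closing sentinel
    return " ".join(corrupted), " ".join(target)

src = "SCHRIFTLICHE ANFRAGE P-0029/06 an die Kommission".split()
print(mask_tokens(src))
```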
{"language": "Deustch French", "tags": ["translation Deustch French model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "SCHRIFTLICHE ANFRAGE P-0029/06"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_fr_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch French model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch French" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_fr\_small\_finetuned model
=======================================================

Model for translating legal text from Deutsch to French. It was first released in this repository. This model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_fr\_small\_finetuned was initially pretrained on an unsupervised task ("masked language modelling") using all of the data in the training set. legal\_t5\_small\_trans\_de\_fr\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Deutsch to French.

### How to use

Here is how to use this model to translate legal text from Deutsch to French in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_fr\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte-pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence that were masked randomly.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results:

### BibTeX entry and citation info

> 
> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
> 
> >
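The preprocessing step above mentions a unigram vocabulary model trained on 88M lines of the parallel corpus. A minimal sketch of training and applying such a model with the `sentencepiece` library follows; the input file name, 32k vocabulary size, and model prefix are assumptions for illustration, not the released configuration.

```python
import sentencepiece as spm

# Placeholder corpus: one sentence per line, drawn from all language pairs.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_vocab",  # writes legal_t5_vocab.model / .vocab
    vocab_size=32000,               # illustrative size
    model_type="unigram",           # the unigram model named in the card
)

# Load the trained vocabulary model and tokenize a sample sentence.
sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("stellt fest, dass Leistung und Effizienz", out_type=str))
```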
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch French model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to French in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_fr\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06824474781751633, 0.09348749369382858, -0.003492747899144888, 0.08847614377737045, 0.04390378296375275, 0.0028797767590731382, 0.03826679661870003, 0.08449222892522812, -0.032044559717178345, 0.07948186248540878, 0.04685594514012337, -0.03551904857158661, 0.058780960738658905, 0.03161690756678581, 0.07673320174217224, -0.2083045244216919, 0.00683413865044713, -0.039129022508859634, -0.016818832606077194, 0.09385811537504196, 0.0986250564455986, -0.05256360396742821, 0.05193222686648369, -0.024923864752054214, -0.020265890285372734, 0.03998523950576782, -0.09822049736976624, -0.04341479390859604, 0.09048651903867722, 0.09371061623096466, 0.08319750428199768, -0.014361485838890076, 0.06845719367265701, -0.192293182015419, -0.003017459064722061, 0.08238697797060013, -0.009359861724078655, 0.03949623927474022, 0.1385224461555481, -0.007283316925168037, 0.1556391716003418, -0.056985847651958466, 0.030827399343252182, 0.04125961288809776, -0.11972858011722565, -0.1184629499912262, -0.05610064044594765, 0.016286615282297134, 0.07201524823904037, 0.138218954205513, -0.049503978341817856, 0.05107598379254341, -0.06274272501468658, 0.06675393879413605, 0.07147961109876633, -0.20932886004447937, -0.031294602900743484, 0.019510040059685707, 0.053491801023483276, 0.07117831707000732, -0.05006235092878342, -0.01646312139928341, 0.04863645136356354, 0.0660988837480545, 0.026932358741760254, -0.06700721383094788, -0.044218335300683975, -0.05560141056776047, -0.12609072029590607, -0.060342464596033096, 0.1496228724718094, 0.025782207027077675, -0.05551857128739357, -0.08247557282447815, -0.06109526380896568, -0.07177919894456863, -0.01946203224360943, -0.050954267382621765, 0.04019686579704285, -0.003516055177897215, 0.07950853556394577, -0.014595926739275455, -0.10390154272317886, -0.060666248202323914, -0.07278532534837723, 0.11292503774166107, 0.04458495229482651, 0.00939058419317007, 0.027534544467926025, 0.08982203900814056, -0.13247144222259521, -0.07383157312870026, 0.0016697413520887494, 0.012730017304420471, -0.10876542329788208, -0.0004553913604468107, -0.007885330356657505, -0.15068753063678741, -0.00639515183866024, 0.054503485560417175, -0.08330798149108887, 0.05430872365832329, 0.06766310334205627, 0.03738125413656235, 0.06186096742749214, 0.11520872265100479, -0.12518523633480072, -0.1339101344347, -0.03335573524236679, 0.005957878660410643, -0.00610019825398922, 0.02912277728319168, -0.05157297104597092, -0.03906690329313278, 0.00538204750046134, 0.02779088355600834, 0.01519724726676941, 0.007598535623401403, -0.0193721242249012, -0.03165232390165329, 0.13608486950397491, -0.10702525824308395, 0.002376614371314645, 0.001886739395558834, -0.10071805864572525, -0.018351444974541664, 0.06933637708425522, -0.010329123586416245, -0.12083395570516586, 0.06740899384021759, -0.032812148332595825, -0.03287361562252045, -0.10958480089902878, -0.17086343467235565, -0.014641773886978626, 0.023022182285785675, -0.06467333436012268, -0.09910406917333603, -0.12068518996238708, -0.09755059331655502, 0.04633062332868576, -0.06696130335330963, 0.0038126471918076277, -0.0731750950217247, -0.005812698043882847, 0.00003798454417847097, 0.0018614463042467833, 0.09882482886314392, -0.047031648457050323, 0.01834244281053543, 0.02104705013334751, 0.06638046354055405, 0.0194009467959404, 0.03480008617043495, -0.11103679984807968, 0.05060639604926109, -0.1517089158296585, 0.15917518734931946, -0.02932208590209484, -0.012229694984853268, -0.1366899013519287, -0.06209243834018707, -0.09053584933280945, 
0.0602644681930542, 0.07283490896224976, 0.12693330645561218, -0.20096144080162048, -0.015026229433715343, 0.2040579915046692, -0.08450082689523697, -0.07255476713180542, 0.14645090699195862, -0.026867400854825974, 0.021772317588329315, 0.07792837172746658, 0.11143410205841064, 0.07837773859500885, -0.014742915518581867, -0.04469464719295502, 0.018260549753904343, 0.025234904140233994, 0.08524873107671738, 0.09506738930940628, -0.07397697120904922, 0.027638930827379227, 0.006830758880823851, 0.010988694615662098, 0.00046249100705608726, -0.021200405433773994, -0.03056519664824009, 0.01381753757596016, -0.038602493703365326, -0.023897437378764153, 0.03139267489314079, 0.0035300031304359436, -0.05688228830695152, -0.0920107513666153, -0.04457026347517967, 0.10503615438938141, -0.05198441073298454, 0.012437575496733189, -0.000058971876569557935, -0.06052764877676964, -0.091972716152668, 0.010638982988893986, -0.16022945940494537, -0.019903551787137985, 0.03344886004924774, -0.08900474011898041, 0.10803350806236267, 0.06573684513568878, 0.05489616096019745, 0.10053389519453049, -0.06349318474531174, -0.03240596503019333, -0.01338689774274826, -0.021808331832289696, -0.0893840417265892, -0.1246955469250679, -0.029458895325660706, -0.024302491918206215, -0.006035447120666504, -0.11968289315700531, -0.0037608263082802296, -0.06805960834026337, 0.08472103625535965, 0.0038313192781060934, -0.01424000971019268, 0.011607608757913113, 0.06575855612754822, -0.02741999179124832, -0.03645078465342522, 0.018059488385915756, -0.01739480346441269, -0.020001348108053207, 0.08909950405359268, -0.14842267334461212, -0.10302208364009857, 0.08388591557741165, 0.00819322932511568, -0.134361132979393, -0.0023885169066488743, -0.019860921427607536, -0.06280606240034103, -0.0491703636944294, -0.056552447378635406, 0.24814333021640778, 0.03579626977443695, 0.14812706410884857, -0.10828323662281036, -0.03199397400021553, 0.010778876952826977, -0.014331532642245293, -0.002551160054281354, 0.1729000210762024, 0.08519374579191208, -0.13201077282428741, 0.08652033656835556, 0.014259732328355312, -0.026378147304058075, 0.09187210351228714, 0.061180658638477325, -0.10061065107584, -0.0090628108009696, 0.047557368874549866, 0.0019176549976691604, 0.06002051755785942, -0.09507078677415848, -0.00976572185754776, 0.021443072706460953, 0.054650817066431046, 0.060799237340688705, -0.08456894755363464, 0.07031698524951935, 0.06161552295088768, -0.030748028308153152, 0.04219375178217888, -0.04446440562605858, -0.03553556650876999, 0.10403580963611603, 0.029524467885494232, -0.05329856276512146, -0.04977567121386528, -0.052011869847774506, -0.10733699053525925, 0.2038935422897339, -0.06500061601400375, -0.21246229112148285, -0.10404442250728607, 0.07815441489219666, -0.0524793416261673, 0.0381634458899498, 0.029250837862491608, -0.03812813386321068, -0.06768722832202911, -0.12765195965766907, 0.07927343994379044, -0.08588621020317078, -0.04592716693878174, -0.13088403642177582, 0.029069695621728897, 0.007044024765491486, -0.1317136138677597, 0.0200037881731987, 0.0006915619014762342, -0.023626960813999176, -0.013270157389342785, -0.028899066150188446, 0.1122586578130722, 0.12745021283626556, -0.024529796093702316, -0.038051456212997437, 0.0009771864861249924, 0.1418600231409073, -0.09860026091337204, 0.06144998222589493, 0.06150402873754501, 0.0543990395963192, 0.03458240255713463, 0.1501787006855011, 0.026633763685822487, -0.05870504677295685, 0.04593190923333168, 0.06666305661201477, -0.01706715300679207, -0.2145872563123703, 
-0.11662033945322037, -0.06839168816804886, 0.02012588083744049, 0.11983489990234375, 0.04115210473537445, -0.03589290380477905, 0.017925916239619255, -0.062003787606954575, 0.06443361192941666, 0.0017090998589992523, 0.05321091040968895, 0.044614024460315704, -0.008004456758499146, 0.0806146040558815, -0.06294003129005432, -0.06608071178197861, 0.10570187121629715, 0.030709926038980484, 0.17877359688282013, -0.05607900023460388, 0.1945742517709732, 0.05740297585725784, 0.022504055872559547, 0.0015007808106020093, 0.05512193217873573, -0.03515835106372833, 0.026445217430591583, -0.03565957024693489, -0.06032911688089371, -0.017262926325201988, 0.056361448019742966, 0.009957312606275082, 0.005706296768039465, -0.062674380838871, -0.06263259798288345, 0.05507997050881386, 0.19854944944381714, 0.05851158872246742, -0.202251598238945, -0.05078139901161194, 0.000050560418458189815, -0.0697537213563919, -0.07196584343910217, 0.026222359389066696, 0.14485521614551544, -0.09866774082183838, 0.013325393199920654, 0.023282812908291817, 0.119010329246521, -0.12145838141441345, -0.00779149541631341, 0.05235563591122627, 0.05462926998734474, -0.015495861880481243, 0.12256617099046707, -0.2430182695388794, 0.14181271195411682, 0.008888311684131622, 0.043236661702394485, -0.03271639347076416, 0.021342169493436813, -0.031488507986068726, 0.01858416013419628, 0.12879163026809692, 0.019495230168104172, -0.0030914470553398132, -0.0951877236366272, -0.0882050022482872, -0.002277679042890668, 0.062010470777750015, -0.07043377310037613, 0.08383353799581528, 0.0511518232524395, 0.003428565338253975, -0.02017117477953434, 0.01577630080282688, -0.07580947875976562, -0.16600528359413147, 0.009781000204384327, -0.03407783806324005, -0.017895963042974472, -0.01197126880288124, -0.02254396863281727, -0.06934310495853424, 0.19938592612743378, -0.0919509157538414, -0.06602936238050461, -0.07021339982748032, -0.009695281274616718, 0.13977277278900146, -0.07296139746904373, 0.01868075132369995, -0.005534002557396889, 0.04755513742566109, -0.032346826046705246, -0.028333889320492744, 0.08918219804763794, -0.09626641869544983, -0.08147676289081573, -0.07526799291372299, 0.11034156382083893, 0.07709529995918274, 0.042942874133586884, -0.009794463403522968, 0.030587218701839447, -0.023048527538776398, -0.1244039461016655, -0.03139695152640343, 0.00998838059604168, 0.10855823755264282, 0.06184627488255501, -0.045973312109708786, -0.04701516032218933, -0.06564167141914368, -0.04346737265586853, 0.07993824779987335, 0.1560715287923813, -0.04846872016787529, 0.03247161582112312, 0.19549761712551117, -0.09765725582838058, -0.19678941369056702, -0.05164504051208496, 0.08761947602033615, 0.061075109988451004, -0.023523550480604172, -0.17541953921318054, 0.042050134390592575, 0.09098886698484421, 0.007997122593224049, 0.06362566351890564, -0.3878575265407562, -0.13792766630649567, 0.07218042016029358, 0.02314049005508423, -0.03641382232308388, -0.11372560262680054, -0.04590515047311783, -0.06636205315589905, -0.018208619207143784, 0.0931401401758194, -0.037758175283670425, 0.08371682465076447, -0.0007508817943744361, 0.004074111580848694, 0.04180927202105522, -0.03434081748127937, 0.13675865530967712, 0.04828525334596634, 0.04827268049120903, -0.053643934428691864, 0.06721819937229156, 0.010999259538948536, -0.013678381219506264, 0.15301278233528137, -0.033726878464221954, 0.05550277978181839, -0.13833321630954742, -0.052834026515483856, -0.06848249584436417, 0.030693449079990387, -0.04316427931189537, -0.0658935084939003, 
-0.05663001537322998, 0.04607978090643883, 0.05800047889351845, 0.0007505998364649713, -0.040579114109277725, -0.04371614009141922, -0.010086145251989365, 0.1705181747674942, 0.10881537199020386, 0.034922681748867035, -0.12145740538835526, 0.047820646315813065, 0.009684264659881592, 0.07802669703960419, -0.06767250597476959, 0.017545731738209724, 0.13654138147830963, 0.012590993195772171, 0.11639348417520523, -0.009233660995960236, -0.1401161402463913, -0.02292056195437908, 0.05000962316989899, -0.09095225483179092, -0.13563470542430878, -0.01233005989342928, 0.029755501076579094, -0.06541512161493301, -0.03955879434943199, 0.09052882343530655, -0.09444066137075424, -0.01966969110071659, 0.004807441495358944, 0.02982240356504917, -0.03973354399204254, 0.1908901333808899, 0.027284514158964157, 0.043133486062288284, -0.061733923852443695, 0.10379525274038315, 0.13183997571468353, -0.1706920713186264, 0.011582151055335999, 0.1930544674396515, -0.08086056262254715, -0.06805505603551865, 0.011908898130059242, 0.12964653968811035, -0.037010643631219864, -0.06507597118616104, -0.03067193739116192, -0.05314953625202179, 0.03487054258584976, -0.005111068487167358, 0.040585957467556, 0.03514302149415016, -0.006461848970502615, -0.02415245585143566, -0.09840414673089981, 0.09779481589794159, 0.08463192731142044, 0.03393348678946495, -0.0052234698086977005, 0.11337249726057053, 0.025087246671319008, -0.01179471891373396, -0.011379125528037548, 0.009372545406222343, -0.06599309295415878, 0.005025333724915981, -0.0829007625579834, 0.014546817168593407, -0.041400130838155746, -0.004052731674164534, -0.023213468492031097, 0.0060251932591199875, -0.011700666509568691, -0.0014524407451972365, -0.03055734373629093, -0.05436193570494652, -0.02921825833618641, 0.029719023033976555, -0.0991716980934143, -0.03033154457807541, 0.020826930180191994, -0.028008898720145226, 0.05239320546388626, -0.0020836112089455128, 0.018368124961853027, -0.01723640412092209, -0.01717210002243519, 0.05569293349981308, -0.004206626210361719, 0.0512121208012104, -0.013585194945335388, -0.0926334485411644, 0.02244749292731285, 0.019900407642126083, -0.0036015810910612345, -0.020875442773103714, 0.013368897140026093, -0.14658816158771515, 0.006214594468474388, -0.019312478601932526, -0.043228764086961746, -0.08136071264743805, 0.09495359659194946, 0.03782571852207184, 0.06806652247905731, 0.10169439017772675, -0.07408656179904938, 0.07445710897445679, -0.15631848573684692, -0.014254271984100342, 0.021595977246761322, 0.013382057659327984, -0.0193277969956398, -0.007658991031348705, 0.04496071860194206, -0.05451624467968941, 0.14802336692810059, 0.048049889504909515, 0.07818538695573807, 0.02416566014289856, -0.0856807604432106, -0.015872452408075333, 0.03237176686525345, 0.05771731585264206, -0.030783401802182198, -0.024709058925509453, -0.07958666980266571, 0.08949511498212814, 0.0076981051824986935, 0.08056885004043579, 0.04811583459377289, 0.13359303772449493, 0.11289383471012115, 0.03529230132699013, -0.017871137708425522, -0.11095212399959564, -0.0639406144618988, 0.053840186446905136, 0.0056121754460036755, 0.038012754172086716, -0.03018680401146412, 0.09577767550945282, 0.10827095806598663, -0.13874979317188263, 0.13447195291519165, 0.0019725291058421135, -0.08152689039707184, -0.04403470829129219, -0.1177290752530098, -0.03935202211141586, -0.017898909747600555, -0.051420100033283234, -0.10497763007879257, 0.023683812469244003, 0.06648131459951401, 0.05545831844210625, -0.025981465354561806, 0.13916294276714325, 
-0.0779518112540245, -0.11237622052431107, 0.05035591125488281, 0.020702458918094635, 0.09300883114337921, 0.02957179583609104, 0.03840562328696251, 0.04803503304719925, 0.007272806018590927, 0.04165368154644966, 0.057980868965387344, -0.014371974393725395, 0.0036314395256340504, 0.015183808282017708, -0.046145498752593994, -0.03865201026201248, 0.007833627983927727, 0.06985749304294586, 0.20862749218940735, 0.05724161118268967, -0.07680892944335938, -0.015138293616473675, 0.18519090116024017, -0.06337588280439377, -0.0858701542019844, -0.10528222471475601, 0.22186926007270813, 0.03939458355307579, 0.04039677605032921, -0.009021824225783348, -0.0982002541422844, -0.011133510619401932, 0.13999371230602264, 0.2013148069381714, -0.05224590376019478, -0.029516857117414474, 0.028688091784715652, -0.005094323307275772, 0.046526648104190826, 0.02922205813229084, 0.034259963780641556, 0.279933363199234, -0.0906338095664978, 0.09083195775747299, -0.04499255493283272, 0.017188366502523422, -0.004788313060998917, 0.1763082593679428, 0.0072657763957977295, 0.023406781256198883, -0.06992977112531662, 0.07705827802419662, -0.012036584317684174, -0.18771623075008392, 0.027579786255955696, -0.08730654418468475, -0.10844159126281738, 0.011223305948078632, -0.033164966851472855, 0.05985170602798462, 0.06804057955741882, 0.013168631121516228, 0.025255510583519936, 0.08073350042104721, 0.01157110184431076, -0.10952194780111313, -0.12319689244031906, 0.0016763625899329782, -0.002802595030516386, 0.1453440934419632, 0.0019026189111173153, 0.1332153081893921, 0.08297022432088852, -0.0003679304209072143, -0.10152594745159149, 0.07383883744478226, 0.028511982411146164, 0.04185545817017555, 0.07827351242303848, 0.09001770615577698, 0.0017538336105644703, 0.0633375272154808, 0.02821722999215126, -0.07076059281826019, 0.02855759672820568, -0.047339633107185364, -0.025891566649079323, -0.13515760004520416, 0.09615452587604523, -0.041799888014793396, 0.14232107996940613, 0.1848289519548416, -0.012650720775127411, -0.006509752478450537, -0.05295564606785774, 0.00915556401014328, -0.011952743865549564, 0.09026958793401718, -0.008924081921577454, -0.14688560366630554, 0.040581364184617996, -0.08008647710084915, 0.03716713935136795, -0.2657147943973541, -0.021400930359959602, 0.021266566589474678, -0.045600418001413345, 0.004151404835283756, 0.07236266136169434, 0.03538166731595993, 0.04630519077181816, -0.056697696447372437, -0.08182412385940552, -0.004536351189017296, 0.1006440818309784, -0.08604665100574493, -0.11340209096670151 ]
null
null
transformers
# legal_t5_small_trans_de_it model

Model for translating legal text from German to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora: jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_it is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from German to Italian.

### How to use

Here is how to use this model to translate legal text from German to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_it"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_it", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

de_text = "Zum Zeitpunkt der Schlussabstimmung anwesende Stellvertreter(innen)"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_it model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_it | 43.3|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
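Recent Transformers releases deprecate `AutoModelWithLMHead`. As a minimal sketch of the same translation with the current seq2seq API, assuming the checkpoint loads as a standard T5 encoder-decoder model:

```python
# Hedged alternative to the pipeline example above: the checkpoint is assumed
# to load as a plain T5 seq2seq model via the current Transformers API.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/legal_t5_small_trans_de_it"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

de_text = "Zum Zeitpunkt der Schlussabstimmung anwesende Stellvertreter(innen)"

# Tokenize, generate, and decode; max_length mirrors the pipeline call above.
inputs = tokenizer(de_text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```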
{"language": "Deustch Italian", "tags": ["translation Deustch Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Zum Zeitpunkt der Schlussabstimmung anwesende Stellvertreter(innen)"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_it
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_it model
=====================================

Model for translating legal text from German to Italian. It was first released in this repository. This model is trained on three parallel corpora: jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_it is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from German to Italian.

### How to use

Here is how to use this model to translate legal text from German to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_it model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal\_t5\_small\_trans\_de\_it | 43.3|

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
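The unigram vocabulary described under Preprocessing could be reproduced along these lines with the sentencepiece library; this is a sketch under assumptions, not the authors' actual configuration: the corpus path, model prefix, and vocabulary size below are illustrative.

```python
# Hedged sketch of building a unigram SentencePiece vocabulary like the one
# described above; file names and vocab_size are illustrative assumptions.
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",   # hypothetical file, one sentence per line
    model_prefix="legal_t5_spm",   # hypothetical output prefix
    vocab_size=32000,              # assumption; not confirmed by the card
    model_type="unigram",          # the unigram algorithm named above
)

# Load the trained model and segment a sample sentence into subword pieces.
sp = spm.SentencePieceProcessor(model_file="legal_t5_spm.model")
print(sp.encode("Zum Zeitpunkt der Schlussabstimmung", out_type=str))
```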
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.13543079793453217, 0.0922074094414711, -0.0021908720955252647, 0.08225801587104797, 0.09887326508760452, 0.020842431113123894, 0.08542408794164658, 0.10786139219999313, -0.06459056586027145, 0.0826837494969368, 0.05361679196357727, 0.03288445994257927, 0.09665457904338837, 0.12223754823207855, 0.03363755717873573, -0.22480015456676483, 0.031021276488900185, -0.018762892112135887, -0.00673278234899044, 0.13688436150550842, 0.12400992214679718, -0.09158699959516525, 0.02233528532087803, -0.03425050899386406, -0.12603046000003815, 0.00721018249168992, -0.06159035861492157, -0.06465470045804977, 0.08075905591249466, 0.049586713314056396, 0.11451691389083862, 0.01886853575706482, 0.09322699904441833, -0.15945464372634888, -0.0014812068548053503, 0.09598598629236221, 0.05008530989289284, 0.03136894479393959, 0.08425887674093246, -0.008221680298447609, 0.14756521582603455, -0.008697476238012314, 0.05057916045188904, 0.0180429145693779, -0.11310577392578125, -0.12015372514724731, -0.06403406709432602, 0.03407710790634155, 0.135039284825325, 0.15269891917705536, -0.05802753195166588, 0.0717005506157875, -0.1269058883190155, 0.05791650712490082, 0.06343623250722885, -0.25203123688697815, -0.07239455729722977, 0.008844399824738503, 0.03327729552984238, 0.08689907193183899, -0.040187910199165344, -0.03684036806225777, 0.034150391817092896, 0.02218574658036232, -0.013867893256247044, -0.008196238428354263, 0.021965352818369865, -0.019978685304522514, -0.16527724266052246, -0.09515136480331421, 0.16526316106319427, -0.004271426238119602, -0.0803217813372612, -0.08957230299711227, -0.03100963681936264, -0.14449740946292877, 0.006666465662419796, -0.06492862850427628, 0.03726505488157272, 0.0037771782372146845, 0.04837749898433685, -0.000733008433599025, -0.11412344127893448, -0.11922141164541245, 0.03781510517001152, 0.11113453656435013, 0.10549351572990417, -0.01825999841094017, 0.009270193055272102, 0.16689243912696838, 0.008991274051368237, -0.08688478171825409, -0.014539879746735096, 0.010129599831998348, -0.10732545703649521, -0.02575990930199623, -0.035201966762542725, -0.14351223409175873, -0.05463431030511856, 0.10526861995458603, -0.007172321435064077, 0.04771403968334198, 0.03186682611703873, 0.04487264156341553, 0.008094709366559982, 0.15677006542682648, -0.09563747048377991, -0.07115228474140167, -0.05841464549303055, 0.059410855174064636, -0.05765533447265625, 0.021360740065574646, -0.025102099403738976, -0.01781313121318817, 0.05456460267305374, 0.08236956596374512, -0.06539369374513626, 0.0014854057226330042, -0.0495067834854126, -0.03142736479640007, 0.04203128069639206, -0.1301776021718979, -0.028627989813685417, 0.0025446745567023754, -0.11783165484666824, -0.01109709870070219, 0.05839916691184044, -0.015113326720893383, -0.1418052762746811, 0.06400907039642334, -0.03186238184571266, -0.026121707633137703, -0.14427663385868073, -0.10256367176771164, -0.02292451448738575, -0.07398565113544464, -0.04156086593866348, -0.06743695586919785, -0.15619446337223053, -0.10320299863815308, 0.06772822141647339, -0.06912500411272049, -0.03191612288355827, -0.09073866158723831, -0.0048109861090779305, -0.009091389365494251, -0.028951959684491158, 0.11258279532194138, -0.02487214095890522, 0.096242256462574, 0.012196021154522896, 0.04488997533917427, 0.13350258767604828, 0.08008397370576859, -0.10403451323509216, 0.01009022630751133, -0.09615545719861984, 0.1704005002975464, -0.01947224885225296, -0.007189048919826746, -0.15007652342319489, -0.0768524780869484, -0.06749048084020615, 
0.06347516179084778, 0.12123336642980576, 0.15230943262577057, -0.1653330773115158, -0.01057538390159607, 0.20863217115402222, -0.10005313903093338, -0.03479084372520447, 0.12341742217540741, -0.03979257121682167, 0.1486990749835968, 0.09381143003702164, 0.17403803765773773, 0.053989943116903305, -0.0855971947312355, -0.007016574498265982, -0.04433320462703705, -0.009598977863788605, 0.0006282824906520545, 0.08306656032800674, -0.03899183124303818, -0.03901762515306473, -0.007870808243751526, -0.1112128272652626, 0.027941884472966194, -0.0646061971783638, -0.06619923561811447, 0.017839830368757248, -0.0678076222538948, -0.039005178958177567, 0.06449472159147263, 0.05267484486103058, -0.05260933190584183, -0.13536326587200165, -0.025934044271707535, 0.10537435859441757, -0.0797320008277893, 0.025625143200159073, -0.06362839788198471, -0.021213069558143616, -0.05962789058685303, -0.009500711224973202, -0.15518252551555634, 0.03522292897105217, 0.056936927139759064, -0.022198539227247238, 0.03837672993540764, 0.038993388414382935, 0.04072317108511925, 0.05893572047352791, -0.008210442960262299, -0.05753915756940842, -0.051671046763658524, -0.03960690274834633, -0.12076025456190109, -0.11522173881530762, -0.036718133836984634, -0.01893809251487255, 0.09814150631427765, -0.18021941184997559, 0.039104923605918884, -0.08478555828332901, 0.056426040828228, -0.025587724521756172, -0.04564922675490379, 0.005701080430299044, 0.031151574105024338, 0.01575399748980999, -0.07077748328447342, 0.042486220598220825, 0.028188297525048256, 0.0440838485956192, 0.07362797856330872, -0.09661625325679779, -0.13072682917118073, 0.08483454585075378, 0.048476845026016235, -0.1495712846517563, 0.004949563182890415, -0.039574142545461655, -0.06697550415992737, -0.05181411653757095, -0.0011072735069319606, 0.22889238595962524, 0.01120772771537304, 0.1342519074678421, -0.10879703611135483, -0.02584906294941902, 0.0008687463705427945, -0.010005352087318897, 0.0006042561726644635, 0.13382969796657562, 0.07850809395313263, -0.08855302631855011, 0.05641937255859375, 0.026817571371793747, -0.033634334802627563, 0.14925721287727356, 0.011879080906510353, -0.13871395587921143, 0.022126588970422745, 0.08154089748859406, -0.023035457357764244, 0.08465301245450974, -0.1519586592912674, 0.010875614359974861, 0.007971041835844517, 0.05119958892464638, 0.06562129408121109, -0.1554027944803238, 0.021089771762490273, 0.05124929919838905, -0.04857712611556053, 0.009885224513709545, -0.024396155029535294, -0.06592074781656265, 0.08852874487638474, 0.023721374571323395, -0.04753374308347702, -0.012226461432874203, -0.03557003289461136, -0.12423039227724075, 0.1997908353805542, -0.06082108989357948, -0.14501333236694336, -0.09757792204618454, 0.09826208651065826, 0.0579763688147068, 0.00385578372515738, 0.03696386143565178, -0.08204155415296555, -0.04421106353402138, -0.08021873980760574, 0.10056871920824051, -0.05711638182401657, -0.052424319088459015, -0.09908328950405121, -0.000503281713463366, -0.018931278958916664, -0.125129833817482, 0.0317329540848732, -0.04213844612240791, -0.09153956174850464, -0.009353683330118656, -0.06883767992258072, 0.08744403719902039, 0.15895242989063263, -0.010120442137122154, 0.03023531101644039, -0.0046972837299108505, 0.16861891746520996, -0.14496944844722748, 0.01986972615122795, 0.0916564092040062, 0.03927735239267349, 0.00418632198125124, 0.10599533468484879, -0.008550204336643219, -0.0950542539358139, 0.03886539861559868, 0.05515078082680702, -0.017758961766958237, -0.27317506074905396, 
-0.028760768473148346, -0.025952449068427086, -0.0358126275241375, 0.1036490723490715, 0.03792376071214676, 0.015461978502571583, 0.04683477059006691, -0.026591410860419273, -0.021551689133048058, 0.039928946644067764, 0.04700513556599617, -0.021484101191163063, 0.0015919898869469762, 0.07627169787883759, -0.0481143593788147, -0.0252673402428627, 0.05512497201561928, 0.0355764776468277, 0.25608184933662415, -0.05685002729296684, 0.12168868631124496, 0.07595515996217728, 0.11054594069719315, -0.002020493382588029, 0.07745232433080673, -0.012416258454322815, 0.008193905465304852, -0.0048835850320756435, -0.030158311128616333, -0.061996784061193466, 0.03826645389199257, 0.009207409806549549, -0.007872718386352062, -0.10636038333177567, -0.014458746649324894, 0.02042894810438156, 0.34210333228111267, 0.058730218559503555, -0.24422307312488556, -0.05697888135910034, -0.007650927174836397, -0.056015558540821075, -0.09694809466600418, 0.058039434254169464, 0.08196377754211426, -0.1267510950565338, -0.027964912354946136, -0.024210020899772644, 0.09847921878099442, -0.11328918486833572, -0.05311029776930809, 0.06072145700454712, 0.0613863505423069, -0.011631240136921406, 0.09807752072811127, -0.2935159504413605, 0.19527828693389893, -0.013283642008900642, 0.1456916630268097, -0.026442876085639, 0.028157807886600494, -0.04858895763754845, 0.02524537593126297, 0.16902437806129456, -0.004283005837351084, 0.03253886103630066, -0.07260099053382874, -0.10889814794063568, 0.01351984217762947, 0.05571436136960983, -0.058450303971767426, 0.07837370038032532, 0.016787461936473846, 0.032519903033971786, -0.005528476554900408, -0.08370090276002884, -0.12863178551197052, -0.11053403466939926, 0.005590119864791632, -0.07551351934671402, 0.04939743131399155, -0.03644125908613205, -0.06543298065662384, -0.0012101366883143783, 0.15950442850589752, -0.10525477677583694, -0.08222000300884247, -0.09966640174388885, 0.03618987277150154, 0.0868116170167923, -0.04534841328859329, -0.0003709805605467409, 0.014021910727024078, 0.015051043592393398, -0.006253672763705254, 0.020439473912119865, 0.10616129636764526, -0.0695633515715599, -0.11239317059516907, -0.048712603747844696, 0.1137465164065361, 0.1316678673028946, 0.05063537508249283, -0.017836233600974083, 0.004176637623459101, -0.00924582127481699, -0.06811878085136414, 0.008007579483091831, -0.007185001391917467, 0.046951644122600555, 0.027793049812316895, -0.08008656650781631, -0.01643780618906021, -0.09750483930110931, -0.05172355845570564, 0.09351074695587158, 0.137867733836174, -0.05319174379110336, 0.03985653445124626, 0.16808614134788513, -0.11132153868675232, -0.1790386140346527, 0.04277164116501808, 0.10041476041078568, 0.07812469452619553, -0.08352509140968323, -0.2233881801366806, 0.014418204315006733, 0.10563443601131439, 0.005143388174474239, 0.0008633030811324716, -0.4231940805912018, -0.12271741777658463, 0.09299872815608978, 0.09441955387592316, -0.02774999849498272, -0.07606273144483566, -0.004880115855485201, 0.03714153543114662, -0.02896219491958618, 0.052022386342287064, -0.004147356376051903, 0.08461231738328934, 0.02832598052918911, -0.06516166031360626, 0.042877163738012314, -0.059296615421772, 0.1117459163069725, 0.07721222192049026, 0.04938806965947151, -0.04314568638801575, 0.03930220752954483, -0.00017098644457291812, -0.007802847307175398, 0.15528994798660278, 0.038590557873249054, 0.023445703089237213, -0.1831463724374771, -0.07110369950532913, -0.07951536029577255, 0.005126012954860926, -0.07162634283304214, -0.05457331985235214, 
-0.03545888885855675, 0.09569192677736282, 0.04597034677863121, 0.002321342471987009, -0.039494678378105164, -0.0759972482919693, -0.029989222064614296, 0.08274263888597488, 0.09410956501960754, 0.07183670252561569, -0.0790875181555748, 0.02005939930677414, 0.04372783005237579, 0.09048207849264145, -0.12000331282615662, -0.01758509874343872, 0.1276884227991104, -0.027026519179344177, 0.13009943068027496, -0.003186577232554555, -0.14665085077285767, 0.0009908624924719334, 0.04463319852948189, -0.08314673602581024, -0.1286022961139679, -0.013213408179581165, -0.06505685299634933, -0.038007743656635284, -0.028179027140140533, 0.0652485191822052, -0.11523587256669998, -0.021221041679382324, -0.024982457980513573, 0.04408082365989685, -0.07929347455501556, 0.2324797362089157, 0.035147666931152344, 0.0445554181933403, -0.06461872905492783, 0.1463029831647873, 0.10072953999042511, -0.13293181359767914, 0.019587919116020203, 0.17203892767429352, -0.09261763840913773, -0.056095246225595474, 0.015264548361301422, 0.12309405207633972, -0.026778139173984528, -0.08469368517398834, -0.0773930549621582, -0.04480709135532379, 0.06149868294596672, -0.025959091261029243, 0.043785542249679565, 0.020749462768435478, -0.04169416055083275, -0.005435193423181772, -0.14125166833400726, 0.06470273435115814, 0.1175002008676529, -0.0014301617629826069, -0.016994329169392586, 0.1785392016172409, 0.0571913942694664, 0.030639419332146645, -0.005914501380175352, -0.040956150740385056, -0.04161272197961807, 0.07286974787712097, -0.008044160902500153, -0.03985878452658653, -0.042381785809993744, -0.009310282766819, -0.036053311079740524, -0.009724091738462448, -0.005281410180032253, 0.01933012157678604, -0.07649683952331543, -0.032116055488586426, -0.03952484950423241, 0.03495420143008232, -0.07231006771326065, -0.0008495051879435778, -0.008293154649436474, -0.06654679775238037, 0.08031023293733597, 0.02490362524986267, -0.0019540684297680855, 0.00287952134385705, -0.011448420584201813, 0.07862710952758789, -0.04479727894067764, 0.004342757165431976, -0.017363622784614563, -0.08215919882059097, 0.0513887032866478, 0.007208502385765314, -0.021336546167731285, -0.014668530784547329, 0.04794056713581085, -0.12687143683433533, 0.0836641788482666, -0.019492734223604202, -0.01744728907942772, -0.06983694434165955, 0.13111701607704163, 0.029439479112625122, 0.08878608047962189, 0.08655942231416702, -0.06355095654726028, 0.06750407814979553, -0.1305980682373047, -0.039669670164585114, 0.03311384096741676, 0.009351721964776516, -0.028041262179613113, -0.05877407267689705, 0.06230656057596207, -0.037850186228752136, 0.0610128678381443, 0.08013199269771576, 0.04651632532477379, 0.0323634147644043, -0.12576770782470703, 0.00044133560732007027, 0.04831733927130699, 0.06226688623428345, -0.009554107673466206, 0.017115553840994835, -0.010244195349514484, 0.05246307700872421, -0.02339794859290123, 0.097162626683712, 0.12438207864761353, 0.23559771478176117, 0.08256548643112183, 0.10453955829143524, -0.03610977530479431, -0.12035287171602249, -0.11450079828500748, 0.10494678467512131, -0.014783188700675964, 0.025385240092873573, -0.031301986426115036, 0.14176683127880096, 0.08839783072471619, -0.15994882583618164, 0.06444727629423141, 0.0023284878116101027, -0.1116425022482872, -0.08399415761232376, -0.07504451274871826, -0.025219744071364403, -0.07174776494503021, -0.007146263495087624, -0.09126337617635727, 0.03870864585042, 0.0538131408393383, 0.07398112118244171, -0.05555512383580208, 0.15403388440608978, 0.004143364727497101, 
-0.06730426847934723, 0.10481525957584381, -0.011954991146922112, 0.11073441058397293, -0.08903288096189499, -0.010274584405124187, 0.004288431257009506, -0.014063230715692043, 0.06374792754650116, 0.006105373613536358, -0.03538057953119278, 0.02562878094613552, 0.04682197794318199, -0.036083757877349854, -0.002228576224297285, 0.03437432646751404, 0.1139046773314476, 0.11598844826221466, 0.05844700708985329, -0.04419979825615883, -0.022722819820046425, 0.19379936158657074, -0.02940407767891884, -0.08609337359666824, -0.16846242547035217, 0.1654592752456665, 0.08344413340091705, 0.02883738838136196, 0.02904793806374073, -0.09904900938272476, -0.0038724178448319435, 0.2299717366695404, 0.13191835582256317, -0.05773278325796127, -0.05419699847698212, 0.03799869120121002, -0.0032390779815614223, 0.017326222732663155, 0.11341402679681778, 0.043767333030700684, 0.19274914264678955, -0.11336177587509155, 0.05466505512595177, -0.0747591182589531, -0.049254160374403, -0.014839385636150837, 0.16772037744522095, 0.010745051316916943, -0.008115037344396114, -0.07168944925069809, 0.11712907999753952, 0.02402978017926216, -0.18326707184314728, 0.06540405750274658, -0.07081915438175201, -0.13988827168941498, -0.022136518731713295, -0.01510169729590416, -0.0055937995202839375, 0.06887458264827728, -0.010441060177981853, -0.004634413402527571, 0.13497373461723328, 0.03494993597269058, -0.0520414263010025, -0.1734868288040161, 0.06819375604391098, -0.00812375545501709, 0.18020763993263245, -0.009592139162123203, 0.08388932794332504, 0.07594528049230576, 0.03017016313970089, -0.10950031131505966, 0.07585516571998596, 0.04796198382973671, 0.03597060963511467, 0.0446954108774662, 0.0965118259191513, -0.0456259660422802, 0.07902153581380844, 0.024373719468712807, -0.11226420849561691, 0.06707608699798584, -0.1459248960018158, -0.05614259093999863, -0.1652807742357254, 0.041377995163202286, -0.04627494886517525, 0.12949907779693604, 0.21566671133041382, -0.021276915445923805, 0.010425976477563381, -0.07141429930925369, 0.04935944825410843, -0.010187827050685883, 0.16270232200622559, 0.012827297672629356, -0.20167039334774017, 0.00786791741847992, -0.0680738016963005, 0.03452784940600395, -0.19451367855072021, -0.021352097392082214, 0.01038599293678999, -0.07955504208803177, -0.031494785100221634, 0.11583399772644043, 0.04841483011841774, 0.06222733482718468, -0.033356256783008575, -0.046737898141145706, -0.01881830208003521, 0.13850843906402588, -0.11201313883066177, -0.09144749492406845 ]
null
null
transformers
# legal_t5_small_trans_de_it_small_finetuned model

Model for translating legal text from German to Italian. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all of the translation data using an unsupervised task, and then trained on three parallel corpora: jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_it_small_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_de_it_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from German to Italian.

### How to use

Here is how to use this model to translate legal text from German to Italian in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_it_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_it", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

de_text = "sicherstellen, dass alle Bürger gemäß der Richtlinie .../.../EG [über den Universaldienst und Nutzerrechte bei elektronischen Kommunikationsnetzen und -diensten] zu erschwinglichen Preisen Zugang zum Universaldienst erhalten;"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_it_small_finetuned model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task drew on the data of all language pairs.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_it_small_finetuned | 42.895|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
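To reproduce a BLEU score like the one reported above on held-out data, a sketch along these lines could be used; the test file names and generation settings are assumptions, not the authors' exact evaluation setup (the tokenizer is loaded from the base language-pair repository, mirroring the usage example in the card):

```python
# Hedged sketch of corpus-level BLEU scoring with sacreBLEU; "test.de" and
# "test.it" are hypothetical files with one aligned sentence per line.
import sacrebleu
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_de_it")
model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_trans_de_it_small_finetuned")

sources = [line.strip() for line in open("test.de", encoding="utf-8")]
references = [line.strip() for line in open("test.it", encoding="utf-8")]

# Translate each source sentence; 512 matches the card's sequence length.
hypotheses = []
for src in sources:
    inputs = tokenizer(src, return_tensors="pt", truncation=True, max_length=512)
    output = model.generate(**inputs, max_length=512)
    hypotheses.append(tokenizer.decode(output[0], skip_special_tokens=True))

# sacreBLEU takes the hypothesis list and a list of reference streams.
print(sacrebleu.corpus_bleu(hypotheses, [references]).score)
```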
{"language": "Deustch Italian", "tags": ["translation Deustch Italian model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "sicherstellen, dass alle B\u00fcrger gem\u00e4\u00df der Richtlinie .../.../EG [\u00fcber den Universaldienst und Nutzerrechte bei elektronischen Kommunikationsnetzen und -diensten[ zu erschwinglichen Preisen Zugang zum Universaldienst erhalten;"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_it_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Italian model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Italian" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_it\_small\_finetuned model
=======================================================

Model for translating legal text from German to Italian. It was first released in this repository. This model was first pretrained on all of the translation data using an unsupervised task, and then trained on three parallel corpora: jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_it\_small\_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_de\_it\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from German to Italian.

### How to use

Here is how to use this model to translate legal text from German to Italian in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_it\_small\_finetuned model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts. The supervised task involved only the corresponding language pair, while the unsupervised task drew on the data of all language pairs.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 60M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal\_t5\_small\_trans\_de\_it\_small\_finetuned | 42.895|

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
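The "masked language modelling" pretraining described above follows the t5 recipe of replacing spans with sentinel tokens and predicting them. A simplified sketch of that objective, assuming single-token spans and a 15% corruption rate (standard t5 defaults, not confirmed details of this model's pretraining):

```python
# Hedged, simplified sketch of the t5-style masked-span objective: each masked
# token becomes an <extra_id_N> sentinel in the input, and the target lists the
# sentinels with the tokens they hide. Real t5 masks contiguous multi-token
# spans; single-token spans are used here for brevity.
import random

def mask_spans(tokens, mask_prob=0.15, seed=1):
    rng = random.Random(seed)
    inputs, targets, sentinel = [], [], 0
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(f"<extra_id_{sentinel}>")
            targets.extend([f"<extra_id_{sentinel}>", tok])
            sentinel += 1
        else:
            inputs.append(tok)
    return " ".join(inputs), " ".join(targets)

src, tgt = mask_spans("Zum Zeitpunkt der Schlussabstimmung anwesende Stellvertreter".split())
print(src)  # input with sentinel placeholders
print(tgt)  # target the model learns to predict
```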
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Italian model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Italian in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_it\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.07957707345485687, 0.09881924092769623, -0.0030737046618014574, 0.08879710733890533, 0.05905604735016823, 0.018755951896309853, 0.044930290430784225, 0.10661818832159042, -0.01896600052714348, 0.09313222020864487, 0.030630405992269516, -0.02548304945230484, 0.07481994479894638, 0.03557931259274483, 0.05626922473311424, -0.21850092709064484, 0.011305582709610462, -0.0391555093228817, -0.006948594003915787, 0.0976823940873146, 0.09735912829637527, -0.06630121171474457, 0.047195035964250565, -0.04588402435183525, -0.03916452080011368, 0.03571855649352074, -0.09699825197458267, -0.04012279585003853, 0.08720207959413528, 0.09608549624681473, 0.08694210648536682, -0.015362294390797615, 0.07632070034742355, -0.1805330216884613, -0.002639924641698599, 0.09074262529611588, -0.0007876319577917457, 0.03308388963341713, 0.14434637129306793, 0.0027679374907165766, 0.16085711121559143, -0.040086597204208374, 0.021393518894910812, 0.037484310567379, -0.12091612070798874, -0.10759599506855011, -0.04972780868411064, 0.017917700111865997, 0.07649895548820496, 0.13914495706558228, -0.05354977026581764, 0.06685104966163635, -0.06065281108021736, 0.0684448629617691, 0.06830621510744095, -0.1969614028930664, -0.03874323517084122, 0.01633470319211483, 0.04072851315140724, 0.08410965651273727, -0.037604451179504395, -0.014875611290335655, 0.04996068775653839, 0.06354677677154541, 0.004014309495687485, -0.051279012113809586, -0.03700362890958786, -0.060635149478912354, -0.1280444860458374, -0.05843073129653931, 0.15539464354515076, 0.017935413867235184, -0.053489938378334045, -0.07935071736574173, -0.05931093916296959, -0.06269204616546631, -0.006660094950348139, -0.0461006760597229, 0.02155688777565956, 0.0037341078277677298, 0.09048581123352051, -0.0015543043846264482, -0.11206228286027908, -0.06606980413198471, -0.06107569858431816, 0.14748108386993408, 0.05650879070162773, 0.007295167073607445, 0.025594130158424377, 0.0894785150885582, -0.11989042162895203, -0.08185557276010513, 0.006547268945723772, 0.021416151896119118, -0.10118602961301804, -0.004857050720602274, -0.015198237262666225, -0.172032430768013, -0.007046921644359827, 0.041236624121665955, -0.04953521862626076, 0.04418538883328438, 0.0717259868979454, 0.043189845979213715, 0.06148405745625496, 0.10540685057640076, -0.13634636998176575, -0.12913869321346283, -0.029280707240104675, 0.00470534386113286, 0.007951322942972183, 0.01710871234536171, -0.06445801258087158, -0.04251326993107796, 0.0006318931118585169, 0.03689949959516525, 0.02551202103495598, 0.01705745980143547, -0.02851760946214199, -0.042521752417087555, 0.11586558818817139, -0.1167285367846489, 0.0019758802372962236, 0.004845333285629749, -0.10589820146560669, -0.008449544198811054, 0.05551885813474655, -0.019485024735331535, -0.1256590634584427, 0.07806219160556793, -0.0350494310259819, -0.03576742857694626, -0.11890317499637604, -0.1704493910074234, -0.0026467053685337305, -0.003594800364226103, -0.058407627046108246, -0.10315879434347153, -0.12473505735397339, -0.0951843410730362, 0.034648001194000244, -0.07407498359680176, 0.013323931023478508, -0.061703234910964966, 0.008260433562099934, -0.013612802140414715, -0.0057975007221102715, 0.09854576736688614, -0.04212617129087448, 0.03539825230836868, 0.030833808705210686, 0.0692652016878128, 0.014461304992437363, 0.03550204634666443, -0.12342429161071777, 0.04294956475496292, -0.14073246717453003, 0.14368388056755066, -0.024831093847751617, 0.0029645587783306837, -0.12821467220783234, -0.05180877819657326, -0.08929317444562912, 
0.06037777289748192, 0.07897067070007324, 0.13769936561584473, -0.20969007909297943, -0.0072959125973284245, 0.19254498183727264, -0.09658592939376831, -0.0761372447013855, 0.13996948301792145, -0.018925128504633904, 0.0457182414829731, 0.08554936200380325, 0.11611593514680862, 0.07129966467618942, -0.01696825958788395, -0.050986990332603455, -0.0020558806136250496, 0.018560290336608887, 0.07766404002904892, 0.08629267662763596, -0.06145187094807625, 0.06118008866906166, 0.01688487082719803, -0.004623438697308302, -0.0017849849537014961, -0.024766935035586357, -0.033648356795310974, 0.008493777364492416, -0.05656594783067703, -0.04319831356406212, 0.03540295735001564, 0.008234987035393715, -0.05701647326350212, -0.0947365090250969, -0.06293007731437683, 0.1128544956445694, -0.06075220927596092, 0.01613950915634632, -0.016355326399207115, -0.03173183649778366, -0.09350641816854477, 0.015261070802807808, -0.15838868916034698, -0.018597831949591637, 0.04383229836821556, -0.0757642462849617, 0.10211577266454697, 0.05710449814796448, 0.05394495278596878, 0.0940975695848465, -0.06203925237059593, -0.03195386007428169, -0.001822932972572744, -0.02094709873199463, -0.09602721780538559, -0.12834958732128143, -0.04739139601588249, -0.01937405951321125, -0.006874882150441408, -0.11934569478034973, 0.002887153532356024, -0.05546141043305397, 0.09717830270528793, -0.0030444255098700523, -0.019719533622264862, 0.007563450373709202, 0.05988173559308052, -0.035329435020685196, -0.04492204636335373, 0.018016817048192024, -0.011458426713943481, -0.029871534556150436, 0.07599407434463501, -0.1589885801076889, -0.09740433841943741, 0.07901158183813095, 0.001741950516588986, -0.1262296587228775, -0.0077835191041231155, -0.014150582253932953, -0.06862236559391022, -0.05367594584822655, -0.0642005130648613, 0.22898434102535248, 0.03198627009987831, 0.1269913613796234, -0.1091129258275032, -0.011477146297693253, 0.015805751085281372, -0.015000085346400738, -0.009293866343796253, 0.15958431363105774, 0.09491661190986633, -0.14592431485652924, 0.09060636162757874, 0.010864470154047012, -0.0369575135409832, 0.098629891872406, 0.06925863772630692, -0.11237175762653351, -0.0009005960891954601, 0.051330432295799255, 0.0052345492877066135, 0.054759953171014786, -0.10679540783166885, 0.0026218737475574017, 0.020514972507953644, 0.054950181394815445, 0.06851695477962494, -0.08348315954208374, 0.05940966680645943, 0.05954964831471443, -0.02135179191827774, 0.036576032638549805, -0.05895860865712166, -0.04611917957663536, 0.10842730849981308, 0.02791234850883484, -0.05071553960442543, -0.04283708706498146, -0.04982597753405571, -0.10100143402814865, 0.19193097949028015, -0.05686886981129646, -0.21432314813137054, -0.10856098681688309, 0.07845716178417206, -0.039013493806123734, 0.044828835874795914, 0.02073972299695015, -0.033861372619867325, -0.06735818088054657, -0.1143256276845932, 0.08470699936151505, -0.08178980648517609, -0.048328597098588943, -0.13318921625614166, 0.03712616115808487, -0.0014210965018719435, -0.11531558632850647, 0.02357618883252144, 0.0008046258590184152, -0.031012313440442085, -0.01027136854827404, -0.047320276498794556, 0.1279607117176056, 0.13820725679397583, -0.017305318266153336, -0.026739101856946945, 0.0026367679238319397, 0.11549610644578934, -0.11228866875171661, 0.07046115398406982, 0.08130267262458801, 0.03530081361532211, 0.024429701268672943, 0.15186747908592224, 0.02274465188384056, -0.06977488100528717, 0.040119994431734085, 0.06091920658946037, -0.018433842808008194, 
-0.21862904727458954, -0.1092430055141449, -0.07152939587831497, 0.022313859313726425, 0.11312068998813629, 0.03819786384701729, -0.04566105827689171, 0.0276549831032753, -0.0658404752612114, 0.0414242297410965, 0.0033149756491184235, 0.051975496113300323, 0.025638792663812637, -0.012289104051887989, 0.08370810747146606, -0.05885745584964752, -0.05848909914493561, 0.1091332659125328, 0.04936166852712631, 0.1968640834093094, -0.05710877105593681, 0.20505738258361816, 0.0490059070289135, 0.03290881589055061, 0.006026085466146469, 0.055643439292907715, -0.027385111898183823, 0.017642153427004814, -0.02922593243420124, -0.06336433440446854, -0.02527202107012272, 0.06277787685394287, 0.011082245036959648, -0.0030789838638156652, -0.05242745578289032, -0.05759868398308754, 0.05284339189529419, 0.21609286963939667, 0.05487803742289543, -0.21140311658382416, -0.04986915737390518, 0.0026648070197552443, -0.05474553257226944, -0.06999961286783218, 0.013312168419361115, 0.14593994617462158, -0.08361051976680756, 0.00947143230587244, 0.022733289748430252, 0.11998515576124191, -0.14219453930854797, -0.013460014946758747, 0.044914714992046356, 0.056421879678964615, -0.018203074112534523, 0.1351192742586136, -0.24300463497638702, 0.14353032410144806, 0.012688720598816872, 0.0674080029129982, -0.05205811560153961, 0.018086742609739304, -0.035426389425992966, 0.024143507704138756, 0.1260870099067688, 0.011870612390339375, -0.0010358077706769109, -0.11995751410722733, -0.10230249911546707, -0.0032678176648914814, 0.0793018639087677, -0.06292376667261124, 0.08661921322345734, 0.0457882359623909, 0.015781231224536896, -0.014139180071651936, 0.03127221390604973, -0.056081462651491165, -0.16102972626686096, 0.0207811389118433, -0.021437376737594604, -0.026989039033651352, -0.011080833151936531, -0.032585591077804565, -0.052381496876478195, 0.20202256739139557, -0.08935360610485077, -0.06536423414945602, -0.07905668765306473, 0.013765158131718636, 0.13173557817935944, -0.0718642845749855, 0.0014794728485867381, -0.002637451747432351, 0.050070494413375854, -0.029137538745999336, -0.023162417113780975, 0.09912489354610443, -0.08808314800262451, -0.0920129045844078, -0.07861638069152832, 0.09792213886976242, 0.08036182820796967, 0.039793867617845535, -0.006871677469462156, 0.02586229518055916, -0.011518959887325764, -0.10807862132787704, -0.020866885781288147, 0.02005511149764061, 0.10703834146261215, 0.06943771988153458, -0.056298233568668365, -0.04013138264417648, -0.07008804380893707, -0.05635308846831322, 0.06841030716896057, 0.15315966308116913, -0.04477877542376518, 0.005253151524811983, 0.18865197896957397, -0.1049472764134407, -0.19823116064071655, -0.03438103571534157, 0.07658281177282333, 0.07232395559549332, -0.021947694942355156, -0.1745254546403885, 0.03749662637710571, 0.11309942603111267, 0.0019948615226894617, 0.07681837677955627, -0.38951167464256287, -0.12783274054527283, 0.06447360664606094, 0.03583598881959915, -0.03670523688197136, -0.11833962798118591, -0.0446273572742939, -0.0626925528049469, -0.015541449189186096, 0.06820952892303467, -0.02597706764936447, 0.09090880304574966, 0.0014137419639155269, -0.0011939324904233217, 0.04910135641694069, -0.039742324501276016, 0.13112694025039673, 0.04059310629963875, 0.043737199157476425, -0.06406480073928833, 0.062152571976184845, 0.01149614155292511, -0.006078378297388554, 0.15060478448867798, -0.036043014377355576, 0.041991040110588074, -0.12355201691389084, -0.06390324980020523, -0.06647954881191254, 0.03213595971465111, -0.04106082394719124, 
-0.07244029641151428, -0.04819798469543457, 0.05312785878777504, 0.04596937075257301, 0.0076791225001215935, -0.028743522241711617, -0.06381748616695404, -0.00685490295290947, 0.1696345955133438, 0.10902982205152512, 0.029197733849287033, -0.10678654164075851, 0.030437830835580826, 0.009509961120784283, 0.07541161775588989, -0.047461289912462234, 0.013928741216659546, 0.14023658633232117, 0.010522212833166122, 0.11184361577033997, -0.0070247710682451725, -0.14717227220535278, -0.016872957348823547, 0.06044108420610428, -0.08539904654026031, -0.13169848918914795, -0.009248812682926655, 0.03498103469610214, -0.07224471867084503, -0.018216572701931, 0.0946125015616417, -0.0954664871096611, -0.01665755733847618, 0.000656685559079051, 0.040391288697719574, -0.036916110664606094, 0.20941902697086334, 0.03235133737325668, 0.030711377039551735, -0.06151239946484566, 0.1287357658147812, 0.12993131577968597, -0.16666465997695923, 0.010948415845632553, 0.19104264676570892, -0.08158142864704132, -0.06810563057661057, 0.013334473595023155, 0.11513007432222366, -0.03688092902302742, -0.07368198782205582, -0.04655922204256058, -0.05142371729016304, 0.028611833229660988, -0.023097136989235878, 0.04259708896279335, 0.04047708213329315, -0.023669246584177017, -0.01789036951959133, -0.11168084293603897, 0.09176304936408997, 0.0909653827548027, 0.0363701768219471, -0.019970424473285675, 0.1314915120601654, 0.03149987384676933, -0.039812371134757996, -0.011739932000637054, 0.003569170134142041, -0.061903778463602066, 0.01674002967774868, -0.0741143673658371, 0.0026137246750295162, -0.04171869531273842, -0.012184208258986473, -0.028066173195838928, 0.002715925918892026, -0.02132588066160679, -0.0010968622518703341, -0.037370845675468445, -0.0517980195581913, -0.03895876184105873, 0.0184875950217247, -0.09304766356945038, -0.030382255092263222, 0.011282159015536308, -0.02674771472811699, 0.055295370519161224, 0.0008293969440273941, 0.014093945734202862, -0.01654934510588646, -0.01965550146996975, 0.072914257645607, 0.0016901142662391067, 0.049051012843847275, -0.013838706538081169, -0.07861672341823578, 0.025084303691983223, 0.03233961760997772, -0.016238808631896973, -0.01791992411017418, 0.01752745546400547, -0.14444218575954437, 0.02083495259284973, -0.010331777855753899, -0.04067520797252655, -0.07466892153024673, 0.09980849921703339, 0.04678323119878769, 0.06587140262126923, 0.08856438100337982, -0.06548795104026794, 0.08196283876895905, -0.15723702311515808, -0.008516456931829453, 0.027967091649770737, 0.003209424903616309, -0.01890433579683304, -0.000009834766387939453, 0.05318614840507507, -0.05483371391892433, 0.12906506657600403, 0.04287368059158325, 0.06748548150062561, 0.020480293780565262, -0.06776602566242218, -0.0006474247784353793, 0.029171517118811607, 0.07531793415546417, -0.03317350149154663, -0.021978486329317093, -0.071261465549469, 0.08972643315792084, -0.005078882910311222, 0.08578194677829742, 0.04305167868733406, 0.14273834228515625, 0.11425837129354477, 0.0394587479531765, 0.006239774636924267, -0.11393538862466812, -0.07585325837135315, 0.05221385881304741, -0.009905085898935795, 0.04190617427229881, -0.022671151906251907, 0.1001993864774704, 0.11662495136260986, -0.13962404429912567, 0.11464027315378189, -0.011835183016955853, -0.0866808071732521, -0.03740914538502693, -0.14987151324748993, -0.037173137068748474, -0.017430663108825684, -0.04644889757037163, -0.11298064887523651, 0.022269289940595627, 0.06620336323976517, 0.04244823008775711, -0.04721107706427574, 0.1477327197790146, 
-0.05696329101920128, -0.10993237793445587, 0.05506915599107742, 0.020422298461198807, 0.09660113602876663, 0.027729662135243416, 0.013956187292933464, 0.04125870764255524, 0.007598209660500288, 0.04775243625044823, 0.05083923786878586, -0.009249930270016193, -0.004205513745546341, 0.014899581670761108, -0.04641140624880791, -0.04080762341618538, 0.0016786007909104228, 0.06849252432584763, 0.20600207149982452, 0.04227535054087639, -0.06424076110124588, -0.01996123418211937, 0.18471845984458923, -0.05836954340338707, -0.07063093781471252, -0.09963498264551163, 0.21540173888206482, 0.038803040981292725, 0.039109814912080765, -0.009135965257883072, -0.10589100420475006, -0.007667379919439554, 0.14814673364162445, 0.1893836259841919, -0.0649912878870964, -0.03465317189693451, 0.02865520678460598, -0.002735802670940757, 0.034157950431108475, 0.04439528286457062, 0.023278003558516502, 0.2667285203933716, -0.08588176220655441, 0.10816981643438339, -0.04105394333600998, 0.021526265889406204, -0.007140940520912409, 0.1759921908378601, 0.008511565625667572, 0.027684256434440613, -0.08543048799037933, 0.08359622210264206, -0.00003757183003472164, -0.17633992433547974, 0.024097591638565063, -0.09756898134946823, -0.12083647400140762, 0.009422526694834232, -0.025603264570236206, 0.04804437607526779, 0.0859333947300911, 0.0032654826063662767, 0.032467689365148544, 0.07148033380508423, 0.01478258054703474, -0.11043953895568848, -0.14213526248931885, 0.004598184954375029, -0.013064062222838402, 0.14250262081623077, -0.004363242071121931, 0.12823446094989777, 0.08458650857210159, 0.012343941256403923, -0.10745725780725479, 0.0850927084684372, 0.023377051576972008, 0.02999495156109333, 0.07704942673444748, 0.08716794848442078, -0.012865624390542507, 0.07071957737207413, 0.04259863495826721, -0.08552812784910202, 0.0305678378790617, -0.05188862234354019, -0.019761867821216583, -0.135501429438591, 0.08579445630311966, -0.04342091828584671, 0.15029935538768768, 0.1855255365371704, -0.0170009545981884, -0.02305150218307972, -0.04897261783480644, 0.020034367218613625, 0.0009611159912310541, 0.10828607529401779, -0.009252696298062801, -0.17196115851402283, 0.01985260285437107, -0.0785883441567421, 0.038216110318899155, -0.24520209431648254, -0.023918673396110535, 0.01857154630124569, -0.05530404672026634, -0.00781306903809309, 0.06899260729551315, 0.043358735740184784, 0.04955532029271126, -0.0543818362057209, -0.1030837669968605, 0.00201742397621274, 0.1003798320889473, -0.09368041157722473, -0.11668971180915833 ]
null
null
transformers
# legal_t5_small_trans_de_sv model

Model for translating legal text from Deutsch to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_sv is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_sv"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "Betrifft: Leader-Programm"

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_sv model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has about 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_sv | 41.69|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
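The table above reports corpus-level BLEU. As a rough guide to reproducing such a number, the sketch below scores the pipeline's output with sacrebleu. It is not part of the original release: `sacrebleu` must be installed separately (`pip install sacrebleu`), and the Swedish reference string is a hypothetical placeholder standing in for a real test set.

```python
# Minimal BLEU-scoring sketch (assumptions: sacrebleu is installed and you have
# your own German/Swedish test pairs; the reference below is only a placeholder).
import sacrebleu
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_sv"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/legal_t5_small_trans_de_sv", do_lower_case=False),
    device=0,
)

sources = ["Betrifft: Leader-Programm"]           # German test inputs
references = [["Angående: Leader-programmet"]]    # one reference stream; placeholder text

# TranslationPipeline returns a list of dicts with a "translation_text" key.
hypotheses = [out["translation_text"] for out in pipeline(sources, max_length=512)]
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```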
{"language": "Deustch Swedish", "tags": ["translation Deustch Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Betrifft: Leader-Programm"}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_sv
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_sv model
=====================================

Model for translating legal text from Deutsch to Swedish. It was first released in this repository. This model is trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_sv is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_sv model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has about 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.11662249267101288, 0.05432651937007904, -0.0020085095893591642, 0.08407068997621536, 0.08778147399425507, -0.0059608216397464275, 0.06755415350198746, 0.0869879201054573, -0.08136305212974548, 0.07068850100040436, 0.07647863775491714, 0.03556754067540169, 0.09968652576208115, 0.10895996540784836, 0.0442098006606102, -0.24733015894889832, 0.030784137547016144, -0.02898431569337845, -0.008436248637735844, 0.12943971157073975, 0.12752428650856018, -0.07961934059858322, 0.011048704385757446, -0.030293071642518044, -0.0852554589509964, -0.0035673135425895452, -0.0620538629591465, -0.06028959900140762, 0.07851358503103256, 0.04432741552591324, 0.1092027872800827, 0.026386931538581848, 0.09396311640739441, -0.160259410738945, -0.005227915942668915, 0.061545029282569885, 0.049872394651174545, 0.025898965075612068, 0.07000695914030075, 0.016641264781355858, 0.16335459053516388, -0.026887206360697746, 0.05645093321800232, 0.007872901856899261, -0.09043141454458237, -0.14962470531463623, -0.0644179955124855, 0.030648089945316315, 0.14230355620384216, 0.14772407710552216, -0.06787775456905365, 0.05598233640193939, -0.12229819595813751, 0.07370893657207489, 0.05904802680015564, -0.2615950405597687, -0.06889579445123672, 0.05704069882631302, 0.05736282467842102, 0.0917038768529892, -0.048995133489370346, -0.026684775948524475, 0.0405275784432888, 0.04204833135008812, 0.024140119552612305, -0.02583818882703781, -0.010613278485834599, -0.013971191830933094, -0.17236170172691345, -0.07791953533887863, 0.17105865478515625, -0.005244239699095488, -0.0721372440457344, -0.10090693831443787, -0.01864774152636528, -0.1396678388118744, 0.016449980437755585, -0.06823597848415375, 0.04762827977538109, -0.01309168990701437, 0.03866688534617424, -0.03743172809481621, -0.1269206404685974, -0.0994882583618164, 0.047067925333976746, 0.06286223977804184, 0.07834478467702866, -0.012329730205237865, 0.024832775816321373, 0.14575138688087463, -0.007441849447786808, -0.09400396049022675, -0.031353261321783066, -0.005861454177647829, -0.1165638267993927, -0.028721416369080544, -0.03949112817645073, -0.18951140344142914, -0.054435957223176956, 0.1013236716389656, -0.015026773326098919, 0.051382165402173996, 0.039294254034757614, 0.042201317846775055, 0.012677759863436222, 0.17338399589061737, -0.08655000478029251, -0.07923419773578644, -0.06446840614080429, 0.03875825181603432, -0.045917220413684845, 0.013286170549690723, -0.023214060813188553, -0.015705453231930733, 0.07996407896280289, 0.06462632864713669, -0.08680567145347595, 0.008740472607314587, -0.03134850040078163, -0.013868508860468864, 0.02597835473716259, -0.12380314618349075, -0.045905619859695435, -0.01058349758386612, -0.10249405354261398, -0.03941723331809044, 0.0813458040356636, 0.0006082505569793284, -0.1371718943119049, 0.08235716074705124, -0.011748201213777065, -0.014129259623587132, -0.11587881296873093, -0.11761091649532318, -0.017352523282170296, -0.0808652713894844, -0.03359793499112129, -0.06167232245206833, -0.1511072963476181, -0.11728079617023468, 0.07975980639457703, -0.05681373551487923, -0.0515098012983799, -0.10649114847183228, -0.02836942858994007, 0.0025109481066465378, -0.05008840188384056, 0.13151775300502777, -0.029850026592612267, 0.07726652920246124, -0.023370862007141113, 0.051286254078149796, 0.13768677413463593, 0.07264994829893112, -0.11112965643405914, 0.00532595906406641, -0.11998212337493896, 0.17913587391376495, -0.059448547661304474, -0.00464056059718132, -0.14173786342144012, -0.08434715867042542, -0.0435834601521492, 
0.0744984969496727, 0.10793448984622955, 0.13014400005340576, -0.1682683825492859, -0.012433063238859177, 0.20036569237709045, -0.10526519268751144, -0.02576710470020771, 0.10645890980958939, -0.03261367231607437, 0.1359051614999771, 0.10354934632778168, 0.19885249435901642, 0.05405471473932266, -0.08138299733400345, -0.00539505947381258, -0.013849463313817978, -0.018612416461110115, -0.007028424181044102, 0.08752463012933731, -0.054225049912929535, -0.049060821533203125, 0.00441166153177619, -0.09646349400281906, 0.017331479117274284, -0.04207560420036316, -0.06380992382764816, 0.026472948491573334, -0.05338350683450699, -0.05570531636476517, 0.06543181836605072, 0.04530184343457222, -0.06444107741117477, -0.11289108544588089, 0.019705060869455338, 0.07731275260448456, -0.07198745757341385, 0.03643062338232994, -0.02638021670281887, -0.057498749345541, -0.07940595597028732, -0.01198283676058054, -0.15295007824897766, 0.019676290452480316, 0.030598929151892662, -0.016160456463694572, 0.041927456855773926, 0.0786990374326706, 0.04853237792849541, 0.06010519713163376, -0.009253386408090591, -0.04938560724258423, -0.0432310588657856, -0.039654847234487534, -0.12442240864038467, -0.11604655534029007, -0.023714782670140266, -0.029737843200564384, 0.08558882027864456, -0.17337864637374878, 0.020599110051989555, -0.0925753191113472, 0.04407133534550667, -0.014826850034296513, -0.045200835913419724, 0.04509970545768738, 0.04081128165125847, 0.03021945245563984, -0.0665113627910614, 0.04830307140946388, 0.022786151617765427, 0.008056496270000935, 0.10932477563619614, -0.07136683166027069, -0.15684917569160461, 0.07909473031759262, 0.054984450340270996, -0.15112212300300598, 0.013190500438213348, -0.03844216838479042, -0.06644342094659805, -0.05705730989575386, 0.008490641601383686, 0.23080672323703766, 0.022815149277448654, 0.13423235714435577, -0.10790734738111496, -0.036331888288259506, -0.005870850291103125, -0.03858582302927971, 0.006482656579464674, 0.15953046083450317, 0.05412472039461136, -0.10350683331489563, 0.0531684011220932, -0.013060850091278553, -0.03035997599363327, 0.19700303673744202, 0.00908493623137474, -0.1268078088760376, 0.026683049276471138, 0.06532086431980133, -0.033263321965932846, 0.10582888126373291, -0.12145905196666718, 0.011431206949055195, 0.018554219976067543, 0.051638729870319366, 0.0566457062959671, -0.1454579383134842, 0.013915841467678547, 0.050947483628988266, -0.06304961442947388, 0.023599229753017426, -0.005820946302264929, -0.06527052074670792, 0.06328471750020981, 0.013526268303394318, -0.038758836686611176, -0.014320495538413525, -0.039401713758707047, -0.13843800127506256, 0.21065695583820343, -0.06804482638835907, -0.15498928725719452, -0.11344146728515625, 0.11214914172887802, 0.04241536557674408, 0.001795757794752717, 0.06031420826911926, -0.09253938496112823, -0.058890193700790405, -0.10031767189502716, 0.12313316762447357, -0.028870942071080208, -0.06034216657280922, -0.10926268994808197, -0.009883866645395756, -0.0014544513542205095, -0.13102111220359802, 0.029932567849755287, -0.056178610771894455, -0.06525741517543793, 0.0031169201247394085, -0.043884024024009705, 0.07161938399076462, 0.12554647028446198, 0.0015072720125317574, 0.019309021532535553, -0.0029077487997710705, 0.1907551884651184, -0.12095341086387634, 0.025443600490689278, 0.06277062743902206, 0.023271622136235237, 0.008185184560716152, 0.12453911453485489, -0.00773664191365242, -0.08130515366792679, 0.037096474319696426, 0.06467744708061218, -0.03257392719388008, 
-0.28116127848625183, -0.026600701734423637, -0.018343539908528328, -0.021780606359243393, 0.10106092691421509, 0.04376716911792755, -0.011410399340093136, 0.05469059199094772, -0.015721764415502548, -0.029746104031801224, 0.04111659154295921, 0.04559191316366196, -0.027160553261637688, -0.004159865900874138, 0.08270571380853653, -0.04458538815379143, -0.005380915943533182, 0.0473640002310276, 0.008909397758543491, 0.23402880132198334, -0.0631791204214096, 0.08381062000989914, 0.07950394600629807, 0.0962795838713646, -0.007030075415968895, 0.08020208775997162, -0.025765463709831238, 0.007982470095157623, 0.008185802027583122, -0.03246237337589264, -0.07852842658758163, 0.0389409139752388, 0.006382305640727282, -0.00035800194018520415, -0.08976001292467117, -0.0014818648342043161, 0.013272366486489773, 0.33068832755088806, 0.07198718190193176, -0.20804788172245026, -0.08457804471254349, 0.008746431209146976, -0.07089285552501678, -0.0914841741323471, 0.056643273681402206, 0.09045170247554779, -0.1433347910642624, 0.004968807101249695, -0.040442414581775665, 0.1008690819144249, -0.08918610215187073, -0.047980062663555145, 0.05339007452130318, 0.05518977344036102, -0.0072129047475755215, 0.09348943084478378, -0.2877562642097473, 0.19265516102313995, -0.01833057776093483, 0.13553105294704437, -0.02060253545641899, 0.030027465894818306, -0.05519050359725952, 0.016205839812755585, 0.18048936128616333, 0.015386742539703846, -0.01413037721067667, -0.056894250214099884, -0.09872737526893616, 0.021289581432938576, 0.026043379679322243, -0.06553538143634796, 0.08219863474369049, 0.030077405273914337, 0.044583018869161606, -0.026566434651613235, -0.1095283254981041, -0.12536804378032684, -0.09298726916313171, -0.009886928834021091, -0.10096258670091629, 0.05751875415444374, -0.03502519801259041, -0.062154050916433334, -0.04545027017593384, 0.16065385937690735, -0.12108206748962402, -0.1262233555316925, -0.10209180414676666, 0.033872075378894806, 0.08324596285820007, -0.04582108557224274, -0.007213252130895853, 0.025279873982071877, 0.012759530916810036, -0.025884553790092468, 0.02362525835633278, 0.08903419226408005, -0.0629923939704895, -0.11540713161230087, -0.015131397172808647, 0.12768054008483887, 0.13467691838741302, 0.049683086574077606, -0.020650535821914673, 0.025433866307139397, -0.009083743207156658, -0.08327753096818924, 0.009492565877735615, 0.008164494298398495, 0.0543578639626503, 0.007857815362513065, -0.05181954801082611, -0.010578238405287266, -0.08469325304031372, -0.057357948273420334, 0.1121249869465828, 0.1419731229543686, -0.051504649221897125, 0.08950735628604889, 0.17852774262428284, -0.10401585698127747, -0.20090070366859436, 0.015121745876967907, 0.11741738021373749, 0.08950373530387878, -0.054443586617708206, -0.20549744367599487, 0.03579254448413849, 0.0876491367816925, 0.007286527194082737, -0.03470541536808014, -0.38163384795188904, -0.14000056684017181, 0.09763786941766739, 0.08744857460260391, -0.03162679448723793, -0.05252433940768242, -0.014897830784320831, 0.036663688719272614, -0.027745425701141357, 0.05673835426568985, -0.016876546666026115, 0.09721459448337555, 0.03162262961268425, -0.0339403972029686, 0.04998304322361946, -0.06274927407503128, 0.12021105736494064, 0.06292617321014404, 0.038892392069101334, -0.07021776586771011, 0.07487288862466812, -0.004487633239477873, -0.022603560239076614, 0.18887023627758026, 0.016307411715388298, 0.023496560752391815, -0.19081589579582214, -0.07023318111896515, -0.08573759347200394, 0.019982730969786644, 
-0.07019250094890594, -0.06809759885072708, -0.051689211279153824, 0.09855436533689499, 0.06453783810138702, -0.01797214336693287, 0.006408503744751215, -0.09723515808582306, -0.03038332797586918, 0.08203806728124619, 0.1126009076833725, 0.07070158421993256, -0.07877903431653976, 0.0287521630525589, 0.03562628850340843, 0.09588625282049179, -0.13717569410800934, -0.023158039897680283, 0.13136674463748932, -0.028461750596761703, 0.12571513652801514, -0.029604241251945496, -0.1439170241355896, 0.0052838874980807304, 0.03192148730158806, -0.10141465812921524, -0.12336273491382599, -0.012112298980355263, -0.11556577682495117, -0.02747865580022335, -0.04674194008111954, 0.08304806053638458, -0.1327441930770874, -0.00931193120777607, -0.01880049705505371, 0.0526031069457531, -0.07832728326320648, 0.22467385232448578, 0.04427328705787659, 0.06037059426307678, -0.07125166058540344, 0.14625975489616394, 0.08008979260921478, -0.11636251956224442, 0.04434745013713837, 0.16694065928459167, -0.10006681829690933, -0.054095860570669174, 0.022547269240021706, 0.15023404359817505, -0.03108307346701622, -0.07737395167350769, -0.05610064044594765, -0.0610690712928772, 0.05820682272315025, -0.02037765271961689, 0.048978958278894424, 0.01004608441144228, -0.028054434806108475, -0.008174508810043335, -0.11709457635879517, 0.07877366989850998, 0.09655890613794327, -0.008747109211981297, -0.023407699540257454, 0.17315825819969177, 0.03747629374265671, 0.0512050986289978, -0.015288219787180424, -0.01458376832306385, -0.037868350744247437, 0.058257751166820526, -0.020519185811281204, -0.030580133199691772, -0.051932234317064285, 0.003701451700180769, -0.03431611508131027, -0.008729740977287292, 0.002081485465168953, 0.02351321466267109, -0.06825895607471466, -0.028182243928313255, -0.05361185222864151, 0.029008125886321068, -0.0745873749256134, -0.01980549469590187, -0.019569477066397667, -0.04568904638290405, 0.06980885565280914, 0.01657673344016075, -0.005235564429312944, 0.02966180071234703, 0.004886730574071407, 0.08937475830316544, -0.026010410860180855, -0.006031393073499203, -0.007810298353433609, -0.06596498191356659, 0.01186206191778183, -0.004183971788734198, -0.01854328252375126, -0.00892761442810297, 0.0443250797688961, -0.13074462115764618, 0.0566624253988266, -0.016512414440512657, -0.009072980843484402, -0.07360263913869858, 0.13197150826454163, 0.02126341685652733, 0.09624695032835007, 0.12977126240730286, -0.08469033241271973, 0.0718035027384758, -0.134070485830307, -0.04004998505115509, 0.03655334189534187, 0.003043635282665491, -0.030869053676724434, -0.058225903660058975, 0.053043365478515625, -0.06310133635997772, 0.07047715783119202, 0.08555532991886139, 0.045272305607795715, 0.03694911301136017, -0.12847840785980225, 0.009841440245509148, 0.048937734216451645, 0.06451869755983353, -0.018696747720241547, 0.01759231463074684, -0.017217177897691727, 0.05206015333533287, -0.019435912370681763, 0.06867572665214539, 0.1433299332857132, 0.20627963542938232, 0.08240239322185516, 0.111725352704525, -0.04101015627384186, -0.09344307333230972, -0.11828193068504333, 0.11222187429666519, 0.008078884333372116, 0.031167667359113693, -0.01074601523578167, 0.12324804812669754, 0.09998661279678345, -0.16941377520561218, 0.07763848453760147, 0.009195422753691673, -0.11293654888868332, -0.09893602132797241, -0.09700535982847214, -0.04665002599358559, -0.05201190337538719, -0.0006117065786384046, -0.10104526579380035, 0.026968110352754593, 0.06178336963057518, 0.0850910097360611, -0.035429492592811584, 
0.15179869532585144, -0.00774140190333128, -0.07225661724805832, 0.09414129704236984, -0.013583033345639706, 0.09156852215528488, -0.07761701941490173, 0.004931809846311808, 0.02431660145521164, 0.0061523341573774815, 0.0392485186457634, 0.0013312484370544553, -0.03411339968442917, 0.014361259527504444, 0.04409008473157883, -0.045607950538396835, -0.00021109418594278395, 0.05464224889874458, 0.1264495551586151, 0.08434393256902695, 0.0767364501953125, -0.05094188451766968, -0.0337982103228569, 0.20397396385669708, -0.032180435955524445, -0.10724843293428421, -0.17700517177581787, 0.1617666631937027, 0.06419335305690765, 0.04581369459629059, 0.028834016993641853, -0.09680812060832977, 0.005179544445127249, 0.21882423758506775, 0.14528776705265045, -0.02903679385781288, -0.04324612021446228, 0.008034324273467064, -0.010209505446255207, 0.011790851131081581, 0.11432826519012451, 0.026384947821497917, 0.17527560889720917, -0.10643415153026581, 0.046381693333387375, -0.08461621403694153, -0.05943012610077858, -0.02002136781811714, 0.15995928645133972, 0.017460182309150696, -0.013144011609256268, -0.06093181297183037, 0.12461220473051071, -0.005594396498054266, -0.19072557985782623, 0.05916053429245949, -0.04232903942465782, -0.15018555521965027, -0.024843009188771248, 0.01790432631969452, 0.007380801253020763, 0.04602384194731712, 0.006816483568400145, 0.016057860106229782, 0.11495797336101532, 0.03647575527429581, -0.04818299785256386, -0.14013060927391052, 0.059799544513225555, 0.018610214814543724, 0.15075691044330597, 0.009864356368780136, 0.07233230024576187, 0.06857796013355255, 0.023955175653100014, -0.09635130316019058, 0.08716485649347305, 0.055927734822034836, 0.024902410805225372, 0.05431389808654785, 0.12777800858020782, -0.03371952846646309, 0.08812444657087326, 0.023860616609454155, -0.13275624811649323, 0.043254874646663666, -0.11128182709217072, -0.061304036527872086, -0.13778020441532135, 0.05842985957860947, -0.05139888450503349, 0.1252305954694748, 0.2278570532798767, -0.012203737162053585, 0.007610009051859379, -0.08623449504375458, 0.052370358258485794, -0.033248063176870346, 0.13755321502685547, 0.02254216931760311, -0.18090541660785675, 0.00033386488212272525, -0.058773934841156006, 0.018654102459549904, -0.18428786098957062, -0.020117808133363724, 0.007008090615272522, -0.07357832044363022, -0.014744916930794716, 0.12056673318147659, 0.035203926265239716, 0.050577133893966675, -0.024952344596385956, -0.0026115458458662033, 0.002182964701205492, 0.13712000846862793, -0.12280425429344177, -0.0963330790400505 ]
null
null
transformers
# legal_t5_small_trans_de_sv_small_finetuned model

Model for translating legal text from Deutsch to Swedish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

## Model description

legal_t5_small_trans_de_sv_small_finetuned was initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_de_sv_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model could be used for translation of legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_de_sv_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_de_sv", do_lower_case=False, skip_special_tokens=True),
    device=0
)

de_text = "Die Finanzkrise hat schonungslos offenbart, wo die Mängel in den Überwachungsverfahren der EU liegen, die eine wirksame Vorbeugung von Verstößen gegen die Haushaltsdisziplin, ausufernden Haushaltsdefiziten der Mitgliedstaaten, Ungleichgewichten im Handel und Unterschieden in der Wettbewerbsfähigkeit gewährleisten sollen."

pipeline([de_text], max_length=512)
```

## Training data

The legal_t5_small_trans_de_sv_small_finetuned model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 8 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has about 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the randomly masked portions of a sentence.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_de_sv_small_finetuned | 41.365|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
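The card names the unsupervised objective "masked language modelling" but does not spell out the corruption scheme. The sketch below illustrates one plausible reading, the standard T5-style span corruption with `<extra_id_*>` sentinel tokens; the 15% masking rate and mean span length are assumptions made for illustration, not documented training settings.

```python
# Hedged sketch of T5-style span corruption (an assumed reading of the card's
# "masked language modelling" objective; masking rate and span length are
# illustrative defaults, not values from the original training run).
import random

def span_corrupt(tokens, mask_rate=0.15, mean_span=3):
    """Return (corrupted_input, target) word lists in T5 sentinel style."""
    n_to_mask = max(1, int(len(tokens) * mask_rate))
    inp, tgt, i, sentinel = [], [], 0, 0
    while i < len(tokens):
        if n_to_mask > 0 and random.random() < mask_rate:
            span = min(mean_span, len(tokens) - i, n_to_mask)
            marker = f"<extra_id_{sentinel}>"
            inp.append(marker)                      # sentinel replaces the span in the input
            tgt.extend([marker] + tokens[i:i + span])  # target reconstructs the span
            sentinel += 1
            n_to_mask -= span
            i += span
        else:
            inp.append(tokens[i])
            i += 1
    return inp, tgt

src = "Die Kommission legt dem Rat einen Bericht vor".split()
print(span_corrupt(src))
```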
{"language": "Deustch Swedish", "tags": ["translation Deustch Swedish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Die Finanzkrise hat schonungslos offenbart, wo die M\u00e4ngel in den \u00dcberwachungsverfahren der EU liegen, die eine wirksame Vorbeugung von Verst\u00f6\u00dfen gegen die Haushaltsdisziplin, ausufernden Haushaltsdefiziten der Mitgliedstaaten, Ungleichgewichten im Handel und Unterschieden in der Wettbewerbsf\u00e4higkeit gew\u00e4hrleisten sollen."}]}
text2text-generation
SEBIS/legal_t5_small_trans_de_sv_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation Deustch Swedish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "Deustch Swedish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_de\_sv\_small\_finetuned model
=======================================================

Model for translating legal text from Deutsch to Swedish. It was first released in this repository. This model was first pretrained on all of the translation data with an unsupervised task, and then trained on three parallel corpora from jrc-acquis, europarl and dcep.

Model description
-----------------

legal\_t5\_small\_trans\_de\_sv\_small\_finetuned was initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_de\_sv\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model could be used for translation of legal texts from Deutsch to Swedish.

### How to use

Here is how to use this model to translate legal text from Deutsch to Swedish in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_de\_sv\_small\_finetuned model (the supervised task involved only the corresponding language pair, while the unsupervised task had the data of all language pairs available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 8 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has about 60 million parameters (see the model description above) and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained with 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the randomly masked portions of a sentence.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
[ "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation Deustch Swedish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from Deustch to Swedish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_de\\_sv\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 8 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06350648403167725, 0.06588605046272278, -0.0028573584277182817, 0.09236284345388412, 0.04689871519804001, -0.004562027286738157, 0.02756006456911564, 0.08651469647884369, -0.031870219856500626, 0.08229022473096848, 0.051462091505527496, -0.02517259493470192, 0.07753928005695343, 0.025413453578948975, 0.06519899517297745, -0.23904994130134583, 0.011468431912362576, -0.04788000136613846, -0.006239484064280987, 0.09027320146560669, 0.10015516728162766, -0.05578948184847832, 0.03889370709657669, -0.04050156846642494, -0.0023363798391073942, 0.026227127760648727, -0.09746627509593964, -0.03489917889237404, 0.08419253677129745, 0.08965226262807846, 0.0822862982749939, -0.009764873422682285, 0.07576870918273926, -0.1804664582014084, -0.005729071795940399, 0.05933672934770584, -0.000823241367470473, 0.027586884796619415, 0.13032539188861847, 0.024746853858232498, 0.1751563847064972, -0.053086426109075546, 0.02679917961359024, 0.02968987077474594, -0.09880968183279037, -0.13211309909820557, -0.04868335649371147, 0.0153065649792552, 0.07891371101140976, 0.13539673388004303, -0.0627150759100914, 0.0525638647377491, -0.058101836591959, 0.08378507941961288, 0.06396152079105377, -0.20715510845184326, -0.035472143441438675, 0.05862300097942352, 0.06096937134861946, 0.08874889463186264, -0.04565820097923279, -0.005991966929286718, 0.05656421184539795, 0.08245103806257248, 0.036951255053281784, -0.0649084597826004, -0.06793806701898575, -0.05696135759353638, -0.1337437927722931, -0.0401747040450573, 0.15985220670700073, 0.018952662125229836, -0.04799612611532211, -0.08790337294340134, -0.04784408584237099, -0.059137653559446335, 0.0030722387600690126, -0.049324531108140945, 0.03019830957055092, -0.010017236694693565, 0.08020389080047607, -0.03169430419802666, -0.12172982096672058, -0.04714043810963631, -0.05674470588564873, 0.10859723389148712, 0.03414410352706909, 0.010388447903096676, 0.040339380502700806, 0.06996823102235794, -0.13458886742591858, -0.08801434189081192, -0.00973538402467966, 0.006766199134290218, -0.10832002758979797, -0.00685947947204113, -0.018895704299211502, -0.21192635595798492, -0.007899931631982327, 0.03823037073016167, -0.05637454614043236, 0.047003645449876785, 0.07693452388048172, 0.04192877188324928, 0.06596599519252777, 0.12026087939739227, -0.12742774188518524, -0.138021320104599, -0.033546581864356995, -0.014968338422477245, 0.02018793299794197, 0.00980965793132782, -0.06240829452872276, -0.04207095876336098, 0.023356152698397636, 0.01839112490415573, 0.005911451298743486, 0.02466934360563755, -0.012920013628900051, -0.028026796877384186, 0.10255833715200424, -0.1112758219242096, -0.013736110180616379, -0.007954609580338001, -0.09292716532945633, -0.030070390552282333, 0.0749906525015831, -0.005177807528525591, -0.1210286021232605, 0.09504672884941101, -0.017163794487714767, -0.024998866021633148, -0.09465773403644562, -0.1850857436656952, 0.002170246560126543, -0.008329167030751705, -0.049401916563510895, -0.09939766675233841, -0.12010064721107483, -0.107056625187397, 0.044519443064928055, -0.06293720752000809, -0.004996211268007755, -0.07512892782688141, -0.012442355044186115, -0.001899314345791936, -0.024624237790703773, 0.11191951483488083, -0.04492496699094772, 0.018589351326227188, -0.0012580720940604806, 0.07545523345470428, 0.014961755834519863, 0.028894145041704178, -0.13145552575588226, 0.03739417716860771, -0.16292370855808258, 0.1506161242723465, -0.0591854453086853, 0.005361165851354599, -0.11908168345689774, -0.057372044771909714, -0.06898593157529831, 
0.07046496868133545, 0.06498178094625473, 0.1189800426363945, -0.21162107586860657, -0.008154558017849922, 0.1834850013256073, -0.10202571749687195, -0.06763347238302231, 0.12454578280448914, -0.011389418505132198, 0.03419124335050583, 0.09524154663085938, 0.13864248991012573, 0.07078167051076889, -0.013272011652588844, -0.050347279757261276, 0.026979003101587296, 0.010028809309005737, 0.07142053544521332, 0.08796462416648865, -0.07438413053750992, 0.051085326820611954, 0.0285410787910223, 0.011438277550041676, -0.010451992973685265, -0.0026680214796215296, -0.029443982988595963, 0.01669895090162754, -0.04332997649908066, -0.056502316147089005, 0.038271546363830566, 0.00029813614673912525, -0.06873693317174911, -0.07440042495727539, -0.021024323999881744, 0.08724610507488251, -0.05543486401438713, 0.02597511000931263, 0.018436873331665993, -0.06382332742214203, -0.11043915897607803, 0.012108566239476204, -0.15487442910671234, -0.032372090965509415, 0.019880549982190132, -0.07003670185804367, 0.10581380873918533, 0.09563339501619339, 0.06176584213972092, 0.09368237853050232, -0.06336510926485062, -0.024811003357172012, 0.005486427806317806, -0.01921987719833851, -0.09851846098899841, -0.12833267450332642, -0.03436395525932312, -0.029081065207719803, -0.020634757354855537, -0.10995509475469589, -0.013526290655136108, -0.06153680384159088, 0.08468106389045715, 0.0064368415623903275, -0.018777888268232346, 0.041143227368593216, 0.0682208314538002, -0.02320154383778572, -0.039593786001205444, 0.02267412841320038, -0.015973681584000587, -0.06188935786485672, 0.1080315038561821, -0.13603338599205017, -0.11759159713983536, 0.07496252655982971, 0.010165398009121418, -0.12825950980186462, -0.0032221083529293537, -0.01172656286507845, -0.06737004965543747, -0.05841485783457756, -0.05791619420051575, 0.23108528554439545, 0.04027007892727852, 0.1246902346611023, -0.10945850610733032, -0.019975557923316956, 0.00933782383799553, -0.038860611617565155, -0.0022350051440298557, 0.18387435376644135, 0.07181723415851593, -0.15880392491817474, 0.08756458014249802, -0.022490205243229866, -0.036061640828847885, 0.14170628786087036, 0.06892482191324234, -0.10174840688705444, 0.003961620852351189, 0.034266047179698944, -0.0031542631331831217, 0.07376728951931, -0.07851766794919968, 0.005243191961199045, 0.02930280938744545, 0.05467090383172035, 0.06071683391928673, -0.07206854224205017, 0.05278487876057625, 0.05788511037826538, -0.03451460227370262, 0.04854901134967804, -0.042507633566856384, -0.04515988379716873, 0.08577502518892288, 0.019838087260723114, -0.04330669343471527, -0.044189922511577606, -0.05135337635874748, -0.11234351992607117, 0.19983726739883423, -0.0656123235821724, -0.22405681014060974, -0.12187957763671875, 0.09100136905908585, -0.05407121777534485, 0.04226967692375183, 0.04034241661429405, -0.04223809391260147, -0.08315273374319077, -0.13121773302555084, 0.10603757202625275, -0.058973416686058044, -0.05574014410376549, -0.1433756798505783, 0.026523450389504433, 0.013426347635686398, -0.12049052119255066, 0.022155875340104103, -0.012312508188188076, -0.007471625693142414, 0.0024806694127619267, -0.025133756920695305, 0.11309091001749039, 0.11018498241901398, -0.0056762490421533585, -0.03702087327837944, 0.004525390919297934, 0.13338138163089752, -0.09121458232402802, 0.07730411738157272, 0.05397018417716026, 0.02220430225133896, 0.029227446764707565, 0.16795076429843903, 0.02401014044880867, -0.0566183440387249, 0.03872155025601387, 0.0681304857134819, -0.03153907135128975, 
-0.22317562997341156, -0.10750365257263184, -0.06520698964595795, 0.03489048779010773, 0.11178062856197357, 0.0414370633661747, -0.06928716599941254, 0.03383743017911911, -0.05666843801736832, 0.0331270657479763, 0.0031591500155627728, 0.0489313118159771, 0.020967183634638786, -0.01787477917969227, 0.08775588124990463, -0.055724989622831345, -0.04368261247873306, 0.1012803316116333, 0.027137547731399536, 0.17747850716114044, -0.06395218521356583, 0.17117434740066528, 0.051959384232759476, 0.017954876646399498, 0.0034749857150018215, 0.0584336593747139, -0.03899010270833969, 0.014717228710651398, -0.01914425939321518, -0.06549546122550964, -0.040004875510931015, 0.06401173025369644, 0.010324645787477493, 0.0051611983217298985, -0.036549657583236694, -0.047086361795663834, 0.04720745608210564, 0.20488058030605316, 0.06580904871225357, -0.17759089171886444, -0.07448477298021317, 0.014804293401539326, -0.06878922134637833, -0.06382869929075241, 0.011987751349806786, 0.15364132821559906, -0.09828690439462662, 0.040080200880765915, 0.00930258259177208, 0.11866743117570877, -0.11991025507450104, -0.009020170196890831, 0.03809084743261337, 0.051680587232112885, -0.014721845276653767, 0.13031531870365143, -0.2385081797838211, 0.13877426087856293, 0.009271711111068726, 0.059887681156396866, -0.045892149209976196, 0.01817186549305916, -0.04108019918203354, 0.01503133587539196, 0.13441133499145508, 0.029850143939256668, -0.04195995628833771, -0.10515616089105606, -0.09314662963151932, 0.004310720134526491, 0.05130250006914139, -0.06680694967508316, 0.09000201523303986, 0.05729398503899574, 0.02702697366476059, -0.0313568040728569, 0.01066097617149353, -0.0543535016477108, -0.1450071483850479, 0.006959205027669668, -0.043163787573575974, -0.019148381426930428, -0.008374759927392006, -0.031085923314094543, -0.09146541357040405, 0.20192743837833405, -0.10335239768028259, -0.10394895821809769, -0.08023067563772202, 0.010664630681276321, 0.12842266261577606, -0.07353147119283676, -0.004393272567540407, 0.006779536139219999, 0.04809169843792915, -0.04616523161530495, -0.020936470478773117, 0.08376381546258926, -0.08067641407251358, -0.09400282800197601, -0.048048440366983414, 0.10789357125759125, 0.08478173613548279, 0.03992835059762001, -0.009261511266231537, 0.04422619566321373, -0.013166341930627823, -0.12229868769645691, -0.019786594435572624, 0.03356869891285896, 0.1127164214849472, 0.053057294338941574, -0.03118368424475193, -0.03449385613203049, -0.05638103559613228, -0.05987418815493584, 0.08507584780454636, 0.1566576063632965, -0.043160330504179, 0.04835249483585358, 0.20034024119377136, -0.09690073132514954, -0.21851103007793427, -0.05787075683474541, 0.09077024459838867, 0.08291161060333252, 0.0033979376312345266, -0.15783031284809113, 0.0591251365840435, 0.09804144501686096, 0.004356272052973509, 0.047037821263074875, -0.34994688630104065, -0.1449623703956604, 0.07075897604227066, 0.030498264357447624, -0.036545779556035995, -0.09526047855615616, -0.05385284870862961, -0.06328809261322021, -0.012513590045273304, 0.07330474257469177, -0.03875428065657616, 0.10106108337640762, 0.0033247547689825296, 0.028487948700785637, 0.053922995924949646, -0.043307118117809296, 0.13751837611198425, 0.031418029218912125, 0.03449404612183571, -0.08694544434547424, 0.09406600147485733, 0.006367331370711327, -0.018238011747598648, 0.18133001029491425, -0.057911425828933716, 0.04366351291537285, -0.12889572978019714, -0.06419722735881805, -0.0729072093963623, 0.04827048256993294, -0.040185559540987015, 
-0.08437288552522659, -0.06102375686168671, 0.05638336017727852, 0.05999545380473137, -0.010859294794499874, 0.012340439483523369, -0.08352319151163101, -0.006772306747734547, 0.1694934219121933, 0.12663516402244568, 0.026636840775609016, -0.1066058799624443, 0.03897899389266968, 0.0006141816265881062, 0.08053126186132431, -0.06062724068760872, 0.008766191080212593, 0.14253418147563934, 0.009472835808992386, 0.10556399077177048, -0.03042578138411045, -0.14224550127983093, -0.013219091109931469, 0.04865739494562149, -0.10088876634836197, -0.12391713261604309, -0.007481221109628677, -0.01019822433590889, -0.06461770832538605, -0.0345405712723732, 0.10887043178081512, -0.10959658026695251, -0.005620589014142752, 0.00546796852722764, 0.04476359486579895, -0.03701699152588844, 0.20174922049045563, 0.04002029821276665, 0.04516122117638588, -0.06798794865608215, 0.12778227031230927, 0.11161085218191147, -0.15237723290920258, 0.03202373906970024, 0.18539974093437195, -0.08885171264410019, -0.06732425093650818, 0.019290296360850334, 0.1365189105272293, -0.04118892177939415, -0.06685436517000198, -0.025487588718533516, -0.0666789561510086, 0.02394706755876541, -0.01864386536180973, 0.04729308560490608, 0.03004317171871662, -0.011426934972405434, -0.02208840847015381, -0.08901584148406982, 0.10255562514066696, 0.07199325412511826, 0.03218076750636101, -0.025682318955659866, 0.1261262148618698, 0.01482707541435957, -0.021892763674259186, -0.020155800506472588, 0.025810012593865395, -0.058223605155944824, 0.003311984008178115, -0.08541090786457062, 0.011337888427078724, -0.049948833882808685, -0.00035199098056182265, -0.026777010411024094, 0.0033173230476677418, -0.013602625578641891, 0.001821973011828959, -0.030123990029096603, -0.046844907104969025, -0.04995933175086975, 0.010841778479516506, -0.09377408772706985, -0.048073407262563705, 0.0015870570205152035, -0.008294889703392982, 0.04547937214374542, -0.007932990789413452, 0.00933933723717928, 0.007392245810478926, -0.005243290681391954, 0.08304201066493988, 0.01833345554769039, 0.039807528257369995, -0.006616069469600916, -0.06143368408083916, -0.010375632904469967, 0.024354925379157066, -0.012172868475317955, -0.012964664027094841, 0.01597439870238304, -0.14862672984600067, -0.0014598917914554477, -0.006986822467297316, -0.03509323298931122, -0.0781506896018982, 0.09874961525201797, 0.04153328761458397, 0.07102338969707489, 0.12665627896785736, -0.08422289043664932, 0.0845084860920906, -0.15992914140224457, -0.00801108218729496, 0.031847529113292694, -0.0008862693794071674, -0.02233460545539856, 0.00008653647819301113, 0.045129161328077316, -0.07653914391994476, 0.13849499821662903, 0.047925252467393875, 0.06586593389511108, 0.024042056873440742, -0.07046971470117569, 0.007781160529702902, 0.028971070423722267, 0.07646133750677109, -0.04218525439500809, -0.018830249086022377, -0.07933899760246277, 0.08855528384447098, -0.002283860696479678, 0.0590156726539135, 0.06020139157772064, 0.11538967490196228, 0.11360353231430054, 0.04711240902543068, 0.0015795815270394087, -0.09080658853054047, -0.07856514304876328, 0.05737338215112686, 0.008997534401714802, 0.04775888845324516, -0.0020787136163562536, 0.08043171465396881, 0.12587372958660126, -0.14561216533184052, 0.1249907985329628, -0.0058061727322638035, -0.08621393144130707, -0.05078086256980896, -0.1688983291387558, -0.056436195969581604, -0.00018259685020893812, -0.04054480418562889, -0.1214979961514473, 0.010854306630790234, 0.07243067771196365, 0.053221408277750015, -0.028811132535338402, 
0.14535601437091827, -0.06785031408071518, -0.11572957783937454, 0.046848125755786896, 0.019366659224033356, 0.08120804280042648, 0.03942013159394264, 0.026199160143733025, 0.057022158056497574, 0.02658882923424244, 0.024498632177710533, 0.04573144391179085, -0.008468130603432655, -0.01665731705725193, 0.011552872136235237, -0.05613359436392784, -0.038189392536878586, 0.018180906772613525, 0.08140869438648224, 0.1786683201789856, 0.05817068740725517, -0.07159052789211273, -0.029156116768717766, 0.18963831663131714, -0.06060686334967613, -0.09026103466749191, -0.10538730025291443, 0.21275222301483154, 0.02101626619696617, 0.05404732748866081, -0.008993112482130527, -0.1032341867685318, 0.001588816987350583, 0.13666129112243652, 0.2007857710123062, -0.04122177138924599, -0.025842616334557533, 0.0017338702455163002, -0.009762221947312355, 0.02903706021606922, 0.04592979699373245, 0.005881855264306068, 0.25411590933799744, -0.07813630998134613, 0.10198616981506348, -0.04982839897274971, 0.010585307143628597, -0.011965891346335411, 0.16597266495227814, 0.013388088904321194, 0.022998547181487083, -0.07737497240304947, 0.09110435098409653, -0.028692815452814102, -0.17979386448860168, 0.018801556900143623, -0.0714372768998146, -0.13127362728118896, 0.008133209310472012, 0.0005912591586820781, 0.05940696597099304, 0.06448588520288467, 0.016704322770237923, 0.05041233077645302, 0.05280681326985359, 0.015193017199635506, -0.1084534153342247, -0.1117393746972084, -0.0019699628464877605, 0.009098844602704048, 0.11441165953874588, 0.012309720739722252, 0.11867756396532059, 0.07871152460575104, 0.00671871704980731, -0.09489032626152039, 0.09480322897434235, 0.029488051310181618, 0.020497839897871017, 0.08679758012294769, 0.11364003270864487, -0.002843551803380251, 0.07870780676603317, 0.04311765730381012, -0.10240347683429718, 0.008186203427612782, -0.019845884293317795, -0.02478456310927868, -0.11085133254528046, 0.10264775156974792, -0.0461350753903389, 0.14661173522472382, 0.1959192007780075, -0.007641893345862627, -0.027054093778133392, -0.06166960671544075, 0.023176129907369614, -0.018352607265114784, 0.08351863920688629, -0.0021119958255439997, -0.15242376923561096, 0.012634861283004284, -0.06950949132442474, 0.02417885698378086, -0.23836153745651245, -0.022261347621679306, 0.015487081371247768, -0.050252046436071396, 0.007007529027760029, 0.07170787453651428, 0.03078487515449524, 0.039255231618881226, -0.04611064866185188, -0.05998188257217407, 0.019963949918746948, 0.0980011448264122, -0.1037014052271843, -0.12166177481412888 ]
null
null
transformers
# legal_t5_small_trans_en_cs model

Model for translating legal text from English to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora: JRC-Acquis, Europarl and DCEP.

## Model description

legal_t5_small_trans_en_cs is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_en_cs"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_en_cs", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

en_text = "1 In the countries concerned, this certainly affects the priority assigned to making progress on the issue of final disposal, particularly of highly radioactive waste and irradiated fuel elements."

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_trans_en_cs model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_en_cs | 50.177|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
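The BLEU figure in the table above invites a concrete scoring recipe. The sketch below shows how such a corpus-level BLEU could be computed with sacrebleu; the file names and the one-sentence-per-line format are assumptions for illustration, not the card's original evaluation pipeline.

```python
# Hypothetical BLEU scoring sketch with sacrebleu. The two file paths are
# assumed stand-ins: one model translation / one reference per line.
import sacrebleu

with open("hypotheses.cs", encoding="utf-8") as f:
    hypotheses = [line.strip() for line in f]
with open("references.cs", encoding="utf-8") as f:
    references = [line.strip() for line in f]

# sacrebleu expects a list of reference *streams*, hence the extra nesting.
bleu = sacrebleu.corpus_bleu(hypotheses, [references])
print(f"BLEU = {bleu.score:.3f}")
```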
{"language": "English Cszech", "tags": ["translation English Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "1 In the countries concerned, this certainly affects the priority assigned to making progress on the issue of final disposal, particularly of highly radioactive waste and irradiated fuel elements."}]}
text2text-generation
SEBIS/legal_t5_small_trans_en_cs
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_en\_cs model
=====================================

Model for translating legal text from English to Czech. It was first released in this repository. This model is trained on three parallel corpora: JRC-Acquis, Europarl and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_en\_cs is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_en\_cs model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
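This plain-text rendering strips the Python snippet that the markdown card carries under "How to use". A minimal equivalent is sketched here with the current AutoModelForSeq2SeqLM API rather than the deprecated AutoModelWithLMHead the card uses; the decoding settings are illustrative assumptions.

```python
# Minimal translation sketch for this checkpoint; max_length and truncation
# choices are assumptions, not documented decoding parameters.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/legal_t5_small_trans_en_cs"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

en_text = "1 In the countries concerned, this certainly affects the priority assigned to making progress on the issue of final disposal, particularly of highly radioactive waste and irradiated fuel elements."
inputs = tokenizer(en_text, return_tensors="pt", truncation=True, max_length=512)
outputs = model.generate(**inputs, max_length=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```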
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12435829639434814, 0.07688017189502716, -0.002261885441839695, 0.06537483632564545, 0.09388384222984314, 0.019121872261166573, 0.0739629939198494, 0.0959683284163475, -0.07118228822946548, 0.07365576177835464, 0.06606675684452057, 0.03420472517609596, 0.08967527747154236, 0.11492867767810822, 0.043745256960392, -0.22137711942195892, 0.02397894114255905, -0.012770353816449642, -0.0017759106121957302, 0.1438736468553543, 0.1158042699098587, -0.09313400834798813, 0.0212950948625803, -0.025782635435461998, -0.11751030385494232, -0.01625674031674862, -0.06002727895975113, -0.07137767225503922, 0.07277891039848328, 0.04005325958132744, 0.11663803458213806, 0.020643850788474083, 0.08209886401891708, -0.1729613095521927, -0.0003895926638506353, 0.08113354444503784, 0.052979324012994766, 0.04840153083205223, 0.06626252084970474, -0.010088608600199223, 0.15316228568553925, -0.01828937605023384, 0.06049087643623352, 0.023629052564501762, -0.11859956383705139, -0.12219031155109406, -0.0642453134059906, 0.044123467057943344, 0.15359896421432495, 0.14719705283641815, -0.05901701748371124, 0.07390401512384415, -0.1222255527973175, 0.0648498684167862, 0.07139023393392563, -0.2794821858406067, -0.06533797085285187, 0.03016212023794651, 0.04169971123337746, 0.08154375106096268, -0.05422484129667282, -0.04417897015810013, 0.03509863466024399, 0.025251181796193123, 0.0099901482462883, -0.023038474842905998, -0.003939746879041195, -0.00668970076367259, -0.1701805591583252, -0.10173196345567703, 0.1682213842868805, 0.003055686131119728, -0.07217023521661758, -0.09553340822458267, -0.03795655071735382, -0.16003160178661346, 0.006563137285411358, -0.04438823461532593, 0.03822930157184601, -0.0038889108691364527, 0.024170249700546265, -0.01831291802227497, -0.1223558709025383, -0.11373733729124069, 0.03318144753575325, 0.08706084638834, 0.09160614758729935, -0.008324709720909595, 0.004852138925343752, 0.1655685305595398, 0.027855675667524338, -0.09096655249595642, -0.019924884662032127, 0.003949158359318972, -0.12265513837337494, -0.022473355755209923, -0.02965482883155346, -0.134424090385437, -0.06840742379426956, 0.08927608281373978, -0.018147720023989677, 0.060362305492162704, 0.039468809962272644, 0.04456733912229538, 0.011492159217596054, 0.16967511177062988, -0.07832381874322891, -0.05124054104089737, -0.06110972911119461, 0.06124941259622574, -0.06273633241653442, 0.016562724485993385, -0.018922096118330956, -0.0019599103834480047, 0.06461942940950394, 0.07754412293434143, -0.08459768444299698, 0.011144956573843956, -0.045647237449884415, -0.02147163264453411, 0.04757611081004143, -0.1189405545592308, -0.03918564319610596, 0.005626767408102751, -0.10348451882600784, -0.033752214163541794, 0.08420949429273605, -0.013146448880434036, -0.130426824092865, 0.0561966672539711, -0.03392482176423073, -0.014058894477784634, -0.1318725347518921, -0.09548786282539368, -0.030449382960796356, -0.0669645294547081, -0.04606295004487038, -0.0615156888961792, -0.15782871842384338, -0.10444749891757965, 0.06437273323535919, -0.046669017523527145, -0.049953948706388474, -0.09188908338546753, -0.01875261589884758, -0.001738340244628489, -0.03493571653962135, 0.12673933804035187, -0.02390003763139248, 0.096649669110775, 0.012221741490066051, 0.043966371566057205, 0.1517641693353653, 0.07349351793527603, -0.10391224175691605, 0.012279927730560303, -0.0968932956457138, 0.17086704075336456, -0.019219273701310158, 0.003117050975561142, -0.14762111008167267, -0.07793448865413666, -0.05849536508321762, 0.0646631121635437, 
0.0947549045085907, 0.12979164719581604, -0.15181536972522736, -0.011463508941233158, 0.2136663943529129, -0.08490252494812012, -0.041605379432439804, 0.10182484239339828, -0.04489084705710411, 0.14003628492355347, 0.08819931745529175, 0.16658659279346466, 0.05088519677519798, -0.07585078477859497, -0.0012948414077982306, -0.034293558448553085, -0.0012717152712866664, -0.001756072393618524, 0.08487484604120255, -0.05142320692539215, -0.0688103437423706, -0.005853887181729078, -0.09532361477613449, 0.03228483349084854, -0.06640976667404175, -0.06516316533088684, 0.017299078404903412, -0.05920139700174332, -0.039251167327165604, 0.06045624613761902, 0.05258532986044884, -0.04898689314723015, -0.12430812418460846, -0.0004248663317412138, 0.10058624297380447, -0.06971825659275055, 0.027371102944016457, -0.06177351623773575, -0.04721341282129288, -0.07391596585512161, -0.011498041450977325, -0.17218811810016632, 0.04003841057419777, 0.047898441553115845, -0.00799752026796341, 0.04704909026622772, 0.04541198909282684, 0.035560380667448044, 0.05442158505320549, -0.001824751845560968, -0.05901378020644188, -0.037798766046762466, -0.040884729474782944, -0.13265405595302582, -0.10616928339004517, -0.03437827154994011, -0.02231123298406601, 0.10595816373825073, -0.18727584183216095, 0.03491741791367531, -0.09503099322319031, 0.045632727444171906, -0.015756115317344666, -0.05041629448533058, 0.040125709027051926, 0.038778774440288544, 0.027191031724214554, -0.061639394611120224, 0.04940161481499672, 0.03222045674920082, 0.024211609736084938, 0.09146775305271149, -0.11033784598112106, -0.1572064310312271, 0.08741045743227005, 0.05324101075530052, -0.14828535914421082, 0.008485325612127781, -0.041962023824453354, -0.06238606944680214, -0.05358589440584183, 0.008222618140280247, 0.24746686220169067, 0.01545485109090805, 0.14875435829162598, -0.10691000521183014, -0.046307314187288284, -0.009113967418670654, -0.02102411910891533, 0.019945053383708, 0.1467595249414444, 0.058993417769670486, -0.09158006310462952, 0.046810898929834366, 0.019109167158603668, -0.022820767015218735, 0.15158958733081818, -0.002852143021300435, -0.13102500140666962, 0.01203599851578474, 0.06876850128173828, -0.031262144446372986, 0.09152580797672272, -0.14387284219264984, -0.0011059502139687538, 0.020041441544890404, 0.04689567908644676, 0.05449135601520538, -0.1610454022884369, 0.025423087179660797, 0.05979025736451149, -0.053398557007312775, 0.013147258199751377, -0.015885325148701668, -0.047005537897348404, 0.07388226687908173, 0.008650155737996101, -0.02098711021244526, -0.017330240458250046, -0.0387837253510952, -0.14002950489521027, 0.20745691657066345, -0.06519066542387009, -0.1389491707086563, -0.10577137768268585, 0.09519948065280914, 0.07343807816505432, -0.007405419833958149, 0.04051390662789345, -0.08331228792667389, -0.04762672632932663, -0.09820033609867096, 0.09551910310983658, -0.05378587916493416, -0.052164699882268906, -0.09058783203363419, -0.010913875885307789, -0.008336779661476612, -0.1301647424697876, 0.035033244639635086, -0.04483438655734062, -0.0859474241733551, 0.007898859679698944, -0.05225270614027977, 0.06943400949239731, 0.1615801304578781, -0.005126830656081438, 0.022700151428580284, -0.012942801229655743, 0.185236394405365, -0.12975247204303741, 0.008632417768239975, 0.07362908869981766, 0.03438473492860794, 0.005028572864830494, 0.09793297201395035, -0.00903196632862091, -0.08601780980825424, 0.04732847586274147, 0.045056503266096115, -0.024293050169944763, -0.2829226851463318, 
-0.0256004948168993, -0.023534560576081276, -0.037599291652441025, 0.10906782001256943, 0.04027672857046127, 0.014322258532047272, 0.04621586576104164, -0.023705754429101944, -0.008317547850310802, 0.024101171642541885, 0.0515875369310379, -0.023265914991497993, 0.004520696587860584, 0.07578714936971664, -0.04947580769658089, -0.009991729632019997, 0.04166480153799057, 0.00737354252487421, 0.24958793818950653, -0.05524986609816551, 0.12842218577861786, 0.0811401754617691, 0.11679662019014359, 0.013028223998844624, 0.07850068807601929, -0.0287654809653759, 0.017674334347248077, -0.0034636240452528, -0.026625720784068108, -0.07533217966556549, 0.03461961820721626, 0.019883379340171814, 0.014675854705274105, -0.10683194547891617, -0.019039779901504517, 0.014077328145503998, 0.3381880223751068, 0.0655839592218399, -0.23084957897663116, -0.0653437152504921, 0.0009129546233452857, -0.07319145649671555, -0.09193842858076096, 0.06352538615465164, 0.08777365833520889, -0.13792788982391357, -0.025465767830610275, -0.027924908325076103, 0.09981462359428406, -0.10169891268014908, -0.05303443595767021, 0.0485200397670269, 0.041801102459430695, -0.008865990675985813, 0.0904184877872467, -0.29283103346824646, 0.20078696310520172, -0.012897904962301254, 0.126972034573555, -0.013362203724682331, 0.021845491603016853, -0.050247181206941605, 0.00276092067360878, 0.15708748996257782, -0.00298939342610538, 0.02175017073750496, -0.07442962378263474, -0.10679879784584045, 0.021116359159350395, 0.04032932221889496, -0.06812015920877457, 0.09340298920869827, 0.025439273566007614, 0.0301519762724638, -0.010023018345236778, -0.1045193076133728, -0.12949411571025848, -0.10481718182563782, -0.010213217698037624, -0.08164060860872269, 0.05793024227023125, -0.04283486679196358, -0.05509273335337639, -0.008932151831686497, 0.14423805475234985, -0.12073543667793274, -0.09224896132946014, -0.09355348348617554, 0.017754940316081047, 0.09929752349853516, -0.04704746603965759, -0.0020178300328552723, 0.015061601996421814, 0.012746247462928295, 0.00362616335041821, 0.017376838251948357, 0.09272827208042145, -0.06187351793050766, -0.13112719357013702, -0.041390836238861084, 0.1369902491569519, 0.12269904464483261, 0.06368113309144974, -0.030978873372077942, 0.012279603630304337, -0.015895605087280273, -0.07509513199329376, 0.004196814727038145, 0.007738279178738594, 0.05426532030105591, 0.02597114071249962, -0.06514151394367218, -0.0096528809517622, -0.09800337255001068, -0.053531963378190994, 0.10806799679994583, 0.14781762659549713, -0.04861140623688698, 0.06765954941511154, 0.173972025513649, -0.1089751124382019, -0.1606598198413849, 0.015109146945178509, 0.09692424535751343, 0.08143658936023712, -0.07480478286743164, -0.21645084023475647, 0.02561251074075699, 0.08324026316404343, 0.007759904954582453, -0.017731787636876106, -0.41344553232192993, -0.1342431604862213, 0.11042707413434982, 0.08981383591890335, -0.042147278785705566, -0.08849861472845078, -0.01956024579703808, 0.04573401063680649, -0.04351049289107323, 0.08325368165969849, -0.01970900595188141, 0.0940452367067337, 0.022877609357237816, -0.04044003412127495, 0.0414416529238224, -0.05907901003956795, 0.11271461099386215, 0.07087132334709167, 0.045402321964502335, -0.042670972645282745, 0.030949227511882782, -0.0005929834442213178, -0.021202387288212776, 0.15517984330654144, 0.02559458091855049, 0.03832110017538071, -0.21009308099746704, -0.06139615550637245, -0.08595827221870422, -0.01124134287238121, -0.06923690438270569, -0.048324104398489, 
-0.046102847903966904, 0.0843573734164238, 0.04809545353055, -0.006432775873690844, -0.01719067245721817, -0.07389065623283386, -0.01493228692561388, 0.08504356443881989, 0.1070697009563446, 0.06461766362190247, -0.08504050225019455, 0.015815766528248787, 0.037779301404953, 0.09546513855457306, -0.156009241938591, -0.02661701664328575, 0.12376070767641068, -0.013610606081783772, 0.13754555583000183, -0.00880371406674385, -0.14461565017700195, 0.007406237535178661, 0.03196627274155617, -0.09184940904378891, -0.11774326860904694, -0.007097234483808279, -0.056963708251714706, -0.05026835575699806, -0.05374588072299957, 0.06605616211891174, -0.11288545280694962, -0.021201761439442635, -0.016088612377643585, 0.034405287355184555, -0.068463534116745, 0.2204376608133316, 0.04573448374867439, 0.06384490430355072, -0.06662505865097046, 0.12733717262744904, 0.10818128287792206, -0.11592976748943329, 0.019989341497421265, 0.1723465919494629, -0.10000479221343994, -0.0533822625875473, 0.015414398163557053, 0.12981967628002167, -0.011325663886964321, -0.06933771818876266, -0.052741795778274536, -0.04832736402750015, 0.06228150427341461, 0.00950185302644968, 0.04416758194565773, 0.024966083467006683, -0.03321557492017746, -0.0015273751923814416, -0.126829132437706, 0.06579847633838654, 0.09882751107215881, 0.000023505630451836623, -0.019515788182616234, 0.19441531598567963, 0.05521371215581894, 0.05579959228634834, -0.006965458393096924, -0.040760576725006104, -0.04482460767030716, 0.07416525483131409, -0.007924449630081654, -0.02466718852519989, -0.0561465322971344, -0.016189701855182648, -0.023580769076943398, -0.00791238248348236, 0.00015788951714057475, 0.015587596222758293, -0.06935379654169083, -0.029080264270305634, -0.03763905167579651, 0.04404890537261963, -0.0682157427072525, -0.00591855077072978, -0.000010756294614111539, -0.06442283093929291, 0.0742969885468483, 0.026844214648008347, -0.00620326679199934, 0.015516476705670357, -0.01007227972149849, 0.07111482322216034, -0.028324492275714874, 0.0023616652470082045, -0.0171003770083189, -0.08223126828670502, 0.036869343370199203, -0.008981465362012386, -0.007503452245146036, -0.009382849559187889, 0.050518449395895004, -0.1344597190618515, 0.060666296631097794, -0.020387746393680573, -0.004129255190491676, -0.07703971117734909, 0.1103174239397049, 0.023141486570239067, 0.08472371101379395, 0.10963691025972366, -0.06868469715118408, 0.06736709177494049, -0.14670780301094055, -0.0400148443877697, 0.02351764775812626, 0.017530066892504692, -0.04133197292685509, -0.06319855153560638, 0.05972176417708397, -0.049151502549648285, 0.0702812671661377, 0.056216128170490265, 0.038101676851511, 0.03426769748330116, -0.11714990437030792, 0.002548426855355501, 0.04635220021009445, 0.06674882024526596, 0.003132604295387864, 0.010972512885928154, 0.0060722134076058865, 0.04828231781721115, -0.020608365535736084, 0.0710490345954895, 0.13707448542118073, 0.2105681449174881, 0.08906228840351105, 0.10461417585611343, -0.054012689739465714, -0.11043768376111984, -0.09207896888256073, 0.10162654519081116, -0.018310150131583214, 0.024906937032938004, -0.025350511074066162, 0.15535928308963776, 0.09516076743602753, -0.1586180329322815, 0.0726439356803894, 0.0053917341865599155, -0.10816395282745361, -0.08730435371398926, -0.05885655805468559, -0.03158354014158249, -0.06461071223020554, -0.0058564189821481705, -0.09001718461513519, 0.0277395099401474, 0.07708007097244263, 0.07263654470443726, -0.03402344509959221, 0.14141716063022614, 0.013553565368056297, 
-0.06447570770978928, 0.09458612650632858, -0.013342974707484245, 0.1010856032371521, -0.0972558856010437, 0.006843253970146179, 0.021755704656243324, -0.015253358520567417, 0.06570976227521896, 0.011033539660274982, -0.044562675058841705, 0.03523116558790207, 0.042699944227933884, -0.05231313779950142, -0.00007202204142231494, 0.042666301131248474, 0.12373162060976028, 0.10260580480098724, 0.07263926416635513, -0.03988558426499367, -0.02808861806988716, 0.20605424046516418, -0.03757944703102112, -0.09756171703338623, -0.17005802690982819, 0.16313773393630981, 0.07304780185222626, 0.012926449999213219, 0.038080815225839615, -0.09516536444425583, -0.003050356637686491, 0.23971207439899445, 0.1295345425605774, -0.05363483726978302, -0.042937058955430984, 0.03259224444627762, -0.007438600994646549, 0.010334674268960953, 0.1118079200387001, 0.04926573485136032, 0.19715982675552368, -0.10835321247577667, 0.034169260412454605, -0.07650637626647949, -0.06587786227464676, -0.012499283999204636, 0.15389195084571838, 0.0064619253389537334, -0.012285787612199783, -0.05583719536662102, 0.10421010106801987, -0.007945479825139046, -0.18256552517414093, 0.0572688952088356, -0.06305442750453949, -0.1296430379152298, -0.022497769445180893, -0.001988847739994526, 0.005520745646208525, 0.046324487775564194, -0.005716721527278423, -0.00337971025146544, 0.1332460641860962, 0.03168005868792534, -0.058573782444000244, -0.14708393812179565, 0.07702425867319107, -0.02034917101264, 0.16995571553707123, -0.002095058560371399, 0.08800439536571503, 0.06988479942083359, 0.024362796917557716, -0.10552965849637985, 0.08194196969270706, 0.050011999905109406, 0.030085083097219467, 0.050395477563142776, 0.11881768703460693, -0.024304799735546112, 0.08329147100448608, 0.02760995179414749, -0.11041998863220215, 0.06198390573263168, -0.13673049211502075, -0.04459472373127937, -0.16517393290996552, 0.030090298503637314, -0.0388604998588562, 0.12800873816013336, 0.2138105034828186, -0.027817459776997566, 0.008393079042434692, -0.06352043896913528, 0.0339859314262867, -0.02861860580742359, 0.14776577055454254, 0.009183966554701328, -0.18284733593463898, 0.011688304133713245, -0.0680966004729271, 0.029153481125831604, -0.21836909651756287, -0.023495178669691086, 0.012011533603072166, -0.08504421263933182, -0.027431799098849297, 0.12685523927211761, 0.04428469017148018, 0.06494280695915222, -0.03079455904662609, -0.016664883121848106, -0.01218866091221571, 0.1460549384355545, -0.1328074187040329, -0.09737173467874527 ]
null
null
transformers
# legal_t5_small_trans_en_cs_small_finetuned model

Model for translating legal text from English to Czech. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all the translation data with an unsupervised task. Then the model was trained on three parallel corpora: JRC-Acquis, Europarl and DCEP.

## Model description

legal_t5_small_trans_en_cs_small_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_en_cs_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_en_cs_small_finetuned"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_en_cs", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

en_text = "Members present for the final vote"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_trans_en_cs_small_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_en_cs_small_finetuned | 50.394|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
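The card names "masked language modelling" as the unsupervised pretraining task but does not spell out how examples are formed. The snippet below is a rough, self-contained illustration of T5-style sentinel masking; the 15% masking rate and the single-token spans are simplifying assumptions, not the model's documented settings.

```python
# Illustrative T5-style corruption: replace randomly chosen tokens in the
# input with sentinel markers and collect them as the target sequence.
# Real T5 span corruption merges consecutive masked tokens under one
# sentinel; this sketch masks token by token for simplicity.
import random

def span_corrupt(tokens, mask_rate=0.15, seed=0):
    rng = random.Random(seed)
    inputs, targets = [], []
    sentinel = 0
    for tok in tokens:
        if rng.random() < mask_rate:
            marker = f"<extra_id_{sentinel}>"
            inputs.append(marker)          # masked position in the input
            targets.extend([marker, tok])  # model must reconstruct the token
            sentinel += 1
        else:
            inputs.append(tok)
    return " ".join(inputs), " ".join(targets)

src, tgt = span_corrupt("Members present for the final vote".split())
print(src)  # e.g. "Members <extra_id_0> for the final vote"
print(tgt)  # e.g. "<extra_id_0> present"
```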
{"language": "English Cszech", "tags": ["translation English Cszech model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Members present for the final vote"}]}
text2text-generation
SEBIS/legal_t5_small_trans_en_cs_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Cszech model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Cszech" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_en\_cs\_small\_finetuned model
=======================================================

Model for translating legal text from English to Czech. It was first released in this repository. This model was first pretrained on all the translation data with an unsupervised task. Then the model was trained on three parallel corpora: JRC-Acquis, Europarl and DCEP.

Model description
-----------------

legal\_t5\_small\_trans\_en\_cs\_small\_finetuned is initially pretrained on an unsupervised task with all of the data of the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_en\_cs\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

Intended uses & limitations
---------------------------

The model can be used for translation of legal texts from English to Czech.

### How to use

Here is how to use this model to translate legal text from English to Czech in PyTorch:

Training data
-------------

The legal\_t5\_small\_trans\_en\_cs\_small\_finetuned model (trained on the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.

Training procedure
------------------

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

The pre-training data was the combined data from all 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.

Evaluation results
------------------

When the model is used on the translation test dataset, it achieves the following results:

Test results :

### BibTeX entry and citation info

> Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn
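The Preprocessing paragraph above can be made concrete with the sentencepiece library. The corpus path, vocabulary size, and character coverage below are assumptions for illustration; only `model_type="unigram"` mirrors what the card describes.

```python
# Hypothetical sketch of building the shared subword vocabulary described
# above. "parallel_corpus_all_pairs.txt" is an assumed stand-in for the 88M
# lines of parallel text; vocab_size=32000 is assumed (t5-small uses 32k).
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="parallel_corpus_all_pairs.txt",  # one sentence per line (assumed)
    model_prefix="legal_t5_vocab",
    model_type="unigram",
    vocab_size=32000,
    character_coverage=1.0,
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_vocab.model")
print(sp.encode("Members present for the final vote", out_type=str))
```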
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Cszech model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Cszech in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_cs\\_small\\_finetuned (the supervised task which involved only the corresponding langauge pair and as well as unsupervised task where all of the data of all language pairs were available) model was trained on JRC-ACQUIS, EUROPARL, and DCEP dataset consisting of 5 Million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nAn unigram model trained with 88M lines of text from the parallel corpus (of all possible language pairs) to get the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used for translation test dataset, achieves the following results:\n\n\nTest results :### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06787015497684479, 0.08653593808412552, -0.00323608354665339, 0.07565553486347198, 0.05260827764868736, 0.016120366752147675, 0.033146992325782776, 0.09569081664085388, -0.025308027863502502, 0.08505710959434509, 0.041740622371435165, -0.02368091605603695, 0.06908944994211197, 0.03284408897161484, 0.06385549157857895, -0.21190570294857025, 0.0015153661370277405, -0.035987939685583115, -0.000488363963086158, 0.10307835042476654, 0.09047136455774307, -0.06803449243307114, 0.04546589404344559, -0.03570973873138428, -0.0300467386841774, 0.01629745401442051, -0.09611126780509949, -0.045055799186229706, 0.07873416692018509, 0.08795724809169769, 0.0876656025648117, -0.01580186001956463, 0.0661068931221962, -0.19666895270347595, -0.00024477194529026747, 0.0780254527926445, 0.0004894333542324603, 0.046442437916994095, 0.12945348024368286, 0.001446671667508781, 0.16151702404022217, -0.04952652379870415, 0.030571574345231056, 0.04284217953681946, -0.12433888018131256, -0.1063680574297905, -0.05001356452703476, 0.026814337819814682, 0.09296802431344986, 0.13359801471233368, -0.05516708642244339, 0.06780920177698135, -0.05290945619344711, 0.07557541877031326, 0.07526074349880219, -0.22193171083927155, -0.031322117894887924, 0.03417154401540756, 0.046768125146627426, 0.07804929465055466, -0.05102286487817764, -0.022585444152355194, 0.050815075635910034, 0.0657641664147377, 0.02784169465303421, -0.06265772134065628, -0.0602453388273716, -0.05040721595287323, -0.13133983314037323, -0.06301876902580261, 0.155289888381958, 0.025229454040527344, -0.046040259301662445, -0.08335872739553452, -0.06289627403020859, -0.07356122881174088, -0.006646214053034782, -0.02880813740193844, 0.024014130234718323, -0.003956058528274298, 0.06734979897737503, -0.01939045451581478, -0.11879429966211319, -0.05827654153108597, -0.06715180724859238, 0.12828338146209717, 0.045387547463178635, 0.015568452887237072, 0.02188733033835888, 0.08540653437376022, -0.10639754682779312, -0.08433791995048523, 0.0014155175304040313, 0.016285859048366547, -0.11529909819364548, -0.004134691786020994, -0.011244076304137707, -0.16556581854820251, -0.01937088556587696, 0.029526539146900177, -0.0583123154938221, 0.056988123804330826, 0.080225370824337, 0.04252919182181358, 0.0652695968747139, 0.11456427723169327, -0.12032033503055573, -0.11130046099424362, -0.030165862292051315, 0.004350534174591303, 0.0054908678866922855, 0.011774813756346703, -0.05641966685652733, -0.025864263996481895, 0.009579283185303211, 0.03102109581232071, 0.010105329565703869, 0.023846780881285667, -0.022936437278985977, -0.03472837060689926, 0.11788724362850189, -0.10851682722568512, -0.004902352578938007, 0.007230814080685377, -0.09356173127889633, -0.029847068712115288, 0.07742142677307129, -0.01460498757660389, -0.11717724800109863, 0.07251042872667313, -0.03712131083011627, -0.02406913973391056, -0.10583983361721039, -0.16462428867816925, -0.009033244103193283, 0.006043622735887766, -0.06105778366327286, -0.09958084672689438, -0.12567538022994995, -0.09630893915891647, 0.03164581209421158, -0.055000875145196915, -0.004080587532371283, -0.06433136761188507, -0.002413651440292597, -0.005505606532096863, -0.011354400776326656, 0.11154665052890778, -0.0407714918255806, 0.033696942031383514, 0.030994322150945663, 0.06638013571500778, 0.03149348124861717, 0.028962258249521255, -0.12372050434350967, 0.04432811960577965, -0.1401989609003067, 0.14231429994106293, -0.02558578923344612, 0.011976961977779865, -0.12694025039672852, -0.051625318825244904, -0.07782705873250961, 
0.059674106538295746, 0.055477265268564224, 0.1174570769071579, -0.1949254274368286, -0.007125634700059891, 0.19276070594787598, -0.08233059197664261, -0.08156237006187439, 0.11961911618709564, -0.02337818220257759, 0.03284311667084694, 0.0803348496556282, 0.10790574550628662, 0.06981978565454483, -0.0102184833958745, -0.049111902713775635, 0.006729878485202789, 0.026157867163419724, 0.07558315247297287, 0.08667478710412979, -0.07140396535396576, 0.03369395434856415, 0.0156256053596735, 0.010984472930431366, -0.00011406651901779696, -0.023343095555901527, -0.034001220017671585, 0.008074019104242325, -0.04842245578765869, -0.043165065348148346, 0.02896074391901493, 0.007984125055372715, -0.05617208778858185, -0.08642204105854034, -0.04335744306445122, 0.10818402469158173, -0.05239156633615494, 0.018978839740157127, -0.014317669905722141, -0.057877372950315475, -0.10781733691692352, 0.014413081109523773, -0.17077875137329102, -0.01490075420588255, 0.03551580011844635, -0.06633444875478745, 0.11309657990932465, 0.060394808650016785, 0.04867522045969963, 0.09002108871936798, -0.05671371892094612, -0.03357195854187012, 0.008753873407840729, -0.021119607612490654, -0.1085381731390953, -0.1193423792719841, -0.04658450558781624, -0.022227676585316658, -0.003936023451387882, -0.12293751537799835, -0.0022851137910038233, -0.06658778339624405, 0.08626805245876312, 0.006289405282586813, -0.02403232455253601, 0.04049484059214592, 0.06490638852119446, -0.02762874960899353, -0.03705217316746712, 0.022127224132418633, -0.00732137355953455, -0.049935661256313324, 0.0924905315041542, -0.17208394408226013, -0.1196914091706276, 0.08078798651695251, 0.006405999884009361, -0.12273456156253815, -0.006262361072003841, -0.016154460608959198, -0.06465421617031097, -0.05861280485987663, -0.0550772100687027, 0.24744845926761627, 0.03493718057870865, 0.14020861685276031, -0.10748723149299622, -0.031135719269514084, 0.005577487871050835, -0.02569890394806862, 0.008742551319301128, 0.1702716201543808, 0.07706763595342636, -0.14760497212409973, 0.08038465678691864, 0.0052667343989014626, -0.028403548523783684, 0.09630238264799118, 0.06128401681780815, -0.1028701439499855, -0.009888138622045517, 0.03605903312563896, -0.0012227714760228992, 0.05933978036046028, -0.09364877641201019, -0.0072687785141170025, 0.029973549768328667, 0.05109128728508949, 0.05831382796168327, -0.08681099116802216, 0.06529432535171509, 0.06739775836467743, -0.02477753907442093, 0.040957141667604446, -0.05465373769402504, -0.029967688024044037, 0.09544979780912399, 0.014860609546303749, -0.025071511045098305, -0.047573719173669815, -0.05218354985117912, -0.11381416767835617, 0.19799089431762695, -0.06130433827638626, -0.20802991092205048, -0.11497040092945099, 0.07320734113454819, -0.02710588090121746, 0.03477552905678749, 0.025675997138023376, -0.0340745784342289, -0.07174793630838394, -0.13025714457035065, 0.07993367314338684, -0.0800253227353096, -0.048271119594573975, -0.12411953508853912, 0.026185261085629463, 0.010758615098893642, -0.11950460076332092, 0.027139438316226006, -0.000036977337003918365, -0.024213673546910286, 0.0040994915179908276, -0.027808556333184242, 0.1110382154583931, 0.1404476761817932, -0.01197269931435585, -0.0343456007540226, -0.005023963283747435, 0.12970057129859924, -0.09700322896242142, 0.06165169179439545, 0.06239777058362961, 0.029727011919021606, 0.02644127979874611, 0.143060103058815, 0.022968918085098267, -0.06123940646648407, 0.047844965010881424, 0.05346119403839111, -0.02145233191549778, 
-0.22618833184242249, -0.1053004339337349, -0.06976038217544556, 0.021303927525877953, 0.119120292365551, 0.04008854925632477, -0.05123024061322212, 0.02447139099240303, -0.06342761218547821, 0.05374915525317192, -0.00980072095990181, 0.056059498339891434, 0.021992214024066925, -0.008940626867115498, 0.08158471435308456, -0.06007400527596474, -0.043426722288131714, 0.09609422087669373, 0.02731775864958763, 0.1916273981332779, -0.053786199539899826, 0.21173232793807983, 0.05544763803482056, 0.03486068174242973, 0.0181188452988863, 0.05398818105459213, -0.04174687713384628, 0.027563050389289856, -0.027635248377919197, -0.062036752700805664, -0.03496092930436134, 0.060418613255023956, 0.02079741843044758, 0.020173020660877228, -0.05175165832042694, -0.061334509402513504, 0.04698785021901131, 0.2090190351009369, 0.061133138835430145, -0.19619809091091156, -0.055394455790519714, 0.011442706920206547, -0.07308946549892426, -0.06146955117583275, 0.01792122796177864, 0.15036454796791077, -0.09283499419689178, 0.014111260883510113, 0.01916106790304184, 0.12177120894193649, -0.1324760466814041, -0.01477142982184887, 0.03354986384510994, 0.04169871658086777, -0.015769831836223602, 0.129235178232193, -0.24308748543262482, 0.14568805694580078, 0.014675960876047611, 0.054004915058612823, -0.038180381059646606, 0.01249450258910656, -0.036015745252370834, 0.001142477267421782, 0.11530300229787827, 0.013746080920100212, -0.011021412909030914, -0.12131757289171219, -0.10414019227027893, 0.004191966727375984, 0.06433892995119095, -0.07014037668704987, 0.10108586400747299, 0.05342373251914978, 0.013367985375225544, -0.017756996676325798, 0.01646493375301361, -0.05976848304271698, -0.15608221292495728, 0.007163424044847488, -0.025848090648651123, -0.023139923810958862, -0.014802793972194195, -0.023692451417446136, -0.05964938923716545, 0.18740884959697723, -0.10797495394945145, -0.07608040422201157, -0.07446732372045517, -0.00031733300420455635, 0.139358788728714, -0.07536902278661728, -0.00012018834968330339, -0.0007893745205365121, 0.05105803161859512, -0.02343224734067917, -0.027332885190844536, 0.08422693610191345, -0.08253699541091919, -0.10707278549671173, -0.0712147131562233, 0.11634018272161484, 0.07207246124744415, 0.05255250632762909, -0.017271092161536217, 0.033542387187480927, -0.019101912155747414, -0.11372692137956619, -0.02231142669916153, 0.03197953850030899, 0.11596124619245529, 0.06768196076154709, -0.04108858481049538, -0.032535210251808167, -0.0723964273929596, -0.057223621755838394, 0.08052664995193481, 0.16369061172008514, -0.03984112665057182, 0.027401616796851158, 0.19441667199134827, -0.10492782294750214, -0.1812063753604889, -0.06102088838815689, 0.07301876693964005, 0.0753462016582489, -0.014256277121603489, -0.16861513257026672, 0.050722964107990265, 0.09372264891862869, 0.004646241664886475, 0.06076329946517944, -0.3792950212955475, -0.13954174518585205, 0.07799148559570312, 0.033233631402254105, -0.05029284954071045, -0.1280982941389084, -0.05898820236325264, -0.05612843856215477, -0.024804193526506424, 0.09650080651044846, -0.040777869522571564, 0.09933862090110779, -0.004493188578635454, 0.022194955497980118, 0.04628365486860275, -0.040005579590797424, 0.12767921388149261, 0.03532876446843147, 0.04284589737653732, -0.06262767314910889, 0.05818494036793709, 0.006615546997636557, -0.01935405842959881, 0.15184494853019714, -0.050261810421943665, 0.057163041085004807, -0.14618238806724548, -0.05467499420046806, -0.06953411549329758, 0.019350336864590645, -0.040514811873435974, 
-0.06639671325683594, -0.05788861960172653, 0.04182792827486992, 0.048762138932943344, -0.0010033769067376852, -0.0015626676613464952, -0.05868053808808327, 0.006645781919360161, 0.17091991007328033, 0.12049101293087006, 0.01924058236181736, -0.11490587890148163, 0.026066508144140244, 0.0036124042235314846, 0.07991868257522583, -0.07779380679130554, 0.005056840367615223, 0.1356184035539627, 0.023993749171495438, 0.11785858124494553, -0.011685849167406559, -0.14380475878715515, -0.009943333454430103, 0.04871664568781853, -0.09361037611961365, -0.12399782985448837, -0.002654659328982234, 0.04552742838859558, -0.08442029356956482, -0.04070177674293518, 0.09441361576318741, -0.09202881157398224, -0.01878092624247074, 0.008859341032803059, 0.032702259719371796, -0.027134113013744354, 0.2000279426574707, 0.04105621203780174, 0.04669390991330147, -0.06354475021362305, 0.11315387487411499, 0.13909076154232025, -0.15082190930843353, 0.011793800629675388, 0.1915169656276703, -0.0853905901312828, -0.06668300181627274, 0.013952317647635937, 0.12317259609699249, -0.023215992376208305, -0.05785473808646202, -0.02338855154812336, -0.05358034744858742, 0.030074384063482285, 0.006855324376374483, 0.042708028107881546, 0.04289023205637932, -0.015064539387822151, -0.0144365094602108, -0.09814328700304031, 0.09454023838043213, 0.07513266801834106, 0.038107503205537796, -0.02341664396226406, 0.14407140016555786, 0.03168343007564545, -0.015304237604141235, -0.012074055150151253, 0.0038851553108543158, -0.06563710421323776, 0.01662823185324669, -0.0690077394247055, 0.018212756142020226, -0.05185152217745781, -0.01603824831545353, -0.018604885786771774, 0.005215149372816086, -0.016944700852036476, -0.0015661399811506271, -0.029907457530498505, -0.049940407276153564, -0.03783296421170235, 0.024648811668157578, -0.090840183198452, -0.03726135939359665, 0.017498968169093132, -0.025014353916049004, 0.04985887184739113, 0.0009277052595280111, 0.009221386164426804, -0.006002210546284914, -0.01831280067563057, 0.06644681841135025, 0.01637379825115204, 0.04590877890586853, -0.012325595133006573, -0.08102693408727646, 0.01077160332351923, 0.017319725826382637, -0.0036594674456864595, -0.013534961268305779, 0.02004656381905079, -0.15249554812908173, -0.0023246014025062323, -0.01208886131644249, -0.029476970434188843, -0.08115220814943314, 0.08027347922325134, 0.04039236903190613, 0.061768997460603714, 0.11168162524700165, -0.06992462277412415, 0.08283507823944092, -0.17027080059051514, -0.007944237440824509, 0.020591331645846367, 0.01052951067686081, -0.031217822805047035, -0.0026315082795917988, 0.05027995631098747, -0.06564830243587494, 0.13557924330234528, 0.02184494584798813, 0.05969204008579254, 0.022551026195287704, -0.06219679117202759, -0.0030939574353396893, 0.02618388459086418, 0.07898180931806564, -0.02445933409035206, -0.030101165175437927, -0.05461176484823227, 0.08645528554916382, 0.0010455860756337643, 0.060423027724027634, 0.05664000287652016, 0.11932866275310516, 0.12160205096006393, 0.04029843583703041, -0.009269773960113525, -0.10567215830087662, -0.05756956711411476, 0.049090426415205, -0.011552650481462479, 0.04238978773355484, -0.014563988894224167, 0.10846484452486038, 0.12350024282932281, -0.13728497922420502, 0.12227079272270203, -0.006519945338368416, -0.08185204863548279, -0.04054942727088928, -0.13543793559074402, -0.044476468116045, -0.008511137217283249, -0.04592056944966316, -0.11242158710956573, 0.01083811093121767, 0.08510904014110565, 0.04129726067185402, -0.027915911749005318, 
0.13617755472660065, -0.046179626137018204, -0.10826872289180756, 0.04494037106633186, 0.018258918076753616, 0.08606201410293579, 0.023723646998405457, 0.029815981164574623, 0.05776875466108322, 0.005079286172986031, 0.04840826243162155, 0.05375577509403229, -0.014131697826087475, 0.00549724604934454, 0.012474961578845978, -0.06102769076824188, -0.03782956302165985, 0.00855228491127491, 0.07891981303691864, 0.19350682199001312, 0.05547802522778511, -0.05889466032385826, -0.023740876466035843, 0.1952080875635147, -0.06418396532535553, -0.07895711809396744, -0.09862042963504791, 0.2145882248878479, 0.0332062803208828, 0.024067498743534088, -0.0008404820109717548, -0.10363586992025375, -0.006688396912068129, 0.15404807031154633, 0.1838037222623825, -0.06033385917544365, -0.026465822011232376, 0.021063458174467087, -0.005215423181653023, 0.029308108612895012, 0.042439039796590805, 0.027731532230973244, 0.26823341846466064, -0.0813727006316185, 0.09068411588668823, -0.040889520198106766, 0.007077225483953953, -0.006599548738449812, 0.16371585428714752, 0.0024018778931349516, 0.025477269664406776, -0.0704634040594101, 0.07115805149078369, -0.026705501601099968, -0.17474789917469025, 0.014739231206476688, -0.08729257434606552, -0.11232144385576248, 0.011722113005816936, -0.010887070558965206, 0.05811233073472977, 0.06702571362257004, 0.009169346652925014, 0.03107745200395584, 0.06821326911449432, 0.012403016909956932, -0.11566556990146637, -0.12015189975500107, 0.012245887890458107, -0.02598762884736061, 0.13414707779884338, 0.002939658472314477, 0.13229766488075256, 0.07933129370212555, 0.004158891271799803, -0.10549388080835342, 0.09226971864700317, 0.025068968534469604, 0.027830565348267555, 0.086028091609478, 0.10998424142599106, 0.00602043978869915, 0.07512010633945465, 0.04495429992675781, -0.08076189458370209, 0.022865787148475647, -0.0424153208732605, -0.011953816749155521, -0.13460944592952728, 0.07683321088552475, -0.03619087487459183, 0.14928357303142548, 0.18033267557621002, -0.022609470412135124, -0.02433782070875168, -0.042927086353302, 0.006621965207159519, -0.013868294656276703, 0.09608316421508789, -0.011823128908872604, -0.15528710186481476, 0.023499073460698128, -0.08138012140989304, 0.03503558412194252, -0.265462189912796, -0.024635156616568565, 0.01927085407078266, -0.05979347974061966, -0.004086458124220371, 0.07764880359172821, 0.04213990271091461, 0.05166274309158325, -0.0543597973883152, -0.07576799392700195, 0.00681445375084877, 0.10434630513191223, -0.11182009428739548, -0.12018983066082001 ]
null
null
transformers
# legal_t5_small_trans_en_de model

Model for translating legal text from English to German. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model is trained on three parallel corpora: JRC-Acquis, Europarl and DCEP.

## Model description

legal_t5_small_trans_en_de is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline t5 model down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters.

## Intended uses & limitations

The model can be used for translation of legal texts from English to German.

### How to use

Here is how to use this model to translate legal text from English to German in PyTorch:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline

pipeline = TranslationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_en_de"),
    tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path="SEBIS/legal_t5_small_trans_en_de", do_lower_case=False, skip_special_tokens=True),
    device=0,
)

en_text = "· the impact of electromagnetic fields on animals, especially birds in cities;"

pipeline([en_text], max_length=512)
```

## Training data

The legal_t5_small_trans_en_de model was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 5 million parallel texts.

## Training procedure

The model was trained on a single TPU Pod V3-8 for 250K steps in total, using a sequence length of 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.

### Preprocessing

A unigram model was trained on 88M lines of text from the parallel corpus (of all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.

### Pretraining

## Evaluation results

When the model is used on the translation test dataset, it achieves the following results:

Test results :

| Model | BLEU score |
|:-----:|:-----:|
| legal_t5_small_trans_en_de | 43.656|

### BibTeX entry and citation info

> Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
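The training procedure names AdaFactor with an inverse square root schedule. A hedged fine-tuning setup using the Adafactor implementation shipped with transformers could look like the sketch below; the hyperparameters are illustrative, not the original TPU configuration.

```python
# Sketch of the optimizer setup described above. relative_step=True enables
# Adafactor's built-in schedule, which decays roughly as 1/sqrt(step); all
# other values are assumptions, not the card's documented settings.
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor

model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/legal_t5_small_trans_en_de")
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,  # inverse-square-root style learning rate schedule
    warmup_init=True,
    lr=None,             # must be None when relative_step=True
)
```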
{"language": "English Deustch", "tags": ["translation English Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "\u00b7 the impact of electromagnetic fields on animals, especially birds in cities;"}]}
text2text-generation
SEBIS/legal_t5_small_trans_en_de
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_en\_de model ===================================== Model for translating legal text from English to Deutsch (German). It was first released in this repository. This model is trained on three parallel corpora: jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_en\_de is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from English to Deutsch. ### How to use Here is how to use this model to translate legal text from English to Deutsch in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_en\_de model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 164, 50, 29, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de model was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 5 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.12125194072723389, 0.07414855062961578, -0.0025375487748533487, 0.07186470925807953, 0.09537380188703537, 0.019682899117469788, 0.07542774826288223, 0.09439486265182495, -0.08243420720100403, 0.06784255802631378, 0.0665372982621193, 0.03828750550746918, 0.08924838900566101, 0.1140102744102478, 0.046013545244932175, -0.22709506750106812, 0.02397271804511547, -0.010665648616850376, -0.005849636159837246, 0.1346011757850647, 0.12005206942558289, -0.08972308784723282, 0.02153760753571987, -0.028290607035160065, -0.11071324348449707, -0.0034347602631896734, -0.06398408859968185, -0.07076727598905563, 0.07700590044260025, 0.0427059642970562, 0.11453702300786972, 0.01857961341738701, 0.08136797696352005, -0.17291763424873352, -0.0007905908860266209, 0.0867573693394661, 0.05381576344370842, 0.041266631335020065, 0.06651024520397186, -0.009219053201377392, 0.14857809245586395, -0.02001783438026905, 0.06469133496284485, 0.021302908658981323, -0.12146353721618652, -0.1373792439699173, -0.06307930499315262, 0.03671453520655632, 0.15210647881031036, 0.14924967288970947, -0.060592956840991974, 0.06283959001302719, -0.11691264063119888, 0.06220693141222, 0.061484068632125854, -0.26699861884117126, -0.06356742233037949, 0.024937372654676437, 0.043276261538267136, 0.07900264114141464, -0.04869196191430092, -0.048343636095523834, 0.038603030145168304, 0.033429376780986786, 0.009430203586816788, -0.020963959395885468, 0.010836527682840824, -0.008813566528260708, -0.1664687544107437, -0.09694875031709671, 0.1590990424156189, 0.0002328150294488296, -0.07567653059959412, -0.0954517051577568, -0.03623204305768013, -0.15557648241519928, 0.010445012710988522, -0.05845840275287628, 0.04243092238903046, -0.0009139608009718359, 0.030845709145069122, -0.018555890768766403, -0.11705298721790314, -0.11395762860774994, 0.026219062507152557, 0.08124584704637527, 0.09774061292409897, -0.012992678210139275, -0.004768454935401678, 0.16724233329296112, 0.020366424694657326, -0.09284363687038422, -0.02095525898039341, 0.0002565730537753552, -0.11332307755947113, -0.019988272339105606, -0.031837619841098785, -0.14047808945178986, -0.06438983976840973, 0.10053275525569916, -0.027181237936019897, 0.05950571224093437, 0.033794254064559937, 0.03838619217276573, 0.015672435984015465, 0.165967658162117, -0.08034471422433853, -0.06179042160511017, -0.057443760335445404, 0.062364738434553146, -0.05902861803770065, 0.022515960037708282, -0.01519012451171875, -0.0037341017741709948, 0.06728027760982513, 0.07573338598012924, -0.08228835463523865, 0.006517923902720213, -0.0466652549803257, -0.021752363070845604, 0.05122615396976471, -0.11913374066352844, -0.03992471098899841, 0.0013890062691643834, -0.10669384151697159, -0.03151397407054901, 0.08170469105243683, -0.01082884892821312, -0.13120222091674805, 0.05563846230506897, -0.029379533603787422, -0.022742409259080887, -0.13290515542030334, -0.10328493267297745, -0.03148214891552925, -0.06820476055145264, -0.03722710907459259, -0.0671510100364685, -0.17083804309368134, -0.10546430945396423, 0.06579691916704178, -0.05141815170645714, -0.049017004668712616, -0.09378443658351898, -0.014560808427631855, -0.0045299832709133625, -0.03538301959633827, 0.12379442155361176, -0.023689797148108482, 0.0953172892332077, 0.02197166346013546, 0.04598693177103996, 0.14666858315467834, 0.08125601708889008, -0.10197115689516068, 0.010715360753238201, -0.09723558276891708, 0.17990051209926605, -0.015317642129957676, 0.002581010339781642, -0.14906863868236542, -0.0779261365532875, -0.06209087744355202, 
0.07027831673622131, 0.10278311371803284, 0.12777897715568542, -0.15580470860004425, -0.012365414761006832, 0.20899781584739685, -0.08737670630216599, -0.03549477830529213, 0.10462487488985062, -0.048062048852443695, 0.14262861013412476, 0.09571844339370728, 0.16977070271968842, 0.053997334092855453, -0.07338067144155502, 0.0018193353898823261, -0.03873185068368912, -0.006445045582950115, 0.007747335359454155, 0.08008736371994019, -0.050032857805490494, -0.06578855216503143, -0.0067559159360826015, -0.08939559012651443, 0.03547855094075203, -0.06089670583605766, -0.060652315616607666, 0.023299627006053925, -0.05997816100716591, -0.040841519832611084, 0.05999607592821121, 0.050589654594659805, -0.05158597603440285, -0.12549737095832825, 0.015812508761882782, 0.09847772121429443, -0.06815662235021591, 0.02417372167110443, -0.05773823335766792, -0.05724111199378967, -0.07246728241443634, -0.00902694184333086, -0.1672273874282837, 0.040592215955257416, 0.044787488877773285, -0.0249875970184803, 0.0482211709022522, 0.05018778145313263, 0.036693792790174484, 0.05523262917995453, -0.00456761522218585, -0.06001880392432213, -0.04652339220046997, -0.03704683855175972, -0.1265687793493271, -0.10410607606172562, -0.025190556421875954, -0.026171740144491196, 0.08936647325754166, -0.18097037076950073, 0.035977281630039215, -0.10422378778457642, 0.04521917924284935, -0.013579708524048328, -0.04637124016880989, 0.03622185066342354, 0.04085575416684151, 0.024700356647372246, -0.06555585563182831, 0.044358327984809875, 0.027893496677279472, 0.03040310926735401, 0.08600141853094101, -0.10771001130342484, -0.1502552479505539, 0.08933873474597931, 0.055256422609090805, -0.14987434446811676, 0.021014193072915077, -0.04010523110628128, -0.06733877956867218, -0.054378725588321686, 0.0002207520738011226, 0.25444552302360535, 0.01190285012125969, 0.14743302762508392, -0.10374018549919128, -0.045578114688396454, -0.012529542669653893, -0.013892999850213528, 0.013519426807761192, 0.14406268298625946, 0.056470684707164764, -0.09092194586992264, 0.04890590161085129, 0.02137725055217743, -0.0238590557128191, 0.14727124571800232, 0.00007205882866401225, -0.12928490340709686, 0.01495883148163557, 0.06447762250900269, -0.03325643762946129, 0.08657153695821762, -0.1456788331270218, 0.0024980639573186636, 0.015260843560099602, 0.05476107820868492, 0.05793856084346771, -0.15612192451953888, 0.025522373616695404, 0.061970386654138565, -0.05312289297580719, 0.009399170987308025, -0.013536946848034859, -0.04859054833650589, 0.07273302972316742, 0.011544241569936275, -0.023216690868139267, -0.012166903354227543, -0.035898931324481964, -0.13365891575813293, 0.2054884433746338, -0.06919148564338684, -0.14526253938674927, -0.1014789491891861, 0.09504780173301697, 0.06830865144729614, -0.005511033348739147, 0.04827698692679405, -0.08818192034959793, -0.04934736713767052, -0.09284800291061401, 0.11353905498981476, -0.06015698239207268, -0.05318443104624748, -0.09114784747362137, -0.010880977846682072, -0.004683289211243391, -0.1372596174478531, 0.0335562489926815, -0.04190589115023613, -0.07927419990301132, 0.006401211488991976, -0.04852623865008354, 0.06495581567287445, 0.14962008595466614, -0.003943619318306446, 0.018362678587436676, -0.013778899796307087, 0.1908944845199585, -0.1307782679796219, 0.01183954905718565, 0.07979431748390198, 0.038212839514017105, 0.005291930865496397, 0.10427873581647873, -0.010305898264050484, -0.09306520968675613, 0.05107802525162697, 0.05384715646505356, -0.020922154188156128, -0.28717514872550964, 
-0.016499817371368408, -0.01973709464073181, -0.03721733018755913, 0.10871796309947968, 0.04146512597799301, 0.01669313758611679, 0.0406208373606205, -0.020604094490408897, 0.005681581795215607, 0.029407309368252754, 0.049323733896017075, -0.004335886333137751, 0.0019230913603678346, 0.07716701179742813, -0.04890253394842148, -0.019009964540600777, 0.044270895421504974, 0.012524623423814774, 0.2461244761943817, -0.05480685830116272, 0.11111190170049667, 0.08069292455911636, 0.09805639833211899, 0.005471091717481613, 0.0788695365190506, -0.029560744762420654, 0.019850702956318855, -0.008657609112560749, -0.022680580615997314, -0.06649017333984375, 0.03744957968592644, 0.012917275540530682, 0.011451048776507378, -0.10338021069765091, -0.015441390685737133, 0.011904173530638218, 0.326992392539978, 0.06942152231931686, -0.23507608473300934, -0.06434889882802963, 0.0035924562253057957, -0.07564038783311844, -0.09798252582550049, 0.06787306070327759, 0.07371360808610916, -0.12901121377944946, -0.017280099913477898, -0.023722536861896515, 0.09944680333137512, -0.10455172508955002, -0.04875977709889412, 0.05060611292719841, 0.05165398493409157, -0.008300738409161568, 0.08587293326854706, -0.30947473645210266, 0.1884640008211136, -0.01599952019751072, 0.1316969394683838, -0.007502648513764143, 0.026448996737599373, -0.049226805567741394, 0.00009030669752974063, 0.15339837968349457, -0.0051262266933918, 0.026902101933956146, -0.06405642628669739, -0.10083238780498505, 0.02192188985645771, 0.033253591507673264, -0.06335724890232086, 0.08677126467227936, 0.027662815526127815, 0.0361926406621933, -0.008031655102968216, -0.10216783732175827, -0.14208438992500305, -0.1042100116610527, -0.012785976752638817, -0.08907248079776764, 0.0679582729935646, -0.04069162532687187, -0.055202484130859375, -0.011567329056560993, 0.14096251130104065, -0.11781971901655197, -0.09986113011837006, -0.09124147891998291, 0.028081752359867096, 0.09496196359395981, -0.04605196788907051, -0.002616650890558958, 0.01836382783949375, 0.003895968198776245, 0.0014222004683688283, 0.012487664818763733, 0.09514236450195312, -0.06002536043524742, -0.12255994975566864, -0.0436842255294323, 0.12552477419376373, 0.12748603522777557, 0.05856503173708916, -0.031593769788742065, 0.008810409344732761, -0.016240529716014862, -0.07187280058860779, 0.007806906942278147, 0.011275704950094223, 0.05080753564834595, 0.018761662766337395, -0.06210411339998245, -0.011394591070711613, -0.09887713193893433, -0.05554845556616783, 0.10068241506814957, 0.14330077171325684, -0.0442529171705246, 0.06216398999094963, 0.17463338375091553, -0.10985822230577469, -0.161414235830307, 0.016270380467176437, 0.09827706217765808, 0.08229032903909683, -0.07286292314529419, -0.2242274284362793, 0.02542022615671158, 0.08803859353065491, 0.01156316976994276, -0.011058260686695576, -0.4108699858188629, -0.13512001931667328, 0.1181369200348854, 0.08967085182666779, -0.03625653684139252, -0.08769570291042328, -0.015113944187760353, 0.03889589384198189, -0.040812160819768906, 0.0746065229177475, -0.017149021849036217, 0.08958882838487625, 0.021534940227866173, -0.03727155178785324, 0.03906748443841934, -0.05463995784521103, 0.11415267735719681, 0.07096163183450699, 0.05192204937338829, -0.04528064653277397, 0.03845295310020447, -0.0058067720383405685, -0.02107076719403267, 0.16081814467906952, 0.029402518644928932, 0.03777604550123215, -0.203215554356575, -0.06664783507585526, -0.08072297275066376, -0.008719178847968578, -0.06874434649944305, -0.05524696409702301, 
-0.04461144283413887, 0.08554033935070038, 0.04354189708828926, -0.009875481948256493, -0.022837134078145027, -0.06899132579565048, -0.022940125316381454, 0.07323101162910461, 0.10279454290866852, 0.07294165343046188, -0.08425571769475937, 0.018701892346143723, 0.03615632653236389, 0.09917668253183365, -0.15683266520500183, -0.023902224376797676, 0.12547746300697327, -0.02140493504703045, 0.14107732474803925, -0.008485659956932068, -0.1485951840877533, 0.010944732464849949, 0.03196188807487488, -0.08410822600126266, -0.12825201451778412, -0.007757087703794241, -0.07077044248580933, -0.04178496077656746, -0.04808206856250763, 0.06882651150226593, -0.11558777093887329, -0.023066559806466103, -0.01616571471095085, 0.038445763289928436, -0.07234583050012589, 0.22444479167461395, 0.0413641519844532, 0.05730600282549858, -0.06423468887805939, 0.1340464949607849, 0.10822464525699615, -0.12540730834007263, 0.026826702058315277, 0.1712019145488739, -0.09468435496091843, -0.05442637950181961, 0.0054247151128947735, 0.13535594940185547, -0.01179323811084032, -0.07295430451631546, -0.05418577790260315, -0.05141955986618996, 0.06650703400373459, -0.0062976437620818615, 0.04036618024110794, 0.025833014398813248, -0.03566429391503334, 0.0024277924094349146, -0.1253920942544937, 0.06605149805545807, 0.10298985242843628, 0.0028702779673039913, -0.023150809109210968, 0.18045100569725037, 0.05624128133058548, 0.0527382455766201, -0.005469157826155424, -0.03738394379615784, -0.044642142951488495, 0.07146110385656357, -0.02751118876039982, -0.027640894055366516, -0.0524209588766098, -0.010839368216693401, -0.023303983733057976, -0.0071143656969070435, 0.0026149037294089794, 0.0190670657902956, -0.0644562840461731, -0.0337398499250412, -0.03535072132945061, 0.04397229105234146, -0.07102301716804504, -0.010501820594072342, -0.005447368137538433, -0.06285666674375534, 0.07691194862127304, 0.02369515597820282, -0.004352132324129343, 0.010003985837101936, 0.0028203155379742384, 0.07382358610630035, -0.026340410113334656, 0.0018611056730151176, -0.01698886789381504, -0.08985158056020737, 0.036178700625896454, -0.009473075158894062, -0.013934101909399033, -0.013601583428680897, 0.04769190400838852, -0.13664603233337402, 0.0655955970287323, -0.021153707057237625, -0.005151758436113596, -0.07579760998487473, 0.106020987033844, 0.02299116738140583, 0.08749503642320633, 0.11049928516149521, -0.07083568722009659, 0.06564450263977051, -0.1437646597623825, -0.04047097638249397, 0.024303479120135307, 0.019197942689061165, -0.031964000314474106, -0.062019359320402145, 0.059979792684316635, -0.048358622938394547, 0.07385332882404327, 0.06170963495969772, 0.03622188791632652, 0.035559698939323425, -0.1304134577512741, -0.0042216842994093895, 0.04662638157606125, 0.06416117399930954, -0.0024906194303184748, 0.010861351154744625, -0.010613472200930119, 0.04898231849074364, -0.014258070848882198, 0.08072437345981598, 0.13242992758750916, 0.22152449190616608, 0.08358902484178543, 0.10156376659870148, -0.05321340262889862, -0.11052891612052917, -0.10439970344305038, 0.10158775746822357, -0.005706690717488527, 0.02196621522307396, -0.02126672863960266, 0.1436997652053833, 0.0959027111530304, -0.15440374612808228, 0.07267024368047714, 0.003214014694094658, -0.10534742474555969, -0.08758494257926941, -0.05342918634414673, -0.03028392419219017, -0.06605537235736847, -0.007875096052885056, -0.08887788653373718, 0.024771587923169136, 0.06462643295526505, 0.07565215975046158, -0.039238009601831436, 0.14366580545902252, 
0.017291052266955376, -0.06850920617580414, 0.09156478941440582, -0.013414448127150536, 0.09765612334012985, -0.08519021421670914, 0.011319526471197605, 0.018850281834602356, -0.009083290584385395, 0.06064353138208389, 0.0099516985937953, -0.05126844346523285, 0.03511051833629608, 0.04181932285428047, -0.04473230987787247, -0.002167885424569249, 0.04842928424477577, 0.11186039447784424, 0.11137177050113678, 0.0761241763830185, -0.04379470273852348, -0.02298048324882984, 0.19664467871189117, -0.034117259085178375, -0.10358455032110214, -0.16843244433403015, 0.17418810725212097, 0.08277492970228195, 0.020226914435625076, 0.03698243573307991, -0.09956523030996323, -0.00691305473446846, 0.23079392313957214, 0.13859261572360992, -0.049155667424201965, -0.04796844348311424, 0.03433215618133545, -0.007408432196825743, 0.010023849084973335, 0.10963717103004456, 0.05618549883365631, 0.19740521907806396, -0.10559879243373871, 0.03790116682648659, -0.077417753636837, -0.06102805957198143, -0.011408488266170025, 0.15523876249790192, 0.01096414029598236, -0.010242339223623276, -0.05388981103897095, 0.10753699392080307, -0.003982448019087315, -0.18960298597812653, 0.0552249550819397, -0.058298688381910324, -0.1292659193277359, -0.023117545992136, -0.005499482620507479, 0.0026697448920458555, 0.0466146320104599, -0.002114670816808939, 0.0007530862931162119, 0.13975019752979279, 0.02915419265627861, -0.05866249278187752, -0.14110614359378815, 0.07469525188207626, -0.0009826121386140585, 0.1594138741493225, -0.002368534915149212, 0.0849127396941185, 0.07158945500850677, 0.028676370158791542, -0.1074017733335495, 0.08629170060157776, 0.045397575944662094, 0.043066784739494324, 0.048919811844825745, 0.11023446172475815, -0.02393849939107895, 0.07412994652986526, 0.027489647269248962, -0.10651189088821411, 0.055745117366313934, -0.13107265532016754, -0.053771011531353, -0.1665629744529724, 0.03835649415850639, -0.03925023972988129, 0.12536388635635376, 0.21375706791877747, -0.0240595955401659, 0.012366033159196377, -0.06682047992944717, 0.0352424792945385, -0.024797948077321053, 0.1503438800573349, 0.00945253949612379, -0.17851784825325012, 0.012873074971139431, -0.06412115693092346, 0.030178910121321678, -0.2228313684463501, -0.022704245522618294, 0.013511069118976593, -0.08271239697933197, -0.02663968689739704, 0.12464886903762817, 0.04346891865134239, 0.06603795289993286, -0.03383010998368263, -0.01893686130642891, -0.018654337152838707, 0.1457281857728958, -0.1293296366930008, -0.09785225987434387 ]
null
null
transformers
# legal_t5_small_trans_en_de_small_finetuned model Model for translating legal text from English to Deutsch (German). It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all the translation data using an unsupervised task. Then the model was trained on three parallel corpora: jrc-acquis, europarl and dcep. ## Model description legal_t5_small_trans_en_de_small_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_en_de_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from English to Deutsch. ### How to use Here is how to use this model to translate legal text from English to Deutsch in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_en_de_small_finetuned"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_en_de", do_lower_case=False, skip_special_tokens=True), device=0 ) en_text = "The reference framework for the free movement of workers is laid down in Council Regulation (EEC) No 1612/68 on freedom of movement for workers within the Community and has been revised several times." pipeline([en_text], max_length=512) ``` ## Training data The legal_t5_small_trans_en_de_small_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results: | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_en_de_small_finetuned | 43.636| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
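Not from the original card: a sketch of the kind of T5-style masked language modelling objective the card's pretraining section describes. The 15% masking ratio and the sentinel-token span scheme are assumptions about the recipe, not the authors' documented settings.

```python
import random

def mask_spans(tokens, mask_ratio=0.15, seed=0):
    """Corrupt a token sequence T5-style: replace random spans with sentinel
    tokens and build the target that restores them. Ratio and seed are assumed."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_ratio))
    positions = set(rng.sample(range(len(tokens)), n_mask))
    inputs, targets, sentinel, prev_masked = [], [], 0, False
    for i, tok in enumerate(tokens):
        if i in positions:
            if not prev_masked:  # open a new span with a fresh sentinel token
                inputs.append(f"<extra_id_{sentinel}>")
                targets.append(f"<extra_id_{sentinel}>")
                sentinel += 1
            targets.append(tok)
            prev_masked = True
        else:
            inputs.append(tok)
            prev_masked = False
    return " ".join(inputs), " ".join(targets)

src, tgt = mask_spans("The reference framework for the free movement of workers".split())
print(src)  # e.g. "The reference <extra_id_0> for the free movement of workers"
print(tgt)  # e.g. "<extra_id_0> framework"
```

The `<extra_id_N>` strings are the sentinel tokens T5 tokenizers reserve for exactly this span-corruption objective, so the sketch stays compatible with the `t5-small` vocabulary the cards build on.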
{"language": "English Deustch", "tags": ["translation English Deustch model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "The reference framework for the free movement of workers is laid down in Council Regulation (EEC) No 1612/68 on freedom of movement for workers within the Community and has been revised several times."}]}
text2text-generation
SEBIS/legal_t5_small_trans_en_de_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Deustch model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Deustch" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_en\_de\_small\_finetuned model ======================================================= Model for translating legal text from English to Deutsch (German). It was first released in this repository. This model was first pretrained on all the translation data using an unsupervised task. Then the model was trained on three parallel corpora: jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_en\_de\_small\_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_en\_de\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from English to Deutsch. ### How to use Here is how to use this model to translate legal text from English to Deutsch in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_en\_de\_small\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 59, 210, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Deustch model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Deutsch in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_de\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.06772402673959732, 0.08494381606578827, -0.0034570808056741953, 0.08150109648704529, 0.05326597020030022, 0.017290625721216202, 0.03470218926668167, 0.09426850825548172, -0.03381563350558281, 0.08083631098270416, 0.04357907548546791, -0.021245228126645088, 0.06925120204687119, 0.032661911100149155, 0.0662815123796463, -0.21743330359458923, 0.0034683081321418285, -0.03410954773426056, -0.007405790500342846, 0.09306522458791733, 0.09583073854446411, -0.06478223204612732, 0.04553687199950218, -0.03937344253063202, -0.021857725456357002, 0.027787918224930763, -0.10078725218772888, -0.044682908803224564, 0.08406195044517517, 0.0903058871626854, 0.08651556819677353, -0.0156568493694067, 0.06734666228294373, -0.19420477747917175, -0.0010435678996145725, 0.08307494223117828, 0.0015837650280445814, 0.03894558176398277, 0.1288929283618927, 0.0021079648286104202, 0.1558661013841629, -0.054170336574316025, 0.035577066242694855, 0.03978473320603371, -0.1271246373653412, -0.12211033701896667, -0.050092656165361404, 0.019981395453214645, 0.09075197577476501, 0.13554298877716064, -0.057028934359550476, 0.05800367519259453, -0.04865264147520065, 0.07223072648048401, 0.06520851701498032, -0.20841866731643677, -0.029238225892186165, 0.029957672581076622, 0.05075155571103096, 0.07507862150669098, -0.04563986137509346, -0.026358608156442642, 0.0538991317152977, 0.07382262498140335, 0.025278957560658455, -0.06082598492503166, -0.0455428846180439, -0.05166057497262955, -0.1281088888645172, -0.05731122940778732, 0.1492665559053421, 0.023390954360365868, -0.050215382128953934, -0.08382686227560043, -0.06186191365122795, -0.06947207450866699, -0.00296343513764441, -0.04507722333073616, 0.027917949482798576, -0.0005031502223573625, 0.07476180046796799, -0.01944650337100029, -0.11287210136651993, -0.057993896305561066, -0.07309863716363907, 0.12178178131580353, 0.0502997487783432, 0.010568905621767044, 0.013974941335618496, 0.08705735951662064, -0.11024211347103119, -0.08665110170841217, -0.0005768570117652416, 0.012625288218259811, -0.10667411237955093, -0.002552573336288333, -0.013759476132690907, -0.16961000859737396, -0.01416208129376173, 0.04305650293827057, -0.06591439247131348, 0.05544215068221092, 0.07346347719430923, 0.035548098385334015, 0.06979131698608398, 0.114080049097538, -0.12371321767568588, -0.12285760790109634, -0.02468956634402275, 0.006056055426597595, 0.009985387325286865, 0.017704498022794724, -0.05468820780515671, -0.02871442586183548, 0.012514876201748848, 0.031207293272018433, 0.013913298957049847, 0.020205095410346985, -0.024694908410310745, -0.034802645444869995, 0.12562045454978943, -0.10937263816595078, -0.0060064829885959625, 0.0037650878075510263, -0.0952516496181488, -0.02531537227332592, 0.07406821846961975, -0.013419552706182003, -0.11589659005403519, 0.07305731624364853, -0.03215811774134636, -0.03425874188542366, -0.10669474303722382, -0.1716454178094864, -0.00967723410576582, 0.005299644079059362, -0.052044376730918884, -0.10453364253044128, -0.13792793452739716, -0.09747099131345749, 0.030177626758813858, -0.059549927711486816, -0.0023627430200576782, -0.06619526445865631, -0.00042147882049903274, -0.007142271380871534, -0.010854735970497131, 0.10486601293087006, -0.040671512484550476, 0.03218016400933266, 0.03867824748158455, 0.06869897991418839, 0.02530229277908802, 0.03631719574332237, -0.12148582935333252, 0.04399384930729866, -0.14263629913330078, 0.1527729332447052, -0.023881686851382256, 0.012067627161741257, -0.12807847559452057, -0.05214203894138336, 
-0.08248453587293625, 0.06549717485904694, 0.06393098086118698, 0.11830410361289978, -0.2054867446422577, -0.0068054706789553165, 0.18878933787345886, -0.08421000093221664, -0.07666901499032974, 0.12118450552225113, -0.02761649712920189, 0.03495994955301285, 0.08779622614383698, 0.11137548089027405, 0.07107305526733398, -0.00835063774138689, -0.04637451469898224, 0.00215477729216218, 0.020468274131417274, 0.08890648931264877, 0.08240761607885361, -0.07231160253286362, 0.03973723202943802, 0.015199880115687847, 0.014691701158881187, 0.0013134736800566316, -0.017310548573732376, -0.02909301407635212, 0.014693399891257286, -0.04842042177915573, -0.04251820966601372, 0.02791590802371502, 0.004416890908032656, -0.058542992919683456, -0.08801835775375366, -0.026137184351682663, 0.10630706697702408, -0.05138734355568886, 0.015814105048775673, -0.009442482143640518, -0.06595868617296219, -0.10339013487100601, 0.017085539177060127, -0.16525758802890778, -0.01737452857196331, 0.03303348273038864, -0.08127737045288086, 0.11310891807079315, 0.06397038698196411, 0.04890153557062149, 0.08922038972377777, -0.05985607951879501, -0.03517759591341019, -0.0013137363130226731, -0.016216320917010307, -0.10172832012176514, -0.1184428408741951, -0.03760857135057449, -0.026303941383957863, -0.018436167389154434, -0.11689292639493942, -0.000990909757092595, -0.0706944465637207, 0.08750983327627182, 0.007233540527522564, -0.019939031451940536, 0.035048872232437134, 0.06703837215900421, -0.029510455206036568, -0.04268144443631172, 0.01575002260506153, -0.011586537584662437, -0.04554309323430061, 0.08816399425268173, -0.16904492676258087, -0.11676354706287384, 0.08369934558868408, 0.00882753450423479, -0.12758448719978333, 0.007638431154191494, -0.013331438414752483, -0.06882946193218231, -0.0566379688680172, -0.06625071167945862, 0.2519441246986389, 0.03196287527680397, 0.13683846592903137, -0.10487416386604309, -0.030452487990260124, 0.0023413782473653555, -0.017535006627440453, 0.0031817767303436995, 0.16722315549850464, 0.07534168660640717, -0.14744319021701813, 0.08420884609222412, 0.009274055249989033, -0.026531444862484932, 0.09039928764104843, 0.06246571242809296, -0.10231773555278778, -0.007534203585237265, 0.031779658049345016, -0.0030152511317282915, 0.057209812104701996, -0.09370628744363785, -0.004104026593267918, 0.024608757346868515, 0.0581093430519104, 0.061034396290779114, -0.08106065541505814, 0.06537435948848724, 0.06930157542228699, -0.024744544178247452, 0.03661645948886871, -0.052004460245370865, -0.031204819679260254, 0.09444724023342133, 0.020486734807491302, -0.02628389745950699, -0.04249788820743561, -0.048685792833566666, -0.10843349993228912, 0.19615809619426727, -0.06662189215421677, -0.21387962996959686, -0.11034532636404037, 0.07677581161260605, -0.03125329315662384, 0.03794288635253906, 0.032588813453912735, -0.037808459252119064, -0.07338308542966843, -0.12436094135046005, 0.09781309217214584, -0.0845232605934143, -0.049645744264125824, -0.12466633319854736, 0.02632165141403675, 0.013767208904027939, -0.12639857828617096, 0.02596798725426197, 0.003405056893825531, -0.015735456719994545, 0.0016329333884641528, -0.02494528889656067, 0.10768784582614899, 0.12895068526268005, -0.010036777704954147, -0.0395638570189476, -0.0059600635431706905, 0.13650721311569214, -0.09924053400754929, 0.06640099734067917, 0.06647966057062149, 0.03244595602154732, 0.02666197158396244, 0.1508985012769699, 0.021760450676083565, -0.06970159709453583, 0.051077716052532196, 0.06027946248650551, 
-0.018658922985196114, -0.2303883284330368, -0.09814921021461487, -0.06659387797117233, 0.02146018110215664, 0.11663823574781418, 0.040903959423303604, -0.04729464277625084, 0.018842145800590515, -0.061137329787015915, 0.06754977256059647, -0.004820635076612234, 0.05463039129972458, 0.04163135588169098, -0.012046221643686295, 0.08288640528917313, -0.06006300821900368, -0.05248548090457916, 0.09896937012672424, 0.03184972703456879, 0.1873389184474945, -0.052798572927713394, 0.19527305662631989, 0.05478052794933319, 0.01781616359949112, 0.010182737372815609, 0.05317249149084091, -0.04000640660524368, 0.029358597472310066, -0.03329906985163689, -0.05874459445476532, -0.02652030438184738, 0.06348560750484467, 0.015667034313082695, 0.015482453629374504, -0.04703011363744736, -0.05951717868447304, 0.04539613053202629, 0.19959837198257446, 0.06479895114898682, -0.20100179314613342, -0.05536108836531639, 0.014281177893280983, -0.07294390350580215, -0.06753072887659073, 0.021053621545433998, 0.13508282601833344, -0.08582622557878494, 0.021907314658164978, 0.024399206042289734, 0.1201685294508934, -0.13381263613700867, -0.01080430019646883, 0.03694552928209305, 0.04928133264183998, -0.014786630868911743, 0.12398507446050644, -0.25625309348106384, 0.13378261029720306, 0.01227391418069601, 0.05722760036587715, -0.03258886560797691, 0.01729031465947628, -0.03487742319703102, -0.0002084914012812078, 0.11220147460699081, 0.012309128418564796, -0.007085450459271669, -0.11185725778341293, -0.09777598083019257, 0.0038575397338718176, 0.0588868111371994, -0.06646233052015305, 0.09474414587020874, 0.05610320344567299, 0.018041377887129784, -0.01504905428737402, 0.018693620339035988, -0.07062825560569763, -0.1560954749584198, 0.005104269366711378, -0.03497091308236122, -0.011605723761022091, -0.014094047248363495, -0.023167621344327927, -0.06392467767000198, 0.18284223973751068, -0.1052718237042427, -0.08280631899833679, -0.0728716254234314, 0.007579082623124123, 0.13790450990200043, -0.0738910511136055, 0.0011422873940318823, 0.003153383731842041, 0.04359776899218559, -0.02400815486907959, -0.03228367865085602, 0.0872538834810257, -0.08046755194664001, -0.10154297202825546, -0.07487144321203232, 0.10830220580101013, 0.07752770930528641, 0.0471230149269104, -0.01723838783800602, 0.03144482150673866, -0.017831599339842796, -0.11083968728780746, -0.01799921691417694, 0.03506536781787872, 0.11137600988149643, 0.061063334345817566, -0.03755122423171997, -0.03541329503059387, -0.07430467009544373, -0.059306953102350235, 0.07204355299472809, 0.1593768149614334, -0.03788161650300026, 0.024285636842250824, 0.19379226863384247, -0.10472743213176727, -0.1843457669019699, -0.057759515941143036, 0.0742243155837059, 0.07623575627803802, -0.01114159356802702, -0.1750980168581009, 0.04879637807607651, 0.09790366142988205, 0.007947145029902458, 0.06685192137956619, -0.3759394586086273, -0.14152556657791138, 0.08457894623279572, 0.02885264717042446, -0.05067313835024834, -0.12803228199481964, -0.05556230992078781, -0.06300847232341766, -0.021771539002656937, 0.08741235733032227, -0.03739368915557861, 0.09184923022985458, -0.004131501540541649, 0.026040691882371902, 0.044233258813619614, -0.03490196913480759, 0.13131079077720642, 0.03651181608438492, 0.04740064963698387, -0.06475415080785751, 0.06336525827646255, 0.005503272637724876, -0.018846780061721802, 0.1551608443260193, -0.043263327330350876, 0.056639526039361954, -0.13799861073493958, -0.059572651982307434, -0.06456359475851059, 0.02210371568799019, 
-0.04025113210082054, -0.07327330112457275, -0.05729003623127937, 0.04192385450005531, 0.044832743704319, -0.005388414487242699, -0.011361701413989067, -0.054206714034080505, -0.003559018485248089, 0.15689398348331451, 0.11733779311180115, 0.03005218878388405, -0.11305878311395645, 0.0306780356913805, 0.001367133460007608, 0.08132486045360565, -0.07752043008804321, 0.008694914169609547, 0.13783451914787292, 0.01641610451042652, 0.12089720368385315, -0.010978920385241508, -0.14771778881549835, -0.006299914792180061, 0.047757405787706375, -0.08597071468830109, -0.13237977027893066, -0.0027742411475628614, 0.030116742476820946, -0.07760457694530487, -0.03485657274723053, 0.09771629422903061, -0.09265254437923431, -0.020223215222358704, 0.009148761630058289, 0.03638968989253044, -0.030636630952358246, 0.20135287940502167, 0.03754451125860214, 0.04024289548397064, -0.06092212721705437, 0.11801707744598389, 0.13812360167503357, -0.16077609360218048, 0.019662020727992058, 0.18907830119132996, -0.08155357837677002, -0.06749749928712845, 0.0032994721550494432, 0.1267002522945404, -0.024753769859671593, -0.060865674167871475, -0.02385525591671467, -0.05768432468175888, 0.033303890377283096, -0.007241027429699898, 0.0386497862637043, 0.044829048216342926, -0.016649022698402405, -0.009867439977824688, -0.09686248749494553, 0.09407464414834976, 0.0789259597659111, 0.042071979492902756, -0.02619621716439724, 0.12954099476337433, 0.030752109363675117, -0.0192127525806427, -0.011482920497655869, 0.008585871197283268, -0.06556136906147003, 0.013612181879580021, -0.08964569121599197, 0.0155076514929533, -0.048974476754665375, -0.011396965943276882, -0.01792510412633419, 0.005093063693493605, -0.013670333661139011, 0.00011674784036586061, -0.024656783789396286, -0.055278707295656204, -0.034368064254522324, 0.02483622170984745, -0.09378131479024887, -0.04209524020552635, 0.013727178797125816, -0.02500053122639656, 0.052658699452877045, -0.0013387728249654174, 0.012362348847091198, -0.011491655372083187, -0.004663591273128986, 0.0695740282535553, 0.018417559564113617, 0.047059524804353714, -0.012277861125767231, -0.08774933964014053, 0.008527956902980804, 0.01760551519691944, -0.010669706389307976, -0.01812109351158142, 0.01712656207382679, -0.15323667228221893, 0.0030739298090338707, -0.0129985511302948, -0.03052571415901184, -0.07949268817901611, 0.07597704231739044, 0.04261573404073715, 0.06489335745573044, 0.11114191263914108, -0.07256203889846802, 0.08262116461992264, -0.16696441173553467, -0.009443219751119614, 0.019933054223656654, 0.013925069011747837, -0.023170173168182373, -0.001441856031306088, 0.051890041679143906, -0.06488532572984695, 0.14131200313568115, 0.023603813722729683, 0.0609155036509037, 0.024227187037467957, -0.07381222397089005, -0.007945788092911243, 0.029285883530974388, 0.0776248574256897, -0.029957979917526245, -0.029840532690286636, -0.07365037500858307, 0.08637336641550064, 0.004891264718025923, 0.06771058589220047, 0.049939367920160294, 0.13146309554576874, 0.11804448068141937, 0.037192344665527344, -0.008176849223673344, -0.1061742901802063, -0.06829160451889038, 0.04620445892214775, -0.00009721193055156618, 0.037103187292814255, -0.009648059494793415, 0.1008380725979805, 0.12035491317510605, -0.13397401571273804, 0.12219281494617462, -0.010077476501464844, -0.08004215359687805, -0.039729710668325424, -0.12721548974514008, -0.04260028898715973, -0.009999989531934261, -0.04782997444272041, -0.11115539819002151, 0.007781007327139378, 0.0715164989233017, 0.04275686666369438, 
-0.0333595871925354, 0.13963264226913452, -0.04484331235289574, -0.1122858002781868, 0.04157828539609909, 0.020455285906791687, 0.08229047060012817, 0.035806238651275635, 0.03306413069367409, 0.05472737178206444, 0.013709034770727158, 0.0433729887008667, 0.05214676633477211, -0.020208874717354774, 0.004018977750092745, 0.010622882284224033, -0.053138479590415955, -0.039012838155031204, 0.01447715051472187, 0.06487345695495605, 0.20166391134262085, 0.05944744497537613, -0.06256185472011566, -0.018968209624290466, 0.18361124396324158, -0.061360735446214676, -0.08626127988100052, -0.09797970205545425, 0.2214067578315735, 0.042146410793066025, 0.032521579414606094, -0.0022542141377925873, -0.10965250432491302, -0.00954488292336464, 0.14339879155158997, 0.19434982538223267, -0.057612042874097824, -0.030087865889072418, 0.022104952484369278, -0.004569319076836109, 0.02852250635623932, 0.039325468242168427, 0.034360941499471664, 0.2680080533027649, -0.07790224999189377, 0.09320709109306335, -0.04206758737564087, 0.011821591295301914, -0.005533261224627495, 0.16588237881660461, 0.005708580370992422, 0.025442294776439667, -0.068169504404068, 0.07543905824422836, -0.022481687366962433, -0.18686537444591522, 0.01481979712843895, -0.08551774173974991, -0.11163894087076187, 0.011572083458304405, -0.01892182044684887, 0.054788738489151, 0.06559108942747116, 0.012877029366791248, 0.034792229533195496, 0.07771768420934677, 0.009771670214831829, -0.11455179005861282, -0.1109047457575798, 0.009486369788646698, -0.011082087643444538, 0.12433668226003647, 0.0020505490247160196, 0.1304895430803299, 0.08167973905801773, 0.010778381489217281, -0.10733828693628311, 0.09456731379032135, 0.02273825742304325, 0.04056817665696144, 0.08419199287891388, 0.09955134987831116, 0.005919551011174917, 0.06533867865800858, 0.04662711173295975, -0.07884914427995682, 0.018481170758605003, -0.03519001603126526, -0.02137032523751259, -0.13727492094039917, 0.08532385528087616, -0.038328126072883606, 0.1456030160188675, 0.17782746255397797, -0.019544221460819244, -0.02056073397397995, -0.04350073263049126, 0.0076146963983774185, -0.008129590190947056, 0.09839789569377899, -0.011421434581279755, -0.15067435801029205, 0.024634167551994324, -0.07663235813379288, 0.03635111078619957, -0.2682981789112091, -0.02545768953859806, 0.02151472121477127, -0.05656035244464874, -0.0015369114698842168, 0.07695642113685608, 0.03766041249036789, 0.05197431892156601, -0.05681965500116348, -0.07534702122211456, -0.0010966201080009341, 0.10507838428020477, -0.10770966857671738, -0.12060844898223877 ]
null
null
transformers
# legal_t5_small_trans_en_es_small_finetuned model Model for translating legal text from English to Spanish. It was first released in [this repository](https://github.com/agemagician/LegalTrans). This model was first pretrained on all the translation data using an unsupervised task. Then the model was trained on three parallel corpora: jrc-acquis, europarl and dcep. ## Model description legal_t5_small_trans_en_es_small_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal_t5_small_trans_en_es_small_finetuned is based on the `t5-small` model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using `dmodel = 512`, `dff = 2,048`, 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. ## Intended uses & limitations The model could be used for translation of legal texts from English to Spanish. ### How to use Here is how to use this model to translate legal text from English to Spanish in PyTorch: ```python from transformers import AutoTokenizer, AutoModelWithLMHead, TranslationPipeline pipeline = TranslationPipeline( model=AutoModelWithLMHead.from_pretrained("SEBIS/legal_t5_small_trans_en_es_small_finetuned"), tokenizer=AutoTokenizer.from_pretrained(pretrained_model_name_or_path = "SEBIS/legal_t5_small_trans_en_es", do_lower_case=False, skip_special_tokens=True), device=0 ) en_text = "Instructs its President to forward this resolution to the Council and Commission and the Government and Parliament of Uzbekistan." pipeline([en_text], max_length=512) ``` ## Training data The legal_t5_small_trans_en_es_small_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the [JRC-ACQUIS](https://wt-public.emm4u.eu/Acquis/index_2.2.html), [EUROPARL](https://www.statmt.org/europarl/), and [DCEP](https://ec.europa.eu/jrc/en/language-technologies/dcep) datasets, consisting of 9 million parallel texts. ## Training procedure The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. ## Evaluation results When the model is used on the translation test dataset, it achieves the following results: Test results: | Model | BLEU score | |:-----:|:-----:| | legal_t5_small_trans_en_es_small_finetuned | 53.692| ### BibTeX entry and citation info > Created by [Ahmed Elnaggar/@Elnaggar_AI](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/)
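Not from the original card: a minimal sketch of the vocabulary step described under Preprocessing, using the `sentencepiece` library. The input file name, vocabulary size, and character coverage are illustrative assumptions rather than the authors' documented settings.

```python
import sentencepiece as spm

# parallel_corpus.txt is a hypothetical file holding the 88M lines drawn from all
# language pairs; vocab_size and character_coverage are illustrative guesses.
spm.SentencePieceTrainer.train(
    input="parallel_corpus.txt",
    model_prefix="legal_t5_unigram",
    model_type="unigram",
    vocab_size=32000,
    character_coverage=1.0,
)

sp = spm.SentencePieceProcessor(model_file="legal_t5_unigram.model")
print(sp.encode("Instructs its President to forward this resolution.", out_type=str))
```

Training the subword model once over the pooled corpus, as the cards describe, gives every language pair a shared vocabulary, which is what lets a single pretrained checkpoint be fine-tuned on any of the pairs.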
{"language": "English Spanish", "tags": ["translation English Spanish model"], "datasets": ["dcep europarl jrc-acquis"], "widget": [{"text": "Instructs its President to forward this resolution to the Council and Commission and the Government and Parliament of Uzbekistan."}]}
text2text-generation
SEBIS/legal_t5_small_trans_en_es_small_finetuned
[ "transformers", "pytorch", "jax", "t5", "text2text-generation", "translation English Spanish model", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:04+00:00
[]
[ "English Spanish" ]
TAGS #transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
legal\_t5\_small\_trans\_en\_es\_small\_finetuned model ======================================================= Model for translating legal text from English to Spanish. It was first released in this repository. This model was first pretrained on all the translation data using an unsupervised task. Then the model was trained on three parallel corpora: jrc-acquis, europarl and dcep. Model description ----------------- legal\_t5\_small\_trans\_en\_es\_small\_finetuned is initially pretrained on an unsupervised task using all of the data in the training set. The unsupervised task was "masked language modelling". legal\_t5\_small\_trans\_en\_es\_small\_finetuned is based on the 't5-small' model and was trained on a large corpus of parallel text. This is a smaller model, which scales the baseline model of t5 down by using 'dmodel = 512', 'dff = 2,048', 8-headed attention, and only 6 layers each in the encoder and decoder. This variant has about 60 million parameters. Intended uses & limitations --------------------------- The model could be used for translation of legal texts from English to Spanish. ### How to use Here is how to use this model to translate legal text from English to Spanish in PyTorch: Training data ------------- The legal\_t5\_small\_trans\_en\_es\_small\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts. Training procedure ------------------ The model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. ### Preprocessing A unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model. ### Pretraining The pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly. Evaluation results ------------------ When the model is used on the translation test dataset, it achieves the following results: Test results: ### BibTeX entry and citation info > > Created by Ahmed Elnaggar/@Elnaggar\_AI | LinkedIn > > >
[ "### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_es\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ "TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_es\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.", "### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.", "### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:", "### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ 57, 208, 50, 68, 34 ]
[ "passage: TAGS\n#transformers #pytorch #jax #t5 #text2text-generation #translation English Spanish model #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to translate legal text from English to Spanish in PyTorch:\n\n\nTraining data\n-------------\n\n\nThe legal\\_t5\\_small\\_trans\\_en\\_es\\_small\\_finetuned model (covering the supervised task, which involved only the corresponding language pair, as well as the unsupervised task, where the data of all language pairs was available) was trained on the JRC-ACQUIS, EUROPARL, and DCEP datasets, consisting of 9 million parallel texts.\n\n\nTraining procedure\n------------------\n\n\nThe model was trained on a single TPU Pod V3-8 for 250K steps in total, using sequence length 512 (batch size 4096). It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.### Preprocessing\n\n\nA unigram model was trained on 88M lines of text from the parallel corpus (covering all possible language pairs) to build the vocabulary (with byte pair encoding), which is used with this model.### Pretraining\n\n\nThe pre-training data was the combined data from all the 42 language pairs. The task for the model was to predict the portions of a sentence which were masked randomly.\n\n\nEvaluation results\n------------------\n\n\nWhen the model is used on the translation test dataset, it achieves the following results:\n\n\nTest results:### BibTeX entry and citation info\n\n\n\n> \n> Created by Ahmed Elnaggar/@Elnaggar\\_AI | LinkedIn\n> \n> \n>" ]
[ -0.1002928614616394, 0.11876772344112396, -0.002728697843849659, 0.0747818574309349, 0.0703844428062439, 0.0032076858915388584, 0.02305581606924534, 0.08494828641414642, -0.024194808676838875, 0.07042489945888519, 0.03329843282699585, -0.016417376697063446, 0.035805583000183105, 0.035474639385938644, 0.042262982577085495, -0.2154075801372528, 0.002944441046565771, -0.05118698254227638, -0.04786910116672516, 0.09931819885969162, 0.07951484620571136, -0.05269370973110199, 0.03989322856068611, -0.0014079033862799406, -0.045201849192380905, 0.025040503591299057, -0.08522354811429977, -0.05182002857327461, 0.10697835683822632, 0.11570747196674347, 0.08075014501810074, -0.0044310069642961025, 0.06018116697669029, -0.18975992500782013, -0.005333990789949894, 0.0668366327881813, -0.0074106850661337376, 0.037777166813611984, 0.12588942050933838, -0.0022766324691474438, 0.14935626089572906, -0.006846041418612003, 0.03255608305335045, 0.055009469389915466, -0.11978909373283386, -0.10954233258962631, -0.04976492375135422, -0.013684223406016827, 0.07010689377784729, 0.12819306552410126, -0.046122629195451736, 0.060784753412008286, -0.0454055480659008, 0.0630214512348175, 0.08550482243299484, -0.2156347930431366, -0.014719253405928612, 0.015498299151659012, 0.019492555409669876, 0.08141039311885834, -0.01143638789653778, 0.0072142877615988255, 0.07481592893600464, 0.05494821071624756, 0.04665398597717285, -0.04136331006884575, -0.038324009627103806, -0.03938640281558037, -0.1282324194908142, -0.05920237675309181, 0.14251549541950226, 0.039073072373867035, -0.03736823424696922, -0.09008639305830002, -0.0625341609120369, -0.06087202951312065, 0.00421583978459239, -0.04582936689257622, 0.009794737212359905, -0.004808096215128899, 0.06786713004112244, -0.02288791351020336, -0.11278615891933441, -0.06196796894073486, -0.07417814433574677, 0.1514500230550766, 0.03702453523874283, 0.018282728269696236, 0.049069490283727646, 0.07885028421878815, -0.12308201938867569, -0.06416451930999756, 0.024698948487639427, 0.022471541538834572, -0.10521842539310455, 0.016615556553006172, -0.038264937698841095, -0.18134881556034088, -0.02515384368598461, -0.02489902824163437, -0.104781873524189, 0.033976368606090546, 0.09752792119979858, 0.03959360718727112, 0.06496470421552658, 0.09666280448436737, -0.13291440904140472, -0.11352083086967468, -0.035083118826150894, 0.00788295827805996, -0.014165714383125305, 0.02675279974937439, -0.08619897812604904, -0.030411895364522934, 0.018259290605783463, 0.03264572471380234, -0.004840729292482138, 0.0201632808893919, -0.03053690679371357, -0.0443749874830246, 0.13240575790405273, -0.08986565470695496, -0.0007552350289188325, -0.006132085341960192, -0.10582278668880463, -0.03175489231944084, 0.08480356633663177, -0.009002093225717545, -0.07893311232328415, 0.044316988438367844, -0.022735005244612694, -0.03321937099099159, -0.10932754725217819, -0.1839459389448166, -0.010250289924442768, 0.007344034966081381, -0.06412199139595032, -0.11715955287218094, -0.1090584322810173, -0.09867695719003677, 0.04388628527522087, -0.05675575137138367, -0.008835867047309875, -0.09240053594112396, 0.010402633808553219, 0.004734633956104517, -0.013791099190711975, 0.06383491307497025, -0.035128626972436905, 0.02420058101415634, 0.042227502912282944, 0.07685522735118866, 0.004994040355086327, 0.0289893988519907, -0.10325099527835846, 0.05095530301332474, -0.14480797946453094, 0.17125515639781952, -0.024374008178710938, 0.02017669752240181, -0.1295166313648224, -0.04841893911361694, -0.07233034819364548, 
0.08299154043197632, 0.05920013040304184, 0.12121167778968811, -0.18678532540798187, -0.027710620313882828, 0.20184308290481567, -0.07022595405578613, -0.08338258415460587, 0.13836905360221863, -0.03707132115960121, 0.05081620439887047, 0.09583435952663422, 0.08477164804935455, 0.06784429401159286, -0.03353345766663551, -0.02986830845475197, -0.008254463784396648, 0.02951699122786522, 0.04169667512178421, 0.08427836000919342, -0.06160630285739899, 0.0077618821524083614, 0.03198036551475525, -0.0006939888698980212, 0.032087065279483795, -0.02302844636142254, -0.037240080535411835, 0.009274771437048912, -0.05943725258111954, -0.008620214648544788, 0.03050902858376503, 0.02047153376042843, -0.06178498640656471, -0.08617980033159256, 0.056420713663101196, 0.09326354414224625, -0.04678221419453621, -0.000701744167599827, -0.01147216372191906, -0.05514354631304741, -0.12660011649131775, -0.002331526018679142, -0.18330025672912598, -0.02221999317407608, 0.009144259616732597, -0.05973788723349571, 0.11492421478033066, 0.08196242153644562, 0.06056500971317291, 0.11023763567209244, -0.05570772662758827, -0.018620042130351067, 0.0050821080803871155, -0.012083794921636581, -0.09347468614578247, -0.11257375031709671, -0.034637849777936935, -0.03162093088030815, 0.005758046638220549, -0.1321329027414322, -0.005239798221737146, -0.06302428990602493, 0.08768703788518906, 0.011788303032517433, -0.013530915603041649, 0.03436781466007233, 0.07896897196769714, -0.03251097351312637, -0.02603945881128311, 0.03211168944835663, -0.0023022473324090242, -0.05701195448637009, 0.0938328430056572, -0.14618302881717682, -0.077863909304142, 0.07931693643331528, 0.002877691062167287, -0.12604008615016937, -0.053711239248514175, -0.009471754543483257, -0.04895378276705742, -0.06411026418209076, -0.07376591861248016, 0.25516271591186523, 0.04619889333844185, 0.16379353404045105, -0.12126494944095612, -0.008661879226565361, 0.017869170755147934, -0.024423398077487946, -0.020299432799220085, 0.1711544394493103, 0.09938198328018188, -0.1776009500026703, 0.08857082575559616, -0.019442889839410782, -0.03144308552145958, 0.10679130256175995, 0.07893271744251251, -0.07948445528745651, -0.009512750431895256, 0.04555302858352661, 0.006001235917210579, 0.03317783400416374, -0.08978893607854843, -0.017849158495664597, 0.025831660255789757, 0.048808686435222626, 0.06231051683425903, -0.10375285893678665, 0.05538276955485344, 0.07897670567035675, -0.014504725113511086, 0.07982460409402847, -0.04114952310919762, -0.032179251313209534, 0.08546885848045349, 0.02988814003765583, -0.06251244246959686, -0.04375062510371208, -0.028820697218179703, -0.1065855622291565, 0.1977999359369278, -0.07048644125461578, -0.2559197247028351, -0.12136121839284897, 0.07769647985696793, -0.03964049369096756, 0.029848787933588028, 0.03169556334614754, -0.06635384261608124, -0.05323636531829834, -0.10921134799718857, 0.09580246359109879, -0.09420685470104218, -0.0529928132891655, -0.12034979462623596, 0.06048109382390976, 0.004169676452875137, -0.12012231349945068, 0.021863818168640137, 0.011958508752286434, -0.009826117195189, -0.007893065921962261, -0.06940001994371414, 0.11926546692848206, 0.15378247201442719, -0.005632500164210796, -0.014732741750776768, -0.0031105352099984884, 0.13799092173576355, -0.07205834239721298, 0.04670320078730583, 0.06904511898756027, 0.06529679149389267, 0.033043645322322845, 0.15817973017692566, 0.04664209485054016, -0.07652570307254791, 0.05807676911354065, 0.06167985871434212, -0.02112434245646, -0.19766013324260712, 
-0.13106071949005127, -0.05917121097445488, -0.036967139691114426, 0.11428548395633698, 0.028381410986185074, -0.018180804327130318, 0.03694380819797516, -0.050379566848278046, 0.06537767499685287, -0.03774450719356537, 0.04161833971738815, 0.07139696180820465, -0.01065909955650568, 0.07973472774028778, -0.05004056915640831, -0.06733033061027527, 0.09928552806377411, 0.02122689224779606, 0.19174456596374512, -0.04802046716213226, 0.22178108990192413, 0.07214509695768356, 0.044228702783584595, 0.015524396672844887, 0.033308349549770355, -0.040448110550642014, 0.021545633673667908, -0.034841764718294144, -0.06395340710878372, -0.016106415539979935, 0.07656227052211761, 0.005125568248331547, 0.008053301833570004, -0.0570957288146019, -0.05385943874716759, 0.03043835051357746, 0.20008981227874756, 0.06307058036327362, -0.2294314056634903, -0.06450801342725754, -0.009493928402662277, -0.05739039182662964, -0.05603991821408272, 0.0067801037803292274, 0.14052936434745789, -0.08430534601211548, 0.010187317617237568, 0.0018113894620910287, 0.12450437247753143, -0.13899430632591248, -0.020067373290657997, 0.044333938509225845, 0.05338506028056145, -0.008226669393479824, 0.12922966480255127, -0.19719024002552032, 0.1885576695203781, 0.018335070461034775, 0.038683366030454636, -0.04658935219049454, -0.005072595551609993, -0.036740873008966446, 0.008276769891381264, 0.09003391861915588, 0.01368713192641735, -0.0028042325284332037, -0.14200520515441895, -0.09430447965860367, -0.024192217737436295, 0.053348131477832794, -0.062056273221969604, 0.10764995217323303, 0.061829835176467896, 0.006492012646049261, -0.01054516900330782, 0.03596261143684387, -0.07188943773508072, -0.1688670665025711, -0.006551982834935188, 0.0071637495420873165, -0.03150554001331329, -0.008451693691313267, -0.05619274824857712, -0.07801682502031326, 0.20976632833480835, -0.04833327233791351, -0.07462026923894882, -0.08089876174926758, 0.03164754807949066, 0.1544959843158722, -0.07504617422819138, 0.015183297917246819, 0.0011325026862323284, 0.059586506336927414, -0.03741047903895378, -0.0383821502327919, 0.11017774790525436, -0.0917096957564354, -0.08066669851541519, -0.0713510662317276, 0.10577676445245743, 0.07702276855707169, 0.04652747884392738, -0.015890028327703476, 0.032601889222860336, 0.006391298025846481, -0.11226454377174377, -0.029845671728253365, 0.062080398201942444, 0.1036107987165451, 0.04912453889846802, -0.04215564578771591, -0.019942566752433777, -0.05970464646816254, -0.0712953507900238, 0.0986013114452362, 0.15698015689849854, -0.06249561905860901, 0.04271097481250763, 0.1966627538204193, -0.08889279514551163, -0.14870339632034302, -0.05050240457057953, 0.11498869955539703, 0.10006255656480789, 0.004559132270514965, -0.1761258989572525, 0.03866927698254585, 0.08771765977144241, -0.007130805868655443, 0.04291597753763199, -0.40475335717201233, -0.14299596846103668, 0.047199223190546036, 0.044028908014297485, -0.023215709254145622, -0.10903853923082352, -0.06380807608366013, -0.09379036724567413, -0.03681214153766632, 0.09358883649110794, -0.009191174060106277, 0.09370345622301102, -0.002356872893869877, 0.05200846120715141, 0.05206752568483353, -0.03318837285041809, 0.1254454255104065, -0.016692819073796272, 0.028970791026949883, -0.06199508532881737, 0.054110053926706314, 0.0156659297645092, -0.014145337976515293, 0.1567337065935135, -0.06648803502321243, 0.059196244925260544, -0.13398411870002747, -0.09174230694770813, -0.05879991874098778, 0.007865400053560734, -0.028992988169193268, -0.06548576802015305, 
-0.052750810980796814, 0.03783983364701271, 0.03856641426682472, -0.008113688789308071, -0.03644029051065445, -0.0485195554792881, 0.016582174226641655, 0.19479770958423615, 0.10980414599180222, 0.0243466068059206, -0.14296074211597443, 0.031849008053541183, 0.0023721973411738873, 0.07074781507253647, -0.10158734023571014, -0.003973689861595631, 0.139200359582901, 0.02715347893536091, 0.11349320411682129, -0.024117926135659218, -0.15937486290931702, 0.020195353776216507, 0.06382756680250168, -0.06021753326058388, -0.1518608182668686, -0.03214922919869423, 0.08462821692228317, -0.06987760961055756, -0.015722353011369705, 0.10717713087797165, -0.10239585489034653, -0.019200732931494713, -0.019502034410834312, 0.003961994778364897, -0.024839211255311966, 0.20738042891025543, 0.04967786371707916, 0.0387883260846138, -0.05590121075510979, 0.09736555069684982, 0.12334392219781876, -0.13453032076358795, 0.02781699411571026, 0.15461395680904388, -0.10293522477149963, -0.05795768275856972, 0.03524395450949669, 0.10781852900981903, -0.06831827014684677, -0.07423568516969681, -0.05134158581495285, -0.04478617385029793, 0.018874159082770348, 0.01405902300029993, 0.054510463029146194, 0.01524147018790245, -0.02834508940577507, -0.04707588627934456, -0.06140942499041557, 0.07799456268548965, 0.08585423231124878, 0.01878804713487625, -0.04927002638578415, 0.12272688001394272, 0.03424115478992462, -0.04159058257937431, -0.011740770190954208, 0.0006621251231990755, -0.05315827950835228, 0.021018795669078827, -0.11223278939723969, 0.009718788787722588, -0.054718419909477234, -0.008935678750276566, -0.01774638146162033, -0.0033428622409701347, -0.009297246113419533, -0.00393203180283308, -0.046624135226011276, -0.04206852987408638, -0.039149828255176544, 0.01177059393376112, -0.07200201600790024, -0.03791511058807373, 0.007625817786902189, -0.009866084903478622, 0.033337824046611786, -0.013852115720510483, 0.004771155770868063, 0.009868850000202656, -0.016497734934091568, 0.0706687793135643, 0.03534100577235222, 0.05315679311752319, 0.005047324113547802, -0.0723315179347992, 0.017851412296295166, 0.0379389263689518, 0.02129078283905983, -0.016382139176130295, 0.024765949696302414, -0.1378563642501831, -0.03624052554368973, 0.000023572709324071184, -0.039874933660030365, -0.08653631806373596, 0.08977028727531433, 0.07869274914264679, 0.044946182519197464, 0.0939265787601471, -0.07081327587366104, 0.05883190035820007, -0.17069712281227112, -0.02138501964509487, 0.004422955214977264, 0.015105982311069965, -0.043155066668987274, 0.0019028467359021306, 0.04781600832939148, -0.04275907948613167, 0.1536356657743454, 0.02411864697933197, 0.08973275125026703, 0.004931826144456863, -0.09094075858592987, 0.032953351736068726, 0.008701995015144348, 0.08251628279685974, -0.0021539092995226383, -0.0094075882807374, -0.08152598142623901, 0.11191051453351974, 0.00436008907854557, 0.08357217162847519, 0.01838085986673832, 0.11511262506246567, 0.09651806205511093, 0.03927840292453766, 0.015485716052353382, -0.0861673653125763, -0.03995964676141739, 0.04278611019253731, -0.022592727094888687, 0.06509983539581299, -0.02222367748618126, 0.1250898689031601, 0.1380329579114914, -0.12025320529937744, 0.13413050770759583, 0.0068785082548856735, -0.07397061586380005, -0.049793899059295654, -0.1464690864086151, -0.06169385835528374, -0.059055328369140625, -0.042425476014614105, -0.09594638645648956, -0.002907433779910207, 0.05262171849608421, 0.04896215349435806, -0.022882254794239998, 0.11796358227729797, -0.005704039242118597, 
-0.11819495260715485, 0.049897681921720505, -0.0023293725680559874, 0.09670567512512207, 0.020531702786684036, 0.04283805564045906, 0.0858030840754509, -0.0011305701918900013, 0.027162158861756325, 0.0630916878581047, -0.04579540342092514, -0.008075863122940063, 0.030430439859628677, -0.06034237518906593, -0.04997189715504646, 0.016509108245372772, 0.06752602010965347, 0.19067972898483276, 0.0548047237098217, -0.07240305840969086, -0.0392821729183197, 0.17367218434810638, -0.05867563933134079, -0.058399640023708344, -0.10858076065778732, 0.22266438603401184, 0.022253621369600296, 0.05414297431707382, -0.009094341658055782, -0.09837498515844345, -0.00021137717703823, 0.13304518163204193, 0.18452118337154388, -0.05611201003193855, -0.037738554179668427, 0.021687520667910576, -0.015403350815176964, 0.01638222485780716, 0.04515998438000679, 0.00604863278567791, 0.30506786704063416, -0.07876428961753845, 0.07007955014705658, -0.06020132452249527, 0.005537794902920723, 0.015087681822478771, 0.12841418385505676, -0.0016538412310183048, 0.02929222583770752, -0.06138657405972481, 0.09982912987470627, 0.0071087065152823925, -0.16618934273719788, 0.022066842764616013, -0.09801419824361801, -0.11632794886827469, -0.014231800101697445, -0.02465364709496498, 0.075813427567482, 0.08023715764284134, 0.0025535565800964832, 0.03579569235444069, 0.05002715811133385, 0.004573690705001354, -0.14558307826519012, -0.13756859302520752, 0.005912202410399914, 0.0009241701336577535, 0.09882569313049316, -0.004373915959149599, 0.14108096063137054, 0.09295712411403656, 0.002685046987608075, -0.09840882569551468, 0.09631437808275223, 0.026736551895737648, 0.019233206287026405, 0.10935669392347336, 0.06541126221418381, 0.0005670774262398481, 0.06299971044063568, 0.03634994104504585, -0.08392121642827988, 0.05676071345806122, -0.037743229418992996, -0.016620026901364326, -0.14625874161720276, 0.09637252241373062, -0.03492242842912674, 0.1399591714143753, 0.1818946897983551, -0.01904768869280815, -0.0130172623321414, -0.04290907457470894, 0.037797726690769196, -0.0031201159581542015, 0.07753292471170425, -0.020363137125968933, -0.19475044310092926, 0.011323881335556507, -0.021765345707535744, 0.0254839975386858, -0.2748437225818634, -0.044799961149692535, 0.0342676043510437, -0.053331077098846436, 0.025566313415765762, 0.05674046277999878, 0.0644230917096138, 0.06317279487848282, -0.04354134202003479, -0.08185505121946335, -0.010988320223987103, 0.10957420617341995, -0.11570071429014206, -0.10824121534824371 ]