# CodeTrans model for code documentation generation ruby
Pretrained model on the Ruby programming language, using the t5-base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Ruby functions, so it works best when its input is tokenized the same way.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model and was trained with single-task training on the Ruby subset of the CodeSearchNet Corpus.
## Intended uses & limitations
The model can be used to generate a description for a Ruby function, or it can be fine-tuned on other Ruby code tasks. It accepts unparsed and untokenized Ruby code, but performance should be better when the Ruby code is tokenized.
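The card does not include the tokenizer that produced this spacing, but as a rough illustration, a whitespace-separated format similar to the example input used below can be approximated with a small regex splitter. This is a hypothetical helper, not the Ruby tokenizer used in the original CodeTrans preprocessing:
```python
# Hypothetical helper: a rough approximation of the whitespace-separated token
# format used in the example input below. The original CodeTrans preprocessing
# used a proper Ruby tokenizer, which this simple regex does not replace
# (multi-character operators such as || get split here, for instance).
import re

def rough_tokenize_ruby(code: str) -> str:
    # Keep identifiers (including trailing ? or !), numbers, and single
    # punctuation/operator characters as separate tokens.
    tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_?!]*|\d+|[^\sA-Za-z0-9_]", code)
    return " ".join(tokens)

print(rough_tokenize_ruby("def add(a, b); a + b; end"))
# => def add ( a , b ) ; a + b ; end
```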
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in the [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/ruby/base_model.ipynb).
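The pipeline returns one dict per input; continuing the snippet above, the generated documentation string is read from the `summary_text` field:
```python
# Continues the snippet above. SummarizationPipeline returns a list with one
# dict per input; the generated docstring is stored under "summary_text".
result = pipeline([tokenized_code])
print(result[0]["summary_text"])
```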
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Evaluation results
For the code documentation task, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
# CodeTrans model for code documentation generation ruby
Pretrained model on the Ruby programming language, using the t5-base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Ruby functions, so it works best when its input is tokenized the same way.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model and was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model can be used to generate a description for a Ruby function, or it can be fine-tuned on other Ruby code tasks. It accepts unparsed and untokenized Ruby code, but performance should be better when the Ruby code is tokenized.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in the [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/ruby/base_model.ipynb).
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using a sequence length of 512 (batch size 4096).
It has approximately 220M parameters in total and uses the encoder-decoder architecture.
Pre-training used the AdaFactor optimizer with an inverse-square-root learning-rate schedule.
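For reference, the snippet below sketches such an AdaFactor configuration in PyTorch. It is illustrative only: the original pre-training ran on the T5 TPU stack rather than this code, and the model is loaded here just to obtain parameters to optimize.
```python
# Illustrative sketch only; the original pre-training did not use this code.
# With relative_step=True, transformers' Adafactor applies its built-in
# inverse-square-root decaying learning rate, matching the schedule above.
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor, AdafactorSchedule

model = T5ForConditionalGeneration.from_pretrained(
    "SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask"
)
optimizer = Adafactor(
    model.parameters(),
    lr=None,                # let Adafactor derive the step size internally
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
)
lr_schedule = AdafactorSchedule(optimizer)  # exposes the internal LR, e.g. for logging
```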
## Evaluation results
For the code documentation task, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
# CodeTrans model for code documentation generation ruby
Pretrained model on the Ruby programming language, using the t5-base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized Ruby functions, so it works best when its input is tokenized the same way.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It was trained with multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets, and then fine-tuned on the code documentation generation task for Ruby functions/methods.
## Intended uses & limitations
The model can be used to generate a description for a Ruby function, or it can be fine-tuned on other Ruby code tasks. It accepts unparsed and untokenized Ruby code, but performance should be better when the Ruby code is tokenized.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask_finetune", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in the [Colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/ruby/base_model.ipynb).
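The same model can also be driven without the pipeline wrapper, by calling `generate` on the underlying encoder-decoder directly. The generation parameters below (beam size, output length) are illustrative choices, not values from this card:
```python
# Pipeline-free variant of the snippet above; num_beams and max_length are
# illustrative, not values taken from the model card.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
inputs = tokenizer(tokenized_code, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```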
## Training data
The datasets for the supervised training tasks can be downloaded from [this link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1).
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using a sequence length of 512 (batch size 4096).
It has approximately 220M parameters in total and uses the encoder-decoder architecture.
Pre-training used the AdaFactor optimizer with an inverse-square-root learning-rate schedule.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 12,000 steps in total, using a sequence length of 512 (batch size 256) and only the dataset containing Ruby code.
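A minimal sketch of an analogous fine-tuning setup with the `transformers` Trainer is shown below, under stated assumptions: the original fine-tuning used the T5 TPU stack rather than this code, the toy one-example dataset merely stands in for the Ruby-only corpus linked above, and the learning rate is assumed since the card does not state it.
```python
# Hypothetical sketch of a fine-tuning loop analogous to the one described
# above; not the code used to produce this checkpoint.
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
)

# Start from the multi-task pre-trained checkpoint, as the card describes.
model_name = "SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Toy stand-in for the Ruby-only split; the real data is the corpus linked above.
features = tokenizer(
    ["def add ( a , b ) a + b end"],
    text_target=["Add two numbers ."],
    truncation=True,
)
train_dataset = Dataset.from_dict(dict(features))

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-ruby-finetune",
    max_steps=12_000,               # step count stated in the card
    per_device_train_batch_size=8,  # the card's batch size 256 was spread across a TPU pod
    learning_rate=1e-4,             # assumed; the card does not state the fine-tuning LR
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),  # pads inputs and labels
)
trainer.train()
```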
## Evaluation results
For the code documentation task, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
|   CodeTrans-ST-Small    |     17.31      |     16.65      |     16.89      |     23.05      |      9.19      |     13.70      |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_base_code_documentation_generation_ruby_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 12,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 12,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 12,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
109
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 12,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09281381964683533,
0.07455319166183472,
-0.0018988167867064476,
0.13153773546218872,
0.05681499093770981,
0.021567514166235924,
0.03330884873867035,
0.11134716123342514,
-0.043329868465662,
0.06900922954082489,
0.056657057255506516,
-0.03953693062067032,
0.04301949590444565,
0.1519271582365036,
0.024449553340673447,
-0.17518600821495056,
-0.04613810405135155,
0.011747809126973152,
-0.054252393543720245,
0.10565830022096634,
0.08594229072332382,
-0.06729991734027863,
0.0646188035607338,
-0.05141499638557434,
-0.1329227089881897,
0.043617717921733856,
-0.031120995059609413,
-0.005530969239771366,
0.09957639873027802,
0.04837551340460777,
0.10761325806379318,
-0.032389748841524124,
0.04660910367965698,
-0.20776568353176117,
0.003856124822050333,
0.02470317855477333,
0.06051720678806305,
0.02705102227628231,
0.07374552637338638,
0.06604330241680145,
0.12331727892160416,
-0.017272425815463066,
0.050665754824876785,
0.06425530463457108,
-0.06143634766340256,
-0.09492836892604828,
-0.07071862369775772,
0.07843801379203796,
0.06932035088539124,
0.0844484344124794,
-0.011502892710268497,
0.01123510580509901,
-0.06743098795413971,
0.09579858928918839,
0.14447340369224548,
-0.2498890459537506,
-0.02775225043296814,
0.14057354629039764,
0.07843747735023499,
0.06192879378795624,
-0.07272862643003464,
-0.02809465304017067,
0.10355504602193832,
0.04904773086309433,
0.036452893167734146,
-0.09150632470846176,
-0.04620848596096039,
-0.0011260384926572442,
-0.0573815256357193,
-0.05177717283368111,
0.10995341092348099,
0.02409176155924797,
-0.060391880571842194,
-0.10251963883638382,
-0.05529696121811867,
-0.20724093914031982,
0.03512831777334213,
0.032872069627046585,
0.01248394139111042,
0.0034230106975883245,
0.00925009697675705,
-0.01576954685151577,
-0.07412995398044586,
-0.12009801715612411,
0.03886597976088524,
0.04403859004378319,
0.07501489669084549,
0.03569124639034271,
-0.019506536424160004,
0.08001872152090073,
-0.006351626943796873,
-0.03977591171860695,
-0.03336160257458687,
0.0036422216799110174,
-0.12108022719621658,
0.045086897909641266,
-0.016160350292921066,
-0.025565946474671364,
0.009810893796384335,
0.09975025802850723,
-0.09925241768360138,
0.09458333998918533,
0.0789807140827179,
0.021777277812361717,
-0.01199826505035162,
0.20920096337795258,
0.06517710536718369,
-0.14555105566978455,
0.0193287692964077,
0.0066604577004909515,
0.006079858634620905,
0.0050899269990623,
-0.060655660927295685,
-0.05958437919616699,
0.035586368292570114,
0.0699692964553833,
-0.1252429038286209,
0.010170224122703075,
-0.0563737191259861,
-0.006465301848948002,
0.08705154061317444,
-0.10751574486494064,
0.0353112556040287,
0.009117699228227139,
-0.0586564727127552,
-0.04266022518277168,
0.059047020971775055,
-0.12050104141235352,
-0.10554993152618408,
0.04476713389158249,
-0.031372327357530594,
-0.02349763736128807,
-0.11164826154708862,
-0.1110239326953888,
-0.015089095570147038,
-0.08297213912010193,
0.01052346732467413,
-0.10414479672908783,
-0.11606863141059875,
-0.02275949902832508,
0.03150831535458565,
-0.016945362091064453,
-0.03380932658910751,
-0.04450617730617523,
0.012893036007881165,
-0.0050939577631652355,
-0.021735291928052902,
0.009371619671583176,
-0.028999902307987213,
0.09588510543107986,
0.08746710419654846,
0.04247456416487694,
0.0029561901465058327,
0.023797374218702316,
-0.08340451121330261,
0.07736308872699738,
-0.09817671030759811,
0.08497186005115509,
-0.02112812176346779,
0.06707087904214859,
-0.10091634094715118,
-0.08876649290323257,
0.017673509195446968,
0.049517661333084106,
0.0637572631239891,
0.031114688143134117,
-0.14041262865066528,
0.022677749395370483,
0.17802616953849792,
-0.11291609704494476,
-0.11237811297178268,
0.12796638906002045,
-0.00947489496320486,
-0.006873596925288439,
0.07649090886116028,
0.1505700647830963,
0.1479954570531845,
-0.06970334053039551,
-0.04237063229084015,
0.0665627270936966,
0.0682290643453598,
-0.07147131860256195,
0.0824705958366394,
0.018302129581570625,
-0.006139304954558611,
0.03711974248290062,
0.056270599365234375,
0.041657716035842896,
0.006786860525608063,
-0.030269650742411613,
-0.04737168177962303,
-0.09241749346256256,
0.005654110107570887,
-0.003010017331689596,
0.023828737437725067,
-0.0696917325258255,
-0.07196586579084396,
-0.02044326812028885,
0.1765812635421753,
-0.08509892225265503,
0.030072243884205818,
-0.07857237756252289,
-0.03251146525144577,
-0.058175161480903625,
0.02247108891606331,
-0.13433986902236938,
0.04719443991780281,
0.06628578901290894,
-0.006198754534125328,
0.06708221882581711,
0.08614300936460495,
0.015193325467407703,
0.023424118757247925,
-0.06490140408277512,
-0.047777704894542694,
-0.04082195833325386,
-0.08728989213705063,
-0.10323366522789001,
-0.026437364518642426,
-0.08001456409692764,
-0.040642816573381424,
-0.03654544800519943,
-0.17731286585330963,
0.005271461326628923,
0.002237581415101886,
0.022682147100567818,
0.03131936490535736,
-0.04074809327721596,
0.03401437774300575,
0.05761491879820824,
-0.04311433807015419,
-0.08265924453735352,
0.030157264322042465,
0.03589743748307228,
-0.07957668602466583,
-0.04333705082535744,
-0.06298649311065674,
-0.06829974800348282,
0.06493353098630905,
0.11019473522901535,
-0.10454218089580536,
-0.02498067356646061,
-0.01693187654018402,
-0.04357152804732323,
-0.04302014410495758,
-0.04872719198465347,
0.20573028922080994,
0.01324516348540783,
0.16033123433589935,
-0.13643424212932587,
-0.04423679783940315,
-0.02594490349292755,
-0.0043017989955842495,
0.04310927540063858,
0.16263459622859955,
-0.009279347956180573,
-0.10084395110607147,
0.04273217171430588,
-0.06536706537008286,
-0.07455597072839737,
0.15566174685955048,
-0.0126945236697793,
-0.051927242428064346,
0.005688678007572889,
0.12079903483390808,
0.0046995701268315315,
0.14409112930297852,
-0.06637943536043167,
-0.0006661555380560458,
-0.007689633406698704,
0.01371481642127037,
0.03929545357823372,
-0.13089942932128906,
0.01904594711959362,
0.037441834807395935,
-0.06245625391602516,
-0.032485246658325195,
-0.016839737072587013,
-0.043186500668525696,
0.03617805987596512,
0.012220772914588451,
0.020689046010375023,
-0.015408149920403957,
-0.021873464807868004,
-0.09061124175786972,
0.18977952003479004,
-0.0851982831954956,
-0.18782122433185577,
-0.16746486723423004,
0.045570384711027145,
-0.044514454901218414,
-0.02020096965134144,
0.04797222465276718,
-0.12900309264659882,
-0.03680909797549248,
-0.08272199332714081,
0.11658433824777603,
-0.1400698870420456,
0.004091512877494097,
-0.0331554114818573,
0.0631440281867981,
0.041430581361055374,
-0.16285686194896698,
0.025131916627287865,
-0.015263635665178299,
0.002514706691727042,
0.00580953061580658,
-0.06098436191678047,
0.07136669009923935,
0.12322629243135452,
-0.06379535049200058,
0.022194039076566696,
-0.012666952796280384,
0.1691497415304184,
-0.04054826870560646,
0.020241742953658104,
0.19637618958950043,
0.02015101909637451,
0.045798107981681824,
0.06037566065788269,
0.017611101269721985,
-0.08947112411260605,
0.06190275773406029,
0.049607161432504654,
-0.04645863175392151,
-0.25750499963760376,
-0.019002560526132584,
-0.07706832885742188,
0.07578025013208389,
0.11857854574918747,
0.0681549683213234,
-0.15051187574863434,
0.02610928937792778,
-0.00264989142306149,
0.15800906717777252,
-0.03581855818629265,
0.06268304586410522,
0.012993916869163513,
0.02036929689347744,
0.009900243021547794,
-0.09846227616071701,
0.017108267173171043,
0.0767945796251297,
0.11707807332277298,
0.22731272876262665,
-0.08991856127977371,
0.15594585239887238,
0.013201972469687462,
0.10339068621397018,
0.05535909906029701,
0.07952272891998291,
-0.14033684134483337,
0.008099510334432125,
0.00041237103869207203,
-0.015013518743216991,
-0.07409530878067017,
0.05119112879037857,
-0.03764462098479271,
0.04565347731113434,
-0.035593222826719284,
0.019957847893238068,
0.013444386422634125,
0.20156146585941315,
0.06725770980119705,
-0.16261190176010132,
-0.11472739279270172,
0.0112081840634346,
-0.08220567554235458,
-0.09962233155965805,
0.0727735385298729,
0.18650662899017334,
-0.05522851273417473,
0.03377479314804077,
-0.02694174461066723,
0.13546156883239746,
-0.11995194852352142,
-0.024249641224741936,
0.022763390094041824,
0.06427115947008133,
0.011194398626685143,
0.11145263910293579,
-0.2614656984806061,
0.07731259614229202,
0.013211138546466827,
0.08448295295238495,
-0.0007081215153448284,
0.06613604724407196,
-0.036216091364622116,
0.007869324646890163,
0.07188257575035095,
0.00807146541774273,
-0.08349063247442245,
-0.19536623358726501,
-0.03573690727353096,
0.017236949875950813,
0.061859648674726486,
-0.008776175789535046,
0.08566498756408691,
-0.026101719588041306,
0.049708280712366104,
-0.04195813462138176,
-0.1551072597503662,
-0.05478474497795105,
-0.13112638890743256,
-0.03992043063044548,
-0.016406096518039703,
-0.05454907938838005,
-0.024939341470599174,
0.035286590456962585,
0.03909394145011902,
0.20789910852909088,
-0.15262386202812195,
-0.09510792791843414,
-0.08281167596578598,
0.06490366905927658,
0.12160823494195938,
-0.10511598736047745,
0.03308695927262306,
0.006602330133318901,
0.038698483258485794,
-0.04929640516638756,
-0.06280624121427536,
0.021146951243281364,
-0.05119767412543297,
-0.08972059190273285,
-0.028553688898682594,
0.0955076664686203,
-0.032001033425331116,
0.050507802516222,
-0.004233585670590401,
-0.06297542154788971,
-0.03746342658996582,
-0.12286786735057831,
-0.06313357502222061,
-0.013888008892536163,
0.01343518029898405,
0.00038812693674117327,
-0.08991756290197372,
0.0823061540722847,
-0.01209639385342598,
-0.08879918605089188,
0.09071241319179535,
0.19552575051784515,
-0.08674222975969315,
0.022588595747947693,
0.09004667401313782,
-0.05218230187892914,
-0.17291046679019928,
-0.048713404685258865,
0.056107696145772934,
0.06685247272253036,
-0.018561922013759613,
-0.16499587893486023,
0.04374212771654129,
0.008086785674095154,
0.010552876628935337,
0.012905361130833626,
-0.28222519159317017,
-0.11759293079376221,
-0.006498364731669426,
0.0727972760796547,
0.08270685374736786,
-0.10929200053215027,
-0.05529194325208664,
-0.0630861222743988,
-0.036903828382492065,
0.011669347994029522,
0.06670556962490082,
0.12067782133817673,
-0.04881495609879494,
0.02651595138013363,
0.04141781106591225,
-0.023911118507385254,
0.06349359452724457,
-0.030521530658006668,
0.0984518975019455,
-0.009704580530524254,
0.009272708557546139,
0.030344421043992043,
-0.05883212387561798,
0.16051161289215088,
-0.19495876133441925,
0.10583814233541489,
-0.1873544156551361,
-0.0469806008040905,
-0.004964343272149563,
-0.026605920866131783,
-0.029965007677674294,
-0.05350076034665108,
-0.1144132912158966,
0.025676298886537552,
0.03544534370303154,
-0.024678625166416168,
0.0494239442050457,
-0.016953375190496445,
-0.043767765164375305,
0.09740792214870453,
0.053584564477205276,
0.02906086854636669,
-0.16378362476825714,
0.01647166907787323,
0.023692987859249115,
0.08370829373598099,
-0.2301737666130066,
0.010861127637326717,
0.10423920303583145,
0.030536526814103127,
0.095603808760643,
0.0077979289926588535,
-0.08770016580820084,
0.056926850229501724,
0.06935960054397583,
-0.04469231516122818,
-0.12734723091125488,
-0.009206769987940788,
-0.01895863749086857,
-0.08891729265451431,
0.050433479249477386,
0.09686329960823059,
-0.06460100412368774,
-0.020167667418718338,
-0.0025809078942984343,
0.016969501972198486,
-0.06782617419958115,
0.19053813815116882,
0.028235435485839844,
0.08440221846103668,
-0.0685507133603096,
0.07681148499250412,
0.09728521853685379,
-0.07500524818897247,
0.022254155948758125,
0.16047623753547668,
-0.080149345099926,
-0.028070759028196335,
0.06908974051475525,
0.10476066172122955,
-0.013523014262318611,
-0.04922289773821831,
-0.10337518900632858,
-0.06172400712966919,
0.01732594706118107,
-0.002149285050109029,
0.08206342905759811,
0.07146474719047546,
-0.04624627158045769,
0.001808789442293346,
-0.09496889263391495,
0.096571184694767,
0.07894140481948853,
0.06269223988056183,
-0.16437870264053345,
0.09930670261383057,
0.044135112315416336,
0.08042743057012558,
0.008095109835267067,
0.026901893317699432,
-0.1038322001695633,
0.04150671511888504,
-0.034599389880895615,
0.057744935154914856,
0.016335057094693184,
0.06240721419453621,
-0.04434855282306671,
0.043834276497364044,
-0.02476993389427662,
0.04761644825339317,
-0.04709317907691002,
-0.027642235159873962,
-0.03345303237438202,
0.03413200378417969,
-0.05916577950119972,
-0.021230747923254967,
0.010658910498023033,
-0.07223331928253174,
0.10696940869092941,
-0.06908536702394485,
-0.017541630193591118,
0.009643085300922394,
0.03394756838679314,
0.06755127012729645,
0.029257474467158318,
0.039614249020814896,
-0.020404152572155,
0.01701296865940094,
0.0347733348608017,
-0.00618787994608283,
-0.0004901997745037079,
0.005123454611748457,
0.09373883157968521,
-0.14522220194339752,
-0.07895609736442566,
-0.10054377466440201,
-0.08016572892665863,
-0.059734027832746506,
0.08110985159873962,
0.08467653393745422,
0.10118326544761658,
0.09478454291820526,
-0.03835683688521385,
0.016132378950715065,
-0.1369907408952713,
-0.04153948649764061,
0.0482596829533577,
-0.026858801022171974,
-0.1253952980041504,
-0.04457448795437813,
0.0490490198135376,
-0.027635131031274796,
0.10907869786024094,
0.005509600974619389,
0.04448126256465912,
-0.02571939490735531,
-0.0339580662548542,
-0.04674632102251053,
-0.008712840266525745,
0.157046839594841,
-0.09702565521001816,
0.00915191974490881,
0.00033843712299130857,
0.007954703643918037,
0.027337772771716118,
0.18776096403598785,
0.09393303841352463,
0.15898461639881134,
0.05347638204693794,
0.07257074862718582,
-0.04748134687542915,
-0.012886895798146725,
-0.16164103150367737,
0.07537448406219482,
-0.01625468023121357,
0.03335566446185112,
-0.056375838816165924,
0.18237781524658203,
0.12507769465446472,
-0.12977680563926697,
0.1049404889345169,
0.01827867329120636,
-0.08628883957862854,
-0.05319243669509888,
-0.07945016026496887,
-0.052623916417360306,
-0.13344791531562805,
0.012452439405024052,
-0.09153499454259872,
0.00984385795891285,
0.10292689502239227,
0.031908176839351654,
-0.019528457894921303,
0.1209699958562851,
-0.0026699930895119905,
-0.05027869716286659,
0.03578002005815506,
0.03065560571849346,
0.02124735713005066,
0.12569114565849304,
0.013337436132133007,
0.06520786136388779,
-0.07854858040809631,
0.0763995349407196,
0.03122517094016075,
-0.009573223069310188,
0.0061014266684651375,
0.01196792721748352,
-0.008280503563582897,
-0.04586511477828026,
0.011864612810313702,
0.0757310539484024,
0.18534287810325623,
0.047693222761154175,
-0.04905012249946594,
-0.049897726625204086,
0.20629800856113434,
-0.05007096752524376,
-0.05118096247315407,
-0.10972660034894943,
0.1584908813238144,
0.04151301458477974,
0.018118226900696754,
0.01470495667308569,
-0.07467106729745865,
-0.030386891216039658,
0.21791109442710876,
0.05052993819117546,
-0.015849938616156578,
-0.029038796201348305,
0.0021760764066129923,
-0.00658829789608717,
-0.03987970948219299,
0.16187697649002075,
0.005564571358263493,
0.22096164524555206,
0.006019077729433775,
-0.00538819283246994,
-0.038647234439849854,
-0.0461009219288826,
-0.023020226508378983,
0.17235508561134338,
-0.0433102622628212,
0.026527779176831245,
-0.09766808152198792,
0.0037036456633359194,
0.004466727841645479,
-0.11854273080825806,
0.11869741231203079,
-0.1270328015089035,
-0.08451597392559052,
0.019793039187788963,
0.08029769361019135,
-0.03209695592522621,
0.053829990327358246,
-0.01896597631275654,
0.05643913149833679,
0.014522666111588478,
-0.03503028303384781,
-0.10507440567016602,
-0.15735171735286713,
0.049389634281396866,
0.006211740896105766,
0.1303110271692276,
0.006861856672912836,
0.08385320007801056,
0.09412390738725662,
0.016239985823631287,
-0.08254203200340271,
0.07782359421253204,
0.025501465424895287,
-0.017994249239563942,
0.051617544144392014,
0.1322636753320694,
-0.04687676206231117,
0.1356549859046936,
0.02151954174041748,
-0.000480388494906947,
-0.028718434274196625,
0.0004105885454919189,
0.010430796071887016,
-0.17217983305454254,
0.018957728520035744,
-0.07143403589725494,
0.11977027356624603,
0.19707882404327393,
-0.04187418147921562,
-0.0232028029859066,
-0.04714204743504524,
0.06955458968877792,
-0.01807337999343872,
0.08515509217977524,
0.00039401836693286896,
-0.16914547979831696,
0.007745327427983284,
0.03421664983034134,
0.01302234549075365,
-0.19883549213409424,
-0.05496151000261307,
-0.030225466936826706,
-0.01567898690700531,
-0.09875522553920746,
0.15437032282352448,
0.07081257551908493,
0.018169449642300606,
-0.02975313737988472,
-0.22307178378105164,
-0.02941369079053402,
0.04324174299836159,
-0.12588045001029968,
-0.11991629004478455
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the ruby function/method.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_code_documentation_generation_ruby_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/ruby/base_model.ipynb).
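Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. The following is a hedged sketch of the same generation step with the current API, assuming the checkpoint loads as a standard T5 sequence-to-sequence model; the `max_length` value is an illustrative choice, not part of the original recipe:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Same checkpoint as above.
model_name = "SEBIS/code_trans_t5_base_code_documentation_generation_ruby_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
inputs = tokenizer(tokenized_code, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)  # max_length is an assumption
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```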
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
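The quoted ~220M parameter count is easy to sanity-check once the weights are loaded. A quick sketch, expected to print roughly 220M for a t5-base-sized checkpoint:

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "SEBIS/code_trans_t5_base_code_documentation_generation_ruby_transfer_learning_finetune"
)
# Sum over all weight tensors; t5-base checkpoints are ~220M parameters.
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")
```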
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing ruby code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_base_code_documentation_generation_ruby_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09936226159334183,
0.05641235038638115,
-0.0012303452240303159,
0.11936061084270477,
0.047170497477054596,
0.024878187105059624,
0.046666838228702545,
0.11062204092741013,
-0.05556945875287056,
0.06041008606553078,
0.053351227194070816,
-0.04571954160928726,
0.06503932178020477,
0.16854943335056305,
0.00802221056073904,
-0.14220724999904633,
-0.04534033313393593,
0.02903488092124462,
-0.07521350681781769,
0.10946819931268692,
0.08290393650531769,
-0.08898895978927612,
0.0774410143494606,
-0.053098034113645554,
-0.13558784127235413,
0.044824909418821335,
-0.025680378079414368,
-0.004750253167003393,
0.09829708188772202,
0.06543570756912231,
0.12051327526569366,
-0.0183763075619936,
0.06649023294448853,
-0.21021594107151031,
0.0026993160136044025,
0.0284065380692482,
0.06611230224370956,
0.04621221497654915,
0.07146686315536499,
0.08100123703479767,
0.09845238924026489,
-0.028735361993312836,
0.042877428233623505,
0.05653927102684975,
-0.06351069360971451,
-0.05565549060702324,
-0.09166644513607025,
0.09804876148700714,
0.06895830482244492,
0.09015268832445145,
-0.004725164268165827,
0.05249916762113571,
-0.06361610442399979,
0.09371735155582428,
0.14171835780143738,
-0.25222891569137573,
-0.02504236437380314,
0.11889320611953735,
0.0763305127620697,
0.054205913096666336,
-0.07046877592802048,
-0.033204518258571625,
0.10580172389745712,
0.04162762686610222,
0.044316958636045456,
-0.08965497463941574,
-0.01926514506340027,
-0.0034559250343590975,
-0.060395874083042145,
-0.0492364726960659,
0.15850576758384705,
0.03380798175930977,
-0.06230833753943443,
-0.10194295644760132,
-0.03881020471453667,
-0.21005946397781372,
0.03360280394554138,
0.009557933546602726,
0.00960632599890232,
-0.007089819759130478,
-0.005362380761653185,
-0.00711815943941474,
-0.08148020505905151,
-0.1223042756319046,
0.038759052753448486,
0.024361595511436462,
0.06691979616880417,
0.03900795057415962,
-0.04359001666307449,
0.08604471385478973,
0.04828552156686783,
-0.03499385714530945,
-0.012491796165704727,
0.00019605027046054602,
-0.10999728739261627,
0.011917022988200188,
-0.013044701889157295,
-0.060993123799562454,
-0.004089429974555969,
0.0844244435429573,
-0.08806219696998596,
0.08650526404380798,
0.08610757440328598,
0.025031138211488724,
0.007341524586081505,
0.2089005410671234,
0.05837723985314369,
-0.1547912061214447,
0.025628743693232536,
0.0178474560379982,
-0.011109299026429653,
0.01568843238055706,
-0.04828409478068352,
-0.054539378732442856,
0.042686887085437775,
0.06678321957588196,
-0.12113670259714127,
0.024708136916160583,
-0.056972015649080276,
-0.014452122151851654,
0.08092552423477173,
-0.11486660689115524,
0.034760184586048126,
0.012341764755547047,
-0.061733677983284,
-0.03710499405860901,
0.06705182790756226,
-0.12220194190740585,
-0.11438677459955215,
0.036946021020412445,
-0.03168391436338425,
-0.0347253754734993,
-0.12048088759183884,
-0.114047110080719,
-0.021802974864840508,
-0.03202614188194275,
0.0029047203715890646,
-0.10507475584745407,
-0.10171059519052505,
-0.023503603413701057,
0.03513488546013832,
-0.008596819825470448,
-0.03366803005337715,
-0.03878689184784889,
0.010074751451611519,
-0.009152999147772789,
-0.01308218389749527,
0.01892983727157116,
-0.019698889926075935,
0.09528177976608276,
0.08165764063596725,
0.03477208688855171,
-0.001698188134469092,
0.021610695868730545,
-0.08167248219251633,
0.06985446065664291,
-0.10159610211849213,
0.07346245646476746,
-0.00761814508587122,
0.05797778442502022,
-0.11263089627027512,
-0.0805211216211319,
0.005610621068626642,
0.04644623398780823,
0.0793319046497345,
0.04529058188199997,
-0.15865223109722137,
0.03512437269091606,
0.15840359032154083,
-0.1131955161690712,
-0.1302087903022766,
0.11285987496376038,
-0.01659328117966652,
0.01706082746386528,
0.06892293691635132,
0.1261148750782013,
0.14247672259807587,
-0.06853767484426498,
-0.042340345680713654,
0.06678453087806702,
0.045951396226882935,
-0.06514965742826462,
0.06668440252542496,
0.0279526486992836,
-0.03649776428937912,
0.02114962972700596,
0.06270574778318405,
0.03970666229724884,
-0.006544079631567001,
-0.03433234617114067,
-0.0398654080927372,
-0.10009093582630157,
-0.03213623911142349,
-0.010193102061748505,
0.03333338722586632,
-0.051892489194869995,
-0.053710728883743286,
-0.04095127433538437,
0.16731292009353638,
-0.08437727391719818,
0.029512479901313782,
-0.08498706668615341,
-0.03797515481710434,
-0.05256868526339531,
0.01844291388988495,
-0.13729162514209747,
0.06577662378549576,
0.07225583493709564,
-0.003493786556646228,
0.06365139782428741,
0.08197034895420074,
0.009896575473248959,
0.013492058962583542,
-0.0633823350071907,
-0.049256831407547,
-0.03324412927031517,
-0.08105295896530151,
-0.11283041536808014,
-0.03786129876971245,
-0.08849934488534927,
-0.030981779098510742,
-0.04695663973689079,
-0.18298451602458954,
-0.0011558285914361477,
-0.00223827944137156,
0.03310509771108627,
0.04108845070004463,
-0.03667394071817398,
0.02046641707420349,
0.05348319932818413,
-0.04327063634991646,
-0.07565683871507645,
0.02062235400080681,
0.046286098659038544,
-0.08124437183141708,
-0.04995938017964363,
-0.062015168368816376,
-0.08521202951669693,
0.07152944058179855,
0.10110602527856827,
-0.1330358237028122,
-0.024488838389515877,
-0.022534657269716263,
-0.03447730094194412,
-0.04205573722720146,
-0.03983701393008232,
0.20293095707893372,
0.017955714836716652,
0.15941840410232544,
-0.13371075689792633,
-0.055998794734478,
-0.027713323011994362,
0.005750682670623064,
0.04177074134349823,
0.15259015560150146,
0.014493304304778576,
-0.13105608522891998,
0.028761757537722588,
-0.06448118388652802,
-0.0542624294757843,
0.1373520791530609,
-0.020816536620259285,
-0.048798978328704834,
-0.0020677740685641766,
0.10859637707471848,
0.010195179842412472,
0.17482790350914001,
-0.020373322069644928,
0.0023225515615195036,
-0.0080244280397892,
0.004003453534096479,
0.03778442367911339,
-0.13001346588134766,
0.03215113282203674,
0.038801733404397964,
-0.04771902784705162,
-0.022997500374913216,
-0.028570547699928284,
-0.04052296280860901,
0.04208940640091896,
0.01668996922671795,
0.03983866050839424,
-0.020832741633057594,
-0.031248481944203377,
-0.10385989397764206,
0.17565588653087616,
-0.07307795435190201,
-0.18664826452732086,
-0.1558961719274521,
0.08999177068471909,
-0.01808512769639492,
-0.0231351125985384,
0.029035314917564392,
-0.10057391971349716,
-0.04247410595417023,
-0.09675228595733643,
0.11942175775766373,
-0.1278616040945053,
0.009094360284507275,
-0.022686876356601715,
0.06938688457012177,
0.04462626203894615,
-0.15652036666870117,
0.03068036213517189,
-0.021020837128162384,
0.020617665722966194,
0.00020593326189555228,
-0.0600280836224556,
0.07059620320796967,
0.10406826436519623,
-0.07300078123807907,
0.021510353311896324,
-0.014745093882083893,
0.18066193163394928,
-0.05204755440354347,
0.03795000538229942,
0.17753303050994873,
0.008966061286628246,
0.035116974264383316,
0.06207253411412239,
0.012495240196585655,
-0.08846742659807205,
0.06660827249288559,
0.033917978405952454,
-0.024139748886227608,
-0.2297814041376114,
-0.022302750498056412,
-0.0763893723487854,
0.07452549785375595,
0.11996205151081085,
0.0498974546790123,
-0.16245071589946747,
0.027295809239149094,
-0.0034677735529839993,
0.16597680747509003,
-0.03171613812446594,
0.06559979170560837,
-0.025627803057432175,
0.02312466688454151,
0.0037238472141325474,
-0.10490675270557404,
0.0028490640688687563,
0.07514496147632599,
0.10348735749721527,
0.21396061778068542,
-0.08895646780729294,
0.13878731429576874,
0.0017366142710670829,
0.1189340129494667,
0.059491608291864395,
0.10586266964673996,
-0.13347694277763367,
0.010283676907420158,
-0.0028917714953422546,
-0.012346711941063404,
-0.08163201063871384,
0.04608119651675224,
-0.03558846935629845,
0.06252233684062958,
-0.05387919768691063,
0.022102907299995422,
0.01625409536063671,
0.20570391416549683,
0.06790406256914139,
-0.15663935244083405,
-0.11689336597919464,
0.0012753086630254984,
-0.07882261276245117,
-0.09512222558259964,
0.0678698942065239,
0.18125340342521667,
-0.06384043395519257,
0.02127978764474392,
-0.025537529960274696,
0.13394439220428467,
-0.11693892627954483,
-0.02306411974132061,
0.021554013714194298,
0.05698710307478905,
0.0022382892202585936,
0.10683518648147583,
-0.2719099819660187,
0.08358103781938553,
0.01715879514813423,
0.08935663849115372,
-0.015406576916575432,
0.06040972098708153,
-0.04719291999936104,
0.003932262305170298,
0.07943087071180344,
0.010709601454436779,
-0.08176105469465256,
-0.19257201254367828,
-0.028964033350348473,
0.01713506691157818,
0.07124148309230804,
0.0006811736384406686,
0.09203432500362396,
-0.030980991199612617,
0.04816007241606712,
-0.03302575275301933,
-0.12675940990447998,
-0.0782308429479599,
-0.13665562868118286,
-0.040685977786779404,
-0.025786759331822395,
-0.06260242313146591,
-0.0291550625115633,
0.05221250280737877,
0.049210187047719955,
0.1997506469488144,
-0.16390080749988556,
-0.06331625580787659,
-0.08831710368394852,
0.06743840128183365,
0.1196301132440567,
-0.09270040690898895,
0.020021837204694748,
0.016477182507514954,
0.05408778414130211,
-0.04709239304065704,
-0.06932312250137329,
0.01985304430127144,
-0.05909200757741928,
-0.09192320704460144,
-0.0364355742931366,
0.09075719118118286,
-0.01902111805975437,
0.05346531793475151,
0.00813612062484026,
-0.07138296216726303,
-0.03767133504152298,
-0.1171361654996872,
-0.06357544660568237,
-0.0414666123688221,
0.020481325685977936,
0.0021490994840860367,
-0.11774635314941406,
0.05616786703467369,
-0.022966476157307625,
-0.09292061626911163,
0.09132636338472366,
0.18377752602100372,
-0.08299601078033447,
0.018540576100349426,
0.06041410565376282,
-0.055775970220565796,
-0.18852324783802032,
-0.0426984578371048,
0.053510475903749466,
0.06962116807699203,
-0.013897119089961052,
-0.15156559646129608,
0.047231417149305344,
-0.025711631402373314,
0.0162627175450325,
-0.014416852965950966,
-0.24703972041606903,
-0.11833139508962631,
-0.003083133604377508,
0.072543203830719,
0.048267245292663574,
-0.09263744205236435,
-0.04960327968001366,
-0.06671229004859924,
-0.025773942470550537,
0.043276309967041016,
0.08010008186101913,
0.10719089955091476,
-0.03955037519335747,
0.017834732308983803,
0.04685383290052414,
-0.026948831975460052,
0.04291065037250519,
-0.03957942873239517,
0.11354217678308487,
-0.00643755029886961,
-0.016034677624702454,
0.03595571592450142,
-0.055721987038850784,
0.15669134259223938,
-0.1863052397966385,
0.11935614049434662,
-0.17758919298648834,
-0.0378175713121891,
-0.010837112553417683,
-0.017759235575795174,
-0.03626527264714241,
-0.04723555967211723,
-0.12368188053369522,
0.045498497784137726,
0.05690281465649605,
-0.02848738431930542,
0.03342922776937485,
-0.002117230324074626,
-0.06113114580512047,
0.08930689096450806,
0.07233580946922302,
0.04885700345039368,
-0.12478531152009964,
0.02802908420562744,
0.018010420724749565,
0.0885966569185257,
-0.18434281647205353,
0.022756999358534813,
0.10390997678041458,
0.022341223433613777,
0.09686607122421265,
0.011626003310084343,
-0.08830700069665909,
0.02441599778831005,
0.06746163219213486,
-0.05993802845478058,
-0.10172602534294128,
-0.013331201858818531,
0.010588474571704865,
-0.08734387159347534,
0.03885508328676224,
0.08943109959363937,
-0.06663717329502106,
-0.01936577819287777,
-0.0059590269811451435,
0.014049417339265347,
-0.07823298871517181,
0.1777644157409668,
0.019122019410133362,
0.08333364874124527,
-0.056509099900722504,
0.08161812275648117,
0.09552659839391708,
-0.07287599891424179,
0.028285833075642586,
0.14568597078323364,
-0.0837244838476181,
-0.021935945376753807,
0.11557365953922272,
0.14487217366695404,
-0.012831238098442554,
-0.04636311158537865,
-0.10506986081600189,
-0.07508993148803711,
0.0129086347296834,
0.022258246317505836,
0.07233264297246933,
0.07275372743606567,
-0.03286485746502876,
-0.006441221572458744,
-0.11803890764713287,
0.09508288651704788,
0.0836087241768837,
0.05239211022853851,
-0.14908190071582794,
0.13371844589710236,
0.03074834495782852,
0.0689023956656456,
0.00312375882640481,
0.041877631098032,
-0.11325492709875107,
0.03515153378248215,
-0.003197177778929472,
0.049596384167671204,
0.02419508807361126,
0.04988199099898338,
-0.037527669221162796,
0.05021083354949951,
-0.02751854807138443,
0.043727047741413116,
-0.043387118726968765,
-0.02420561946928501,
-0.034821026027202606,
0.02999294176697731,
-0.047078635543584824,
-0.020952560007572174,
0.014953727833926678,
-0.08192052692174911,
0.09118626266717911,
-0.06434939056634903,
-0.012249735184013844,
0.0011282479390501976,
0.03447386622428894,
0.05395421385765076,
0.00168329244479537,
0.05344266816973686,
-0.018457463011145592,
0.002967460546642542,
0.02421703189611435,
0.00598698016256094,
-0.0166595671325922,
-0.0010167709551751614,
0.10732374340295792,
-0.1369914710521698,
-0.07807335257530212,
-0.10076719522476196,
-0.06475132703781128,
-0.06131112575531006,
0.08752622455358505,
0.0759330466389656,
0.09083130955696106,
0.09619265794754028,
-0.0410291850566864,
0.013427878729999065,
-0.15081290900707245,
-0.03549240902066231,
0.05532175675034523,
-0.015468357130885124,
-0.13446027040481567,
-0.049784671515226364,
0.06311620026826859,
-0.026605794206261635,
0.11284057796001434,
0.002403859281912446,
0.012427386827766895,
-0.017919141799211502,
-0.048548780381679535,
-0.07214102149009705,
0.00399074936285615,
0.1761600524187088,
-0.10379743576049805,
0.0036703646183013916,
-0.007530563045293093,
0.005722720175981522,
0.01770690642297268,
0.1868162602186203,
0.12130331993103027,
0.16617421805858612,
0.03206793963909149,
0.06623253971338272,
-0.04724007472395897,
-0.027879957109689713,
-0.09565605968236923,
0.07266177982091904,
-0.027547914534807205,
0.030490899458527565,
-0.04979807510972023,
0.18826065957546234,
0.08310974389314651,
-0.1314050555229187,
0.1067531481385231,
-0.0008435228955931962,
-0.0873878225684166,
-0.038988709449768066,
-0.07257742434740067,
-0.04952123016119003,
-0.11525988578796387,
0.006195286754518747,
-0.0998905599117279,
-0.0030590782407671213,
0.0590045265853405,
0.02233981527388096,
-0.02265014871954918,
0.12733852863311768,
-0.04438666254281998,
-0.04786524176597595,
0.04516816511750221,
0.03784382343292236,
0.005978990811854601,
0.09373003244400024,
0.02029610425233841,
0.058362096548080444,
-0.07728967815637589,
0.07588838040828705,
0.03007001057267189,
-0.016352079808712006,
0.013364131562411785,
0.0449262373149395,
-0.015552341938018799,
-0.03850822150707245,
-0.020845795050263405,
0.08185960352420807,
0.17126484215259552,
0.035474829375743866,
-0.030722081661224365,
-0.05844901129603386,
0.20615284144878387,
-0.058502957224845886,
-0.05493541061878204,
-0.11461261659860611,
0.1560683250427246,
0.04200168699026108,
0.017486318945884705,
0.02317034639418125,
-0.08100900053977966,
-0.016602128744125366,
0.24160124361515045,
0.06596499681472778,
-0.045530080795288086,
-0.025459738448262215,
0.006653675809502602,
-0.004094691481441259,
-0.04367629066109657,
0.1464923918247223,
0.00648842565715313,
0.2003660798072815,
0.0028091799467802048,
0.005227000918239355,
-0.04291681945323944,
-0.04882633313536644,
0.00578042957931757,
0.19458559155464172,
-0.024568790569901466,
0.024675482884049416,
-0.10121024399995804,
-0.005930433515459299,
0.0034088159445673227,
-0.16540324687957764,
0.12309350818395615,
-0.14082071185112,
-0.0747184157371521,
0.013729900121688843,
0.06995774060487747,
-0.04402446001768112,
0.0582454577088356,
-0.01973172090947628,
0.07194805890321732,
0.025990759953856468,
-0.023797396570444107,
-0.09295473247766495,
-0.14465630054473877,
0.05145733430981636,
-0.0243088211864233,
0.12112953513860703,
0.011388091370463371,
0.09177181869745255,
0.08972779661417007,
0.00518985278904438,
-0.08559349924325943,
0.06722203642129898,
0.023477422073483467,
-0.006110933609306812,
0.04840518534183502,
0.13507120311260223,
-0.04501114785671234,
0.1566072404384613,
0.012774970382452011,
-0.022751890122890472,
-0.02105412445962429,
-0.011013932526111603,
-0.004304706584662199,
-0.16239199042320251,
0.015396510250866413,
-0.06309995800256729,
0.13575811684131622,
0.19672317802906036,
-0.04549727961421013,
-0.004492092877626419,
-0.04843619465827942,
0.07581037282943726,
-0.00637218588963151,
0.08771383762359619,
0.00353122316300869,
-0.16538377106189728,
0.007671186700463295,
-0.006661264691501856,
0.010826697573065758,
-0.19548656046390533,
-0.05436699092388153,
-0.0429023802280426,
-0.03166569396853447,
-0.10403791069984436,
0.1494446098804474,
0.08411125838756561,
0.025907019153237343,
-0.035766761749982834,
-0.1695692092180252,
-0.024401184171438217,
0.03856028616428375,
-0.11310208588838577,
-0.11476600915193558
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commits: it works best with tokenized git commits.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the Git Commit Message Generation dataset.
## Intended uses & limitations
The model could be used to generate the git commit message for the git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the change is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate git commit message using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_commit_generation"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_commit_generation", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/commit%20generation/base_model.ipynb).
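The example input above is whitespace-tokenized, with punctuation and path separators split into separate tokens. If you start from a raw diff header, a rough pre-tokenization in the same spirit might look like the sketch below; this approximates the style of the example and is not the exact preprocessing used to build the training corpus:

```python
import re

def rough_tokenize(diff_text):
    # Crude split: keep runs of word characters and emit each punctuation
    # mark as its own token, approximating the spaced-out example above.
    return " ".join(re.findall(r"\w+|[^\w\s]", diff_text))

raw = "new file mode 100644 index 000000000..892fda21b Binary files /dev/null and b/src/plugins/gateway/lib/joscar.jar differ"
print(rough_tokenize(raw))
# -> new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ
```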
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the git commit message generation task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_base_commit_generation
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized git commits: it works best with tokenized git commits.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the Git Commit Message Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the git commit message for the git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the change is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate git commit message using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
114
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08367690443992615,
-0.011224727146327496,
-0.0013760110596194863,
0.027825066819787025,
0.15752777457237244,
0.008162670768797398,
0.10961870849132538,
0.08880123496055603,
0.03244130685925484,
-0.03982772305607796,
0.10039550811052322,
0.13547226786613464,
0.08249787241220474,
0.1757243573665619,
-0.011036526411771774,
-0.20749536156654358,
0.011975985020399094,
0.048911262303590775,
-0.07005131989717484,
0.13753749430179596,
0.13591472804546356,
-0.049010131508111954,
0.11676188558340073,
0.022014686837792397,
-0.20887267589569092,
0.029920345172286034,
-0.0011670624371618032,
-0.07493960857391357,
0.11421745270490646,
0.07794037461280823,
0.09037811309099197,
0.0642307847738266,
-0.015363178215920925,
-0.18069110810756683,
0.0343790277838707,
0.015733927488327026,
-0.02200421132147312,
0.030829010531306267,
-0.0007627108716405928,
-0.05519642308354378,
0.13863052427768707,
-0.03328347206115723,
-0.023216530680656433,
0.03677348420023918,
-0.1289716213941574,
0.07754312455654144,
-0.04396216943860054,
0.01172732561826706,
0.10323819518089294,
0.09075195342302322,
-0.003800070146098733,
0.11204589158296585,
-0.11071714758872986,
0.106952004134655,
0.18043038249015808,
-0.14462217688560486,
-0.015390202403068542,
0.15340638160705566,
0.0651751235127449,
0.02423004060983658,
-0.031667958945035934,
0.028300520032644272,
0.06060633435845375,
0.021029269322752953,
0.011229691095650196,
-0.08128442615270615,
-0.021065739914774895,
0.07403498888015747,
-0.1246660128235817,
-0.09302543848752975,
0.18880683183670044,
-0.024818124249577522,
-0.06449481099843979,
-0.07482515275478363,
-0.02803649567067623,
-0.055896565318107605,
-0.00042338905041106045,
0.03292836993932724,
-0.012562528252601624,
0.02239517867565155,
-0.10455730557441711,
-0.06270141154527664,
-0.12130130082368851,
-0.11666294187307358,
-0.04205981269478798,
0.09277022629976273,
0.012914113700389862,
0.037500035017728806,
-0.13757896423339844,
0.12285476922988892,
-0.04557338356971741,
-0.05575459823012352,
-0.030041374266147614,
-0.06567218899726868,
-0.04416266083717346,
-0.03761855140328407,
-0.06659678369760513,
-0.22917161881923676,
0.13702523708343506,
0.05809182673692703,
-0.07013816386461258,
0.03842891752719879,
0.014342543669044971,
0.053729090839624405,
0.10411590337753296,
0.21367903053760529,
-0.06673025339841843,
-0.0012706969864666462,
0.07736864686012268,
-0.03317539766430855,
-0.07697711139917374,
-0.0031046399381011724,
-0.09911423176527023,
-0.018744269385933876,
0.03619891032576561,
0.11579043418169022,
-0.07908577471971512,
0.10984252393245697,
-0.04997926205396652,
-0.03467177599668503,
0.024925025179982185,
-0.0960032045841217,
-0.05217140540480614,
-0.030414612963795662,
-0.06769706308841705,
-0.040194541215896606,
0.06204742193222046,
-0.020378082990646362,
-0.08549421280622482,
-0.03211631625890732,
-0.07284300029277802,
-0.02998972311615944,
-0.05708809569478035,
-0.10236044973134995,
0.010452677495777607,
-0.025035789236426353,
0.03486577048897743,
-0.13950283825397491,
-0.16813117265701294,
-0.0015713792527094483,
0.03581623360514641,
0.014216776005923748,
0.03592326119542122,
-0.04888816550374031,
0.0005723895155824721,
-0.02268511988222599,
-0.010224701836705208,
-0.05669930949807167,
-0.08719043433666229,
0.07726360857486725,
0.020718343555927277,
0.048290371894836426,
-0.05421515181660652,
0.0002925475127995014,
-0.12571126222610474,
0.04733733460307121,
-0.23039354383945465,
0.10474330931901932,
-0.0687050148844719,
0.17770224809646606,
-0.1301625370979309,
-0.029651766642928123,
0.06518877297639847,
0.029920848086476326,
0.08491981029510498,
0.16516177356243134,
-0.04353299364447594,
-0.07942265272140503,
0.1369006633758545,
-0.10108330100774765,
-0.1689728945493698,
0.11854202300310135,
-0.041839197278022766,
0.15259972214698792,
0.12328336387872696,
0.18653014302253723,
0.1490342617034912,
-0.11381164193153381,
0.009097574278712273,
0.09351866692304611,
-0.0895732119679451,
-0.018329165875911713,
0.033719006925821304,
0.07676980644464493,
-0.14074254035949707,
0.036462441086769104,
0.00727482233196497,
0.14081543684005737,
-0.07235219329595566,
-0.012903153896331787,
-0.027232013642787933,
-0.06355167180299759,
-0.04635733738541603,
-0.03317698836326599,
0.07722347974777222,
-0.010926572605967522,
-0.011403840966522694,
-0.027064742520451546,
0.08624488860368729,
-0.08943497389554977,
0.02399113029241562,
-0.11488377302885056,
0.056281451135873795,
-0.12296175211668015,
0.04544241353869438,
-0.11747544258832932,
-0.028270971029996872,
0.026438862085342407,
0.01521276868879795,
0.09531150013208389,
-0.03353404998779297,
0.02496224455535412,
-0.026362575590610504,
-0.009031442925333977,
-0.026547851040959358,
0.050501685589551926,
-0.02504451386630535,
-0.0735352411866188,
-0.10529258847236633,
-0.009992978535592556,
-0.026632122695446014,
0.016102192923426628,
-0.061838023364543915,
0.025060517713427544,
0.16274422407150269,
0.07747209072113037,
0.0026401870418339968,
-0.0013754005776718259,
0.05635271966457367,
0.000769572623539716,
-0.04325954616069794,
-0.06205001845955849,
0.012708166614174843,
0.027268165722489357,
-0.17078278958797455,
0.11174405366182327,
0.012349452823400497,
0.020367873832583427,
0.12321378290653229,
-0.06608559936285019,
-0.06224288418889046,
-0.09151487797498703,
-0.04020719230175018,
-0.003529265522956848,
0.012991705909371376,
-0.05176452174782753,
0.1800336092710495,
-0.015470676124095917,
0.1209413930773735,
-0.08473680913448334,
-0.026824185624718666,
-0.007464293856173754,
-0.02379455789923668,
0.015382660552859306,
0.1105770692229271,
0.040180694311857224,
-0.2587492763996124,
0.07111047208309174,
-0.019946785643696785,
0.010002008639276028,
0.19682639837265015,
-0.0232497900724411,
-0.03511546552181244,
-0.023100780323147774,
0.06858161836862564,
-0.017205379903316498,
0.1083938404917717,
-0.20374493300914764,
-0.05798196792602539,
0.0049155778251588345,
0.015581171959638596,
0.10925253480672836,
-0.12801925837993622,
-0.028302157297730446,
0.01748705469071865,
-0.04503677785396576,
-0.11661069840192795,
0.041119541972875595,
-0.007224045693874359,
0.03170355409383774,
0.04155557230114937,
0.06955141574144363,
0.06316971778869629,
-0.016421400010585785,
-0.09575524926185608,
0.22232769429683685,
-0.07864376157522202,
-0.31864500045776367,
-0.1448410600423813,
-0.0390712209045887,
-0.037283945828676224,
0.028336508199572563,
0.06481996178627014,
-0.15757881104946136,
-0.029294976964592934,
-0.02161167375743389,
0.20056843757629395,
-0.051743049174547195,
0.06415289640426636,
-0.06616941094398499,
0.04476580396294594,
-0.042112864553928375,
-0.16240504384040833,
-0.01778103969991207,
-0.00011769791308324784,
-0.02829822711646557,
0.0477091521024704,
-0.19522278010845184,
0.08851619064807892,
0.1005852073431015,
0.0006182213546708226,
0.07663270086050034,
-0.061468206346035004,
0.32919052243232727,
-0.1254148781299591,
-0.0256236232817173,
0.13534881174564362,
-0.025533664971590042,
0.02852095104753971,
0.05519156903028488,
-0.0034982801880687475,
-0.10629688948392868,
0.05323570966720581,
-0.04022420942783356,
-0.08985640108585358,
-0.24830254912376404,
-0.05863582715392113,
-0.062351249158382416,
0.13247716426849365,
0.03336544334888458,
0.044525232166051865,
0.0064160446636378765,
0.09199880063533783,
0.027531534433364868,
0.11811252683401108,
0.006708458531647921,
0.09409646689891815,
-0.05859334394335747,
-0.016134105622768402,
0.023220006376504898,
-0.052478231489658356,
-0.037236377596855164,
0.09429741650819778,
0.09638488292694092,
0.13706591725349426,
-0.021520113572478294,
0.17615877091884613,
0.051555879414081573,
0.06200369447469711,
0.047148704528808594,
0.11216343939304352,
-0.11442528665065765,
-0.018575657159090042,
0.006957256235182285,
-0.018211010843515396,
-0.10254675894975662,
0.06348057836294174,
0.007374170236289501,
-0.05001845583319664,
-0.11244622617959976,
-0.06293150782585144,
0.1131359338760376,
0.12230506539344788,
0.03223016858100891,
-0.235747829079628,
-0.11709984391927719,
0.019817311316728592,
-0.08610191196203232,
-0.057992126792669296,
0.04375471547245979,
0.09614568948745728,
-0.10067164897918701,
0.02217283472418785,
-0.03596247360110283,
0.14187923073768616,
-0.14668597280979156,
0.022257035598158836,
-0.06547984480857849,
0.01979539729654789,
-0.07738453894853592,
0.13604234158992767,
-0.2365654557943344,
0.15233290195465088,
0.009526730515062809,
0.00022103864466771483,
-0.09919141978025436,
0.015859799459576607,
-0.0018999867606908083,
0.14217832684516907,
0.1122584342956543,
0.004107311367988586,
-0.02853405848145485,
-0.15204329788684845,
-0.01830962300300598,
0.06208118051290512,
0.0815623477101326,
-0.09786263853311539,
0.06576652824878693,
0.010146877728402615,
0.01950472965836525,
-0.014485232532024384,
-0.03185005486011505,
-0.10043349862098694,
-0.11003637313842773,
0.048661280423402786,
0.009258205071091652,
0.10852988064289093,
-0.05454853177070618,
0.018753638491034508,
0.06870138645172119,
0.25492286682128906,
-0.024753129109740257,
-0.1138538047671318,
-0.13313153386116028,
0.07309871912002563,
0.10287291556596756,
-0.07334031164646149,
0.021672867238521576,
-0.019580919295549393,
0.057731907814741135,
-0.02101149410009384,
-0.10600380599498749,
0.09840323030948639,
-0.07178999483585358,
-0.011960912495851517,
-0.0200301855802536,
0.14574198424816132,
0.047719310969114304,
-0.008802524767816067,
0.03710267320275307,
-0.0902845486998558,
-0.06805593520402908,
-0.12871389091014862,
-0.04982593655586243,
-0.051818277686834335,
0.019034791737794876,
0.11448731273412704,
-0.017625605687499046,
0.06866446882486343,
-0.06959604471921921,
-0.0640854462981224,
0.18219178915023804,
0.11596264690160751,
0.024992672726511955,
0.02268054336309433,
0.12441378086805344,
-0.07201319932937622,
-0.23942644894123077,
0.025227220728993416,
-0.02535972371697426,
0.0504312701523304,
-0.09919015318155289,
-0.17028462886810303,
0.01948106475174427,
-0.034785766154527664,
0.026657478883862495,
0.0007174781640060246,
-0.28549516201019287,
-0.09599313884973526,
0.06362206488847733,
0.08591030538082123,
0.17832441627979279,
-0.1282079815864563,
-0.06599538773298264,
-0.09580086171627045,
-0.12195929884910583,
0.17448915541172028,
-0.1275881975889206,
0.1156550869345665,
0.00558518385514617,
0.09175658971071243,
0.03455542027950287,
-0.0640888512134552,
0.0922708809375763,
0.01778460666537285,
0.049081940203905106,
-0.033169861882925034,
-0.0771535336971283,
0.09642694890499115,
-0.03884893283247948,
0.10483774542808533,
-0.0984310731291771,
0.09377054125070572,
-0.11902549862861633,
-0.049037858843803406,
-0.07342833280563354,
0.05587079003453255,
0.0331101156771183,
-0.10017795115709305,
-0.14358671009540558,
0.029000623151659966,
0.012882214039564133,
0.02739282324910164,
0.1687764823436737,
-0.05514209344983101,
-0.00282480800524354,
0.08577650785446167,
0.06930051743984222,
-0.08481265604496002,
0.0019502625800669193,
0.054103922098875046,
-0.00302473665215075,
0.09993276000022888,
-0.27385106682777405,
0.06759057193994522,
0.08452988415956497,
0.016080494970083237,
0.14983230829238892,
0.0738237127661705,
-0.0893520638346672,
0.01956699788570404,
0.09586817026138306,
-0.1531781703233719,
-0.03066597320139408,
-0.07572067528963089,
-0.02286091446876526,
0.03411215543746948,
0.12647901475429535,
0.19699299335479736,
-0.03932284191250801,
-0.015408222563564777,
-0.029040895402431488,
0.002710748231038451,
-0.12742647528648376,
0.15044981241226196,
0.021863507106900215,
0.02066139504313469,
-0.10924611240625381,
0.049007706344127655,
0.030330289155244827,
-0.10303342342376709,
-0.022551096975803375,
0.1014520451426506,
-0.14077897369861603,
-0.0788818895816803,
-0.05387137085199356,
0.12727434933185577,
-0.1287747621536255,
-0.035438161343336105,
-0.10440023988485336,
-0.05037909746170044,
0.0036393830087035894,
0.19268952310085297,
0.0657939687371254,
0.09405320882797241,
-0.006928741931915283,
0.044581297785043716,
-0.08384182304143906,
0.010372337885200977,
0.06914100795984268,
0.037968218326568604,
-0.11141347885131836,
0.09505090862512589,
0.00021551619283854961,
0.14527368545532227,
-0.06126794591546059,
-0.020801249891519547,
-0.2062264233827591,
0.07527562230825424,
-0.09615851938724518,
0.012595970183610916,
-0.06410945951938629,
-0.006161185447126627,
0.013548451475799084,
-0.03657487407326698,
-0.02461441233754158,
0.03404480963945389,
-0.08666980266571045,
0.023159323260188103,
-0.010607807897031307,
0.04909622296690941,
-0.053550783544778824,
-0.0019082508515566587,
0.08840051293373108,
-0.06966356188058853,
0.16732753813266754,
0.061804141849279404,
-0.05813510715961456,
0.07738445699214935,
-0.162855327129364,
0.01056225597858429,
0.05565103515982628,
-0.021028192713856697,
0.03928820416331291,
-0.013791745528578758,
0.03827446699142456,
0.010807584971189499,
-0.006640161853283644,
0.010318849235773087,
0.07508422434329987,
-0.11055336892604828,
-0.06636371463537216,
0.02235502563416958,
-0.04620486497879028,
-0.04243862256407738,
0.019437067210674286,
0.06642891466617584,
0.08332819491624832,
0.04585300758481026,
-0.014400159008800983,
0.037237443029880524,
-0.08667335659265518,
-0.018530935049057007,
0.06842465698719025,
-0.10377601534128189,
0.03877165541052818,
-0.07630682736635208,
0.05190933123230934,
-0.056232742965221405,
0.2476428598165512,
0.010076934471726418,
0.09888097643852234,
-0.0023160423152148724,
0.09748879075050354,
0.10384082794189453,
0.06044713035225868,
0.19097542762756348,
-0.01764407753944397,
0.020689334720373154,
-0.09764055162668228,
0.045583680272102356,
0.05932276323437691,
-0.03866953030228615,
0.016768749803304672,
-0.015746954828500748,
0.0612085722386837,
0.08487148582935333,
0.011332298628985882,
0.08097763359546661,
-0.056735534220933914,
-0.1251714825630188,
0.0054282816126942635,
0.04980159550905228,
-0.023555677384138107,
0.11071198433637619,
0.10038138926029205,
-0.07512504607439041,
0.07096168398857117,
0.04365159943699837,
-0.10732629895210266,
-0.022197412326931953,
-0.11272822320461273,
-0.041822295635938644,
-0.19163839519023895,
0.044874805957078934,
-0.11834272742271423,
-0.004640371538698673,
0.0521123930811882,
0.06578721851110458,
-0.026316314935684204,
0.23571760952472687,
0.0625874251127243,
-0.09929864853620529,
0.09548068791627884,
-0.0126882866024971,
-0.018541594967246056,
0.04622414708137512,
0.014824631623923779,
0.01744534820318222,
0.0005861973040737212,
0.020532768219709396,
0.032514024525880814,
0.0015061218291521072,
0.02625378407537937,
-0.05012794956564903,
-0.04954473674297333,
-0.06479883939027786,
0.03624047338962555,
-0.050126563757658005,
0.015453685075044632,
-0.00865135621279478,
-0.030154529958963394,
0.005318440962582827,
0.22787559032440186,
-0.05974583327770233,
-0.041987478733062744,
-0.14551042020320892,
0.18181255459785461,
0.00821626279503107,
0.04936199635267258,
-0.0203553494066,
-0.08844157308340073,
-0.050299085676670074,
0.33511537313461304,
0.1758851408958435,
-0.07003290206193924,
-0.013416806235909462,
-0.01617337390780449,
0.009900013916194439,
0.03465943783521652,
0.09963095188140869,
0.008376795798540115,
0.17009539902210236,
-0.024131260812282562,
0.018209459260106087,
-0.02443755604326725,
-0.06694041937589645,
0.046763550490140915,
0.14198525249958038,
0.03482041880488396,
-0.02962365746498108,
-0.08412469923496246,
0.10199003666639328,
-0.19127699732780457,
-0.17286857962608337,
-0.00607152096927166,
-0.13025438785552979,
-0.05554448813199997,
-0.0370149165391922,
0.08499488234519958,
-0.029680438339710236,
0.0926528126001358,
-0.02857959270477295,
0.016535019502043724,
-0.016983982175588608,
0.03555118292570114,
-0.1438804417848587,
-0.11648868024349213,
0.09071414172649384,
-0.021722637116909027,
0.0969286784529686,
-0.03324344381690025,
0.09688082337379456,
0.10641578584909439,
-0.009130498394370079,
-0.0474458709359169,
0.008198882453143597,
0.02866271883249283,
-0.05283442512154579,
-0.00040691037429496646,
0.12354155629873276,
-0.028433259576559067,
0.078672856092453,
-0.027868209406733513,
-0.14844736456871033,
0.010946705937385559,
0.0019256984815001488,
0.04740244150161743,
-0.1437571495771408,
-0.03499029204249382,
-0.1088709831237793,
0.07480596005916595,
0.12581069767475128,
-0.04679220914840698,
0.06006934121251106,
-0.08766239136457443,
0.09438538551330566,
0.0009363985736854374,
-0.020442171022295952,
-0.06434491276741028,
-0.1705738753080368,
-0.05406275764107704,
0.12682844698429108,
0.0071241362020373344,
-0.20266056060791016,
0.009796518832445145,
-0.036507993936538696,
0.019813580438494682,
-0.05679332837462425,
0.1371651291847229,
0.11782880872488022,
0.023681575432419777,
-0.01267656497657299,
-0.05609683319926262,
0.017102327197790146,
0.029797954484820366,
-0.10409560799598694,
-0.13849687576293945
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commits: it works best with tokenized git commits.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate a commit message for git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
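As a hedged illustration of what "tokenized" means here (the project's actual preprocessing is not shown in this card), a naive splitter that reproduces the style of the `tokenized_code` example used below:

```python
import re

# Illustrative only: the card's examples suggest diffs are tokenized by
# separating punctuation from identifiers with spaces. This helper is an
# assumption for demonstration, not CodeTrans's real preprocessing pipeline.
def naive_tokenize(diff: str) -> str:
    return " ".join(re.findall(r"\w+|[^\w\s]", diff))

print(naive_tokenize("Binary files /dev/null and b/src/plugins/gateway/lib/joscar.jar differ"))
# -> "Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
```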
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/commit%20generation/base_model.ipynb).
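Note: `AutoModelWithLMHead` is deprecated in recent `transformers` releases. Assuming such a version, the seq2seq auto class is a drop-in alternative:

```python
from transformers import AutoModelForSeq2SeqLM

# Replacement for the deprecated AutoModelWithLMHead on newer transformers
# versions; the rest of the pipeline setup above stays unchanged.
model = AutoModelForSeq2SeqLM.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_multitask")
```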
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
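As a rough illustration, a minimal sketch of an inverse square-root schedule (the warmup length below is an assumed placeholder, not a reported hyperparameter of this model):

```python
import math

# Inverse square-root decay in the T5 style: the learning rate is held flat
# for `warmup_steps`, then decays as 1/sqrt(step). warmup_steps is an
# illustrative assumption.
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1_000))    # 0.01 (still within warmup)
print(inverse_sqrt_lr(480_000))  # ~0.00144 at the final reported step
```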
## Evaluation results
For the git commit message generation task, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
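The BLEU scores above are corpus-level. A minimal sketch of how such scores can be computed, assuming the `sacrebleu` package and illustrative placeholder strings (not data from the CodeTrans evaluation):

```python
from sacrebleu import corpus_bleu

hypotheses = ["add joscar jar to gateway plugin"]               # model outputs
references = [["added joscar library to the gateway plugin"]]   # one reference stream
print(corpus_bleu(hypotheses, references).score)                # BLEU on a 0-100 scale
```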
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_base_commit_generation_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized git commits: it works best with tokenized git commits.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate a commit message for git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
145
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 480,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1205456405878067,
-0.01408988144248724,
-0.0015193078434094787,
0.11848306655883789,
0.1159985363483429,
0.01797415129840374,
0.08628130704164505,
0.07781051099300385,
-0.016551727429032326,
0.022775180637836456,
0.05408085510134697,
0.010640003718435764,
0.06723687797784805,
0.20007549226284027,
0.023955387994647026,
-0.14805108308792114,
-0.005820489954203367,
0.026504168286919594,
-0.020130380988121033,
0.13071924448013306,
0.10305506736040115,
-0.0718255266547203,
0.06094953790307045,
-0.033287763595581055,
-0.21681331098079681,
0.03674415126442909,
-0.006117207929491997,
-0.0593869611620903,
0.10157798230648041,
0.05123389884829521,
0.12164751440286636,
-0.0005236061406321824,
0.018151942640542984,
-0.12171947956085205,
0.01031363382935524,
0.04949326068162918,
0.023580310866236687,
0.016380229964852333,
0.047794442623853683,
0.04344015195965767,
0.12652990221977234,
-0.005867480766028166,
0.01919097639620304,
0.04560644552111626,
-0.07530258595943451,
-0.029616864398121834,
-0.03404097259044647,
0.031486790627241135,
0.06201465055346489,
0.10179958492517471,
-0.02122572436928749,
0.10285715758800507,
-0.1384131759405136,
0.12278306484222412,
0.14938096702098846,
-0.2134951949119568,
-0.012952122837305069,
0.11809306591749191,
0.06569668650627136,
0.11880403012037277,
-0.04743552953004837,
-0.04979895055294037,
0.09102065116167068,
0.058575019240379333,
0.024429911747574806,
-0.06032118201255798,
-0.019670309498906136,
0.0356781929731369,
-0.12008824199438095,
-0.08623035997152328,
0.1550297737121582,
-0.011546370573341846,
-0.08924709260463715,
-0.0736757218837738,
-0.035501763224601746,
-0.17603561282157898,
0.04210077226161957,
0.05015610530972481,
-0.004578468389809132,
-0.012686001136898994,
-0.020740021020174026,
0.0004255336825735867,
-0.09188908338546753,
-0.14293722808361053,
-0.004168912768363953,
0.10273487865924835,
0.06538725644350052,
0.03867533430457115,
-0.06870604306459427,
0.12001165002584457,
-0.02535233646631241,
-0.037423234432935715,
-0.04581470787525177,
-0.018410110846161842,
-0.12146635353565216,
0.02538948878645897,
-0.05844482034444809,
-0.20088273286819458,
0.03947250917553902,
0.00975754763931036,
-0.055544137954711914,
0.045143336057662964,
0.026615729555487633,
0.02296292968094349,
0.055933382362127304,
0.21948745846748352,
0.00250548729673028,
-0.0700591504573822,
0.06939796358346939,
0.045602910220623016,
-0.059783902019262314,
-0.017889978364109993,
-0.07775259017944336,
-0.07849524170160294,
0.0982874408364296,
0.09819678217172623,
-0.11443918943405151,
0.05547982454299927,
-0.05442116782069206,
-0.0442340262234211,
0.011904074810445309,
-0.14807447791099548,
-0.015345807187259197,
0.005775729659944773,
-0.06126163899898529,
-0.05145658552646637,
0.06386394053697586,
-0.14470502734184265,
-0.1435261368751526,
-0.04403671622276306,
-0.08104642480611801,
-0.056400541216135025,
-0.13505566120147705,
-0.15909811854362488,
-0.016291236504912376,
-0.07614852488040924,
-0.004274518694728613,
-0.09061893075704575,
-0.15696638822555542,
-0.018533172085881233,
0.004685613326728344,
0.009988857433199883,
0.007460147142410278,
-0.0514753982424736,
-0.005348642822355032,
-0.024598373100161552,
-0.031247593462467194,
-0.038416583091020584,
-0.06058550626039505,
0.1156054437160492,
0.07147979736328125,
0.034581441432237625,
-0.005774409975856543,
0.02213709056377411,
-0.07496044784784317,
0.055600833147764206,
-0.1463048756122589,
0.11311070621013641,
-0.08011209964752197,
0.08835306763648987,
-0.05199257284402847,
-0.09450920671224594,
0.07774440199136734,
0.04301401600241661,
0.08225321024656296,
0.064791738986969,
-0.09691070765256882,
-0.047535207122564316,
0.17330294847488403,
-0.11688672751188278,
-0.10000288486480713,
0.14739283919334412,
-0.031792052090168,
0.03236423805356026,
0.11551361531019211,
0.14283382892608643,
0.16024112701416016,
-0.06285930424928665,
-0.00870745163410902,
0.0643395334482193,
0.024679236114025116,
-0.08275460451841354,
0.06422330439090729,
0.06318707764148712,
-0.08903610706329346,
0.04789035767316818,
-0.017628811299800873,
0.11243820190429688,
-0.026726864278316498,
-0.01169163640588522,
-0.057113759219646454,
-0.07446946948766708,
-0.0407402478158474,
-0.02170868031680584,
0.05219264701008797,
-0.08114321529865265,
-0.07466471195220947,
0.025390587747097015,
0.16694927215576172,
-0.1183394268155098,
0.00022496137535199523,
-0.09068019688129425,
0.04513253644108772,
-0.0835401639342308,
0.03315308317542076,
-0.11919110268354416,
0.01679460145533085,
0.08788309246301651,
-0.030397113412618637,
0.08500844985246658,
0.10344140976667404,
0.01976699009537697,
0.042586494237184525,
0.0021626960951834917,
-0.024661263450980186,
-0.09035450965166092,
-0.06025448068976402,
-0.084666408598423,
-0.06372426450252533,
-0.07550443708896637,
-0.042527515441179276,
-0.02684864215552807,
-0.12961676716804504,
0.025078915059566498,
0.08140067011117935,
0.02054811455309391,
0.006600492633879185,
-0.030905770137906075,
0.02833564579486847,
0.03613504394888878,
-0.06356750428676605,
-0.061578333377838135,
-0.004553409293293953,
0.021471641957759857,
-0.07499756664037704,
-0.02533409744501114,
-0.04821307212114334,
0.004283674992620945,
0.11023107916116714,
0.08947240561246872,
-0.07207871228456497,
-0.005098058842122555,
-0.02916562370955944,
-0.029928909614682198,
0.010184964165091515,
-0.07325739413499832,
0.16481317579746246,
-0.0037984694354236126,
0.17248240113258362,
-0.15096765756607056,
-0.024215875193476677,
-0.012513128109276295,
0.014424442313611507,
0.04533296078443527,
0.12307416647672653,
-0.037383850663900375,
-0.12373673170804977,
0.07115352153778076,
-0.03621917590498924,
-0.07767659425735474,
0.22312097251415253,
-0.03604229912161827,
-0.09353821724653244,
0.011559337377548218,
0.10807690769433975,
-0.0119466008618474,
0.14156626164913177,
-0.18469145894050598,
-0.037956345826387405,
0.008049932308495045,
0.02388540655374527,
0.08020339161157608,
-0.14119169116020203,
-0.0021127224899828434,
0.013576468452811241,
-0.08332695066928864,
-0.0835454910993576,
-0.026403460651636124,
-0.01127253845334053,
0.03954608365893364,
0.016458436846733093,
0.016983812674880028,
0.02365954779088497,
-0.02691652812063694,
-0.09404688328504562,
0.2235134243965149,
-0.10138583928346634,
-0.2608528733253479,
-0.18450020253658295,
0.0602656751871109,
-0.07474032789468765,
0.014862805604934692,
0.0284048393368721,
-0.1459754854440689,
-0.04561008885502815,
-0.04490482434630394,
0.21593692898750305,
-0.08096576482057571,
0.04206405207514763,
-0.036750923842191696,
0.06117532029747963,
-0.008893678896129131,
-0.19702810049057007,
0.02833286114037037,
-0.009714018553495407,
-0.055025652050971985,
0.01127704232931137,
-0.12408901751041412,
0.0712868943810463,
0.16278010606765747,
-0.047733571380376816,
0.04166776314377785,
-0.008060320280492306,
0.25202903151512146,
-0.06999465823173523,
-0.03785566985607147,
0.13393506407737732,
0.014744703657925129,
0.010591540485620499,
0.014528719708323479,
-0.009493210352957249,
-0.08258730918169022,
0.06188970059156418,
-0.0013992898166179657,
-0.04032951593399048,
-0.2718929052352905,
0.007198537699878216,
-0.06768441945314407,
0.0840250626206398,
0.03935506194829941,
0.054217658936977386,
-0.05768207460641861,
0.04653649777173996,
0.021284816786646843,
0.13873858749866486,
0.001475111348554492,
0.05457468330860138,
-0.0069162361323833466,
-0.014809897169470787,
0.02184685505926609,
-0.06693942844867706,
0.007667604833841324,
0.10710876435041428,
0.12296950072050095,
0.23251189291477203,
-0.11579187959432602,
0.21088986098766327,
0.03929084166884422,
0.06999682635068893,
0.050640806555747986,
0.11780311167240143,
-0.13045822083950043,
0.019189216196537018,
0.012013055384159088,
-0.003830399364233017,
-0.11312927305698395,
0.029838263988494873,
-0.02313275821506977,
0.025482244789600372,
-0.09293953329324722,
-0.05034615099430084,
0.03765159472823143,
0.17262527346611023,
0.054407987743616104,
-0.2171865552663803,
-0.12738431990146637,
0.012101264670491219,
-0.10777506977319717,
-0.1028846949338913,
0.05987665802240372,
0.21094417572021484,
-0.06530553847551346,
-0.021217990666627884,
-0.009367948397994041,
0.12451022863388062,
-0.10359405726194382,
-0.018748607486486435,
-0.054243121296167374,
0.10328732430934906,
-0.028027504682540894,
0.13181675970554352,
-0.2729181945323944,
0.10220932960510254,
-0.0006927304202690721,
0.043593358248472214,
-0.06305444985628128,
0.05578751862049103,
-0.03764978051185608,
0.08852655440568924,
0.037223562598228455,
-0.0017962077399715781,
0.030674725770950317,
-0.17296995222568512,
-0.016441229730844498,
0.030462969094514847,
0.0516250841319561,
0.010685401037335396,
0.07114767283201218,
0.012036806903779507,
0.05182178318500519,
-0.014629821293056011,
-0.11995097249746323,
-0.07154729962348938,
-0.08869299292564392,
0.022599609568715096,
-0.025270730257034302,
-0.005751720163971186,
-0.0727510005235672,
-0.013233810663223267,
0.06470777094364166,
0.23689034581184387,
-0.08167504519224167,
-0.10324045270681381,
-0.09181723743677139,
0.06878451257944107,
0.12536264955997467,
-0.07685159891843796,
0.04426618292927742,
-0.018173376098275185,
0.05729866400361061,
-0.018345192074775696,
-0.06603004783391953,
0.06643347442150116,
-0.04670583829283714,
-0.07833017408847809,
-0.0059135062620043755,
0.0989769771695137,
0.03780296817421913,
0.016974037513136864,
-0.001819159253500402,
-0.10248690843582153,
-0.05965995416045189,
-0.11490605771541595,
-0.09776859730482101,
-0.04510365054011345,
-0.008945249952375889,
0.10290976613759995,
-0.07343697547912598,
-0.02261419966816902,
-0.03839937597513199,
-0.04662514850497246,
0.10462047159671783,
0.15831245481967926,
-0.02437537908554077,
0.02217094413936138,
0.1365852802991867,
-0.04454957693815231,
-0.17466624081134796,
0.03420960158109665,
0.06044034659862518,
0.10516250133514404,
-0.09851545840501785,
-0.19200098514556885,
0.013152358122169971,
0.018897943198680878,
0.029083987697958946,
0.047686800360679626,
-0.3167014718055725,
-0.11578366905450821,
0.06482958793640137,
0.11720887571573257,
0.1176094189286232,
-0.11728883534669876,
-0.04282884672284126,
-0.06172913685441017,
-0.07869347929954529,
0.10046711564064026,
-0.05081179365515709,
0.14730466902256012,
-0.046087756752967834,
0.05086814984679222,
0.04049551114439964,
-0.05464579910039902,
0.06869044154882431,
0.023393096402287483,
0.08899855613708496,
-0.042717620730400085,
0.032186541706323624,
0.11885562539100647,
-0.038370657712221146,
0.15656781196594238,
-0.13420678675174713,
0.11532577127218246,
-0.17220072448253632,
-0.073869489133358,
-0.08864372223615646,
0.01581524685025215,
-0.008160240016877651,
-0.06567811220884323,
-0.12673527002334595,
0.03447101637721062,
-0.020020974799990654,
-0.0023905704729259014,
0.06843768060207367,
-0.026836249977350235,
-0.030495617538690567,
0.09674433618783951,
0.059859201312065125,
-0.03753243014216423,
-0.0639849528670311,
0.053147610276937485,
0.03930424153804779,
0.09520124644041061,
-0.2126767337322235,
0.023188676685094833,
0.09047240763902664,
0.014183392748236656,
0.13541792333126068,
0.044743966311216354,
-0.1440865844488144,
0.0242756400257349,
0.0866740494966507,
-0.09434191137552261,
-0.062114033848047256,
-0.034087251871824265,
-0.06182315945625305,
-0.057138893753290176,
0.09133091568946838,
0.13206066191196442,
-0.0353844091296196,
-0.016949810087680817,
-0.03254491463303566,
-0.008570202626287937,
-0.10847857594490051,
0.21574437618255615,
0.06391791254281998,
0.06355247646570206,
-0.07848983258008957,
0.04556550830602646,
0.0734584704041481,
-0.06453390419483185,
0.009119370952248573,
0.17417113482952118,
-0.10696370899677277,
-0.04670819267630577,
0.028263069689273834,
0.11268478631973267,
-0.043282218277454376,
-0.027229519560933113,
-0.11844489723443985,
-0.06465686112642288,
0.03499757498502731,
0.1413465142250061,
0.08007719367742538,
0.10536091774702072,
-0.024110635742545128,
0.03520563617348671,
-0.08396978676319122,
0.06864802539348602,
0.06036372482776642,
0.05852065607905388,
-0.1319032907485962,
0.14625848829746246,
0.03817439451813698,
0.12309971451759338,
-0.030518664047122,
-0.0013100908836349845,
-0.13810765743255615,
0.06262245774269104,
-0.08908456563949585,
0.012735908851027489,
-0.010993222706019878,
0.04101835936307907,
-0.018638480454683304,
-0.02344481833279133,
-0.02284579910337925,
0.05993613973259926,
-0.08590016514062881,
0.007821805775165558,
-0.013239662162959576,
0.03716148063540459,
-0.04352477937936783,
-0.008906524628400803,
0.0438557006418705,
-0.10501427203416824,
0.1644587367773056,
-0.011944994330406189,
-0.02765730954706669,
0.06690754741430283,
-0.03717139735817909,
0.05746232345700264,
0.0209647323936224,
0.04061203449964523,
0.004079430364072323,
0.03918835148215294,
0.08051987737417221,
0.025117406621575356,
0.0359145924448967,
0.027678608894348145,
0.11033879220485687,
-0.12642110884189606,
-0.09407772868871689,
-0.019260143861174583,
-0.07516403496265411,
-0.05593566969037056,
0.09765557944774628,
0.06895232945680618,
0.09857277572154999,
0.06637086719274521,
-0.01638777181506157,
0.014740224927663803,
-0.13713888823986053,
-0.06146213039755821,
0.03987397253513336,
-0.05606276914477348,
-0.014898080378770828,
-0.047232650220394135,
0.05836457759141922,
-0.03027915023267269,
0.1714191883802414,
0.01622958667576313,
0.038956061005592346,
-0.023300716653466225,
0.017968904227018356,
0.05731412395834923,
0.034121569246053696,
0.1994009017944336,
-0.07427255809307098,
0.024951785802841187,
-0.03148052841424942,
-0.004946532193571329,
0.027686236426234245,
0.05618207901716232,
0.06930585205554962,
0.08116764575242996,
0.010058406740427017,
0.09051669389009476,
0.02635350450873375,
0.00912118423730135,
-0.08662097156047821,
-0.017872178927063942,
-0.022624114528298378,
0.06678453832864761,
-0.04803336039185524,
0.17456236481666565,
0.08349798619747162,
-0.09433486312627792,
0.09806045144796371,
0.05244215577840805,
-0.13948573172092438,
-0.027254648506641388,
-0.036705490201711655,
-0.026663176715373993,
-0.16735990345478058,
0.039047446101903915,
-0.12763524055480957,
0.0036283666267991066,
0.05875590071082115,
0.07278633117675781,
-0.06021956726908684,
0.2010907083749771,
0.07787186652421951,
-0.07983028143644333,
0.07053646445274353,
0.006448439788073301,
0.011754338629543781,
0.06757842749357224,
-0.010478882119059563,
0.056430526077747345,
-0.01422804407775402,
0.0593721866607666,
0.00896501261740923,
-0.005183989182114601,
0.0062666768208146095,
-0.007409319747239351,
-0.008481278084218502,
-0.033374473452568054,
0.005328561645001173,
0.02748066373169422,
0.1527196764945984,
0.009852634742856026,
-0.07428913563489914,
-0.018080707639455795,
0.17479044198989868,
-0.0481141060590744,
-0.07090037316083908,
-0.13092224299907684,
0.1501183658838272,
0.018611161038279533,
0.019192082807421684,
0.007550040725618601,
-0.096363365650177,
-0.05953861027956009,
0.2374526560306549,
0.06694776564836502,
-0.03893185779452324,
-0.03609925135970116,
-0.0037801028229296207,
-0.00889031682163477,
-0.02186335250735283,
0.18074025213718414,
0.009234289638698101,
0.21920323371887207,
0.025080464780330658,
0.02790132164955139,
-0.049245961010456085,
-0.041030291467905045,
0.0049485997296869755,
0.1377129852771759,
-0.03413914144039154,
-0.019947130233049393,
-0.10431945323944092,
0.007600618060678244,
-0.010040785185992718,
-0.1323334127664566,
0.052133187651634216,
-0.12089989334344864,
-0.09448839724063873,
-0.022130658850073814,
0.06889642030000687,
-0.05959479510784149,
0.04589868709445,
-0.027475161477923393,
0.05715102702379227,
0.030733294785022736,
-0.028657371178269386,
-0.11017654836177826,
-0.16704532504081726,
0.10374834388494492,
-0.025148775428533554,
0.12217123061418533,
-0.012019799090921879,
0.13902465999126434,
0.09668362140655518,
0.007611950859427452,
-0.06600876152515411,
0.08663731068372726,
0.01639936864376068,
0.004913764074444771,
0.017409231513738632,
0.14296339452266693,
-0.04667731001973152,
0.09968798607587814,
-0.036858998239040375,
-0.05702072009444237,
-0.032406002283096313,
-0.057209208607673645,
0.01205494999885559,
-0.181098073720932,
-0.021212024614214897,
-0.11247290670871735,
0.08584810793399811,
0.17076285183429718,
-0.04210423305630684,
-0.00041446054819971323,
-0.10279399156570435,
0.08497834205627441,
-0.023556137457489967,
0.06176308915019035,
-0.0282128993421793,
-0.19804611802101135,
-0.024027135223150253,
0.05618566274642944,
0.023748543113470078,
-0.25348514318466187,
-0.0019448032835498452,
-0.02355780266225338,
-0.010353963822126389,
-0.08423276990652084,
0.16772034764289856,
0.07806951552629471,
0.04152870550751686,
-0.023810256272554398,
-0.09023386985063553,
-0.020990176126360893,
0.03148689493536949,
-0.1323145478963852,
-0.13005977869033813
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commits: it works best with tokenized git commits.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the git commit message generation task for Java commit changes.
## Intended uses & limitations
The model could be used to generate a commit message for git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/commit%20generation/base_model.ipynb).
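The pipeline returns one dict per input; the generated commit message is under the `summary_text` key:

```python
# SummarizationPipeline output format: a list with one dict per input sequence.
result = pipeline([tokenized_code])
print(result[0]["summary_text"])  # the generated commit message
```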
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
## Evaluation results
For the git commit message generation task, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_base_commit_generation_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized git commits: it works best with tokenized git commits.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the git commit message generation task for Java commit changes.
Intended uses & limitations
---------------------------
The model could be used to generate a commit message for git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 16,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08594922721385956,
0.05492645874619484,
-0.0019491245038807392,
0.10277140885591507,
0.059663865715265274,
0.025168485939502716,
0.06826309114694595,
0.09656370431184769,
-0.019444160163402557,
0.07311385124921799,
0.07413879781961441,
-0.028909927234053612,
0.07434502989053726,
0.1772640496492386,
0.03157714009284973,
-0.1679355949163437,
-0.013686462305486202,
0.02069207839667797,
-0.0439172200858593,
0.10477381944656372,
0.09652048349380493,
-0.08700217306613922,
0.06717489659786224,
-0.03145183250308037,
-0.11565423756837845,
0.044407542794942856,
-0.0443652905523777,
-0.041630472987890244,
0.0850689709186554,
0.0647062212228775,
0.10404506325721741,
-0.02210131473839283,
0.057490479201078415,
-0.19124314188957214,
0.0008867444703355432,
0.031951237469911575,
0.04803526774048805,
0.023000316694378853,
0.05725320428609848,
0.07850230485200882,
0.12179561704397202,
-0.04024902358651161,
0.026210196316242218,
0.04863550141453743,
-0.05759513005614281,
-0.034039855003356934,
-0.05839499458670616,
0.06911487132310867,
0.12125606089830399,
0.08672424405813217,
-0.016319118440151215,
0.012703406624495983,
-0.08364672213792801,
0.08610695600509644,
0.1348753720521927,
-0.218458890914917,
-0.022135397419333458,
0.10668535530567169,
0.09198351949453354,
0.07106819748878479,
-0.08249115943908691,
-0.022840019315481186,
0.10932696610689163,
0.037622079253196716,
0.04081070050597191,
-0.07169345766305923,
-0.024137215688824654,
0.005887072533369064,
-0.06583552807569504,
-0.0636986792087555,
0.10332626849412918,
0.03787636384367943,
-0.06084764003753662,
-0.11169381439685822,
-0.04101717472076416,
-0.2067604511976242,
0.0443793423473835,
0.03180563077330589,
0.014425274915993214,
-0.01263451762497425,
0.020313290879130363,
-0.01614641584455967,
-0.09321427345275879,
-0.10632885247468948,
-0.003966846968978643,
0.08518312871456146,
0.07931586354970932,
0.029339000582695007,
-0.0017003918765112758,
0.07037514448165894,
-0.02475859969854355,
-0.05066460743546486,
-0.0377022810280323,
0.01861877553164959,
-0.12404555827379227,
0.032356999814510345,
-0.026352226734161377,
-0.07109494507312775,
-0.0007202064152806997,
0.08873778581619263,
-0.08206083625555038,
0.07176226377487183,
0.12104853987693787,
0.010232239030301571,
0.005610381253063679,
0.23361681401729584,
0.017827725037932396,
-0.11654312908649445,
0.01154231745749712,
0.030486037954688072,
0.0022720571141690016,
-0.013695443980395794,
-0.07205186784267426,
-0.02660726010799408,
0.015913665294647217,
0.06091984361410141,
-0.13372103869915009,
0.020153295248746872,
-0.039678335189819336,
-0.015859762206673622,
0.07559867948293686,
-0.12307988107204437,
0.026578033342957497,
0.006283499766141176,
-0.0736149400472641,
-0.05631478875875473,
0.05243510752916336,
-0.10530677437782288,
-0.10844852030277252,
0.032230425626039505,
-0.0577935129404068,
-0.039516713470220566,
-0.11499346792697906,
-0.13281212747097015,
-0.0110130300745368,
-0.07944800704717636,
0.015623959712684155,
-0.10227737575769424,
-0.1045386791229248,
-0.01518634520471096,
0.021580759435892105,
-0.006322733126580715,
-0.016074709594249725,
-0.04437877610325813,
0.012700841762125492,
-0.0012398648541420698,
-0.027538441121578217,
0.0017865973059087992,
-0.04350077360868454,
0.09452381730079651,
0.0819685235619545,
0.05798887833952904,
0.032705239951610565,
0.020547732710838318,
-0.06403176486492157,
0.06489983201026917,
-0.09570888429880142,
0.0664728432893753,
-0.0319337360560894,
0.07095877826213837,
-0.08864618092775345,
-0.08597328513860703,
0.06655565649271011,
0.049884095788002014,
0.06547015905380249,
0.02868025004863739,
-0.08257631957530975,
0.0024380802642554045,
0.13263288140296936,
-0.09144917875528336,
-0.12446700781583786,
0.12965336441993713,
0.004349199123680592,
-0.024075830355286598,
0.07503504306077957,
0.13864727318286896,
0.14819757640361786,
-0.09984470158815384,
-0.06020985171198845,
0.08479305356740952,
0.06639669835567474,
-0.052228882908821106,
0.09452468156814575,
0.0392235666513443,
0.03330572694540024,
0.019411299377679825,
0.0451168529689312,
0.07079269737005234,
-0.015079040080308914,
-0.028273507952690125,
-0.023707661777734756,
-0.0899314433336258,
-0.03021812066435814,
-0.020370077341794968,
0.020705657079815865,
-0.07958803325891495,
-0.08044111728668213,
-0.001045453129336238,
0.1790316253900528,
-0.09757325798273087,
0.023789221420884132,
-0.09928493201732635,
-0.051317889243364334,
-0.08721411973237991,
0.021772364154458046,
-0.08587729930877686,
0.009822214022278786,
0.05869046598672867,
-0.043130114674568176,
0.07687769085168839,
0.07702931761741638,
0.0032880580984055996,
0.04110055789351463,
-0.032722316682338715,
-0.04729616269469261,
-0.04836805909872055,
-0.06497061997652054,
-0.1294645369052887,
-0.019094273447990417,
-0.09008749574422836,
-0.02327336184680462,
-0.06077124923467636,
-0.15118512511253357,
0.021040406078100204,
-0.009075634181499481,
0.021550552919507027,
-0.005759679712355137,
-0.023759566247463226,
0.03561457246541977,
0.03635302558541298,
-0.057081978768110275,
-0.09424787014722824,
0.003512802766636014,
0.017247900366783142,
-0.11919102072715759,
-0.024076830595731735,
-0.10190385580062866,
-0.04539112374186516,
0.07973610609769821,
0.09903517365455627,
-0.06539574265480042,
0.007303853984922171,
-0.02767556719481945,
-0.05752037838101387,
-0.026910819113254547,
-0.08126122504472733,
0.17416231334209442,
0.010997953824698925,
0.1641228199005127,
-0.14049836993217468,
-0.04966409131884575,
-0.02179761603474617,
-0.012869460508227348,
0.01339868176728487,
0.15027789771556854,
-0.013265611603856087,
-0.09133075922727585,
0.05366988107562065,
-0.030909786000847816,
-0.061083968728780746,
0.15695340931415558,
0.0011558739934116602,
-0.08984792977571487,
0.01931109093129635,
0.09779715538024902,
-0.013240145519375801,
0.13172894716262817,
-0.08972036838531494,
-0.012499550357460976,
-0.0028390439692884684,
0.02541639283299446,
0.04920750483870506,
-0.13709452748298645,
0.020576151087880135,
0.058586813509464264,
-0.07951494306325912,
-0.0469326414167881,
-0.03541402518749237,
-0.04747812822461128,
0.037929702550172806,
0.008108094334602356,
0.005617009475827217,
-0.009111744351685047,
-0.020066218450665474,
-0.0869026631116867,
0.20833086967468262,
-0.08991862088441849,
-0.237225741147995,
-0.16969841718673706,
0.010372058488428593,
-0.06434745341539383,
0.007293278351426125,
0.04794581979513168,
-0.1468476504087448,
-0.06176093965768814,
-0.07926495373249054,
0.14815250039100647,
-0.1192425936460495,
0.02377854846417904,
-0.0008116402896121144,
0.028859194368124008,
0.021546322852373123,
-0.1753835678100586,
0.03234396502375603,
-0.0071387095376849174,
-0.018315427005290985,
0.00621231971308589,
-0.06495707482099533,
0.08775964379310608,
0.1346322000026703,
-0.07822803407907486,
0.022064069285988808,
-0.011649464257061481,
0.19260962307453156,
-0.059604495763778687,
0.026905914768576622,
0.20601396262645721,
0.03313940018415451,
0.03772849217057228,
0.03184747323393822,
0.011056180112063885,
-0.08964408934116364,
0.06471027433872223,
0.055136557668447495,
-0.03722069412469864,
-0.2622569501399994,
0.004749277140945196,
-0.05914754420518875,
0.058178480714559555,
0.11131840199232101,
0.05380808189511299,
-0.12075188755989075,
0.039457887411117554,
-0.019704945385456085,
0.15956687927246094,
-0.038158610463142395,
0.05234822258353233,
-0.0013190321624279022,
0.008660370483994484,
0.015627359971404076,
-0.08191978186368942,
0.016425523906946182,
0.08706578612327576,
0.1227789893746376,
0.19652500748634338,
-0.06919022649526596,
0.20872467756271362,
0.02169247716665268,
0.06744802743196487,
0.02073013409972191,
0.10202258080244064,
-0.13301622867584229,
-0.006998751778155565,
0.006261848844587803,
-0.005410410929471254,
-0.078798808157444,
0.054619863629341125,
-0.018489224836230278,
0.05077500268816948,
-0.06535127013921738,
0.04030577838420868,
0.027909429743885994,
0.16628336906433105,
0.06516031920909882,
-0.18383640050888062,
-0.12109468877315521,
0.019060619175434113,
-0.1052960455417633,
-0.10894287377595901,
0.06626061350107193,
0.20729608833789825,
-0.030951641499996185,
0.018608758226037025,
-0.009723687544465065,
0.13461337983608246,
-0.09104928374290466,
-0.01727481000125408,
0.025456100702285767,
0.09435883909463882,
0.0013775028055533767,
0.13418792188167572,
-0.2660975158214569,
0.06951630860567093,
0.017477424815297127,
0.08316970616579056,
-0.02904001995921135,
0.06424479186534882,
-0.03812749311327934,
0.01282906997948885,
0.0640145018696785,
0.0014025539858266711,
-0.0662226751446724,
-0.2188410609960556,
-0.06492722779512405,
0.021261677145957947,
0.06588630378246307,
-0.022799182683229446,
0.0920681282877922,
-0.0026499901432543993,
0.05942332372069359,
-0.03139601647853851,
-0.12576746940612793,
-0.056009020656347275,
-0.13063539564609528,
-0.012016762979328632,
0.014514187350869179,
-0.03505701944231987,
-0.035525742918252945,
0.014744391664862633,
-0.020947754383087158,
0.23965884745121002,
-0.14730684459209442,
-0.11647108942270279,
-0.09282694011926651,
0.08067484945058823,
0.1278620958328247,
-0.10118529200553894,
0.020793378353118896,
0.014475109986960888,
0.05752229690551758,
-0.05063432455062866,
-0.048962049186229706,
0.02928299456834793,
-0.06520291417837143,
-0.08523299545049667,
-0.019711636006832123,
0.09214609116315842,
-0.013248787261545658,
0.04925970733165741,
-0.0013253313954919577,
-0.09711118042469025,
-0.054164763540029526,
-0.13612183928489685,
-0.07549603283405304,
-0.03407188877463341,
0.028383800759911537,
0.02042304165661335,
-0.05955240875482559,
0.11157216876745224,
-0.025878820568323135,
-0.091905876994133,
0.06669323146343231,
0.19571703672409058,
-0.05278017744421959,
0.003039267612621188,
0.12105758488178253,
-0.04839611053466797,
-0.15999232232570648,
-0.056355446577072144,
0.05201248079538345,
0.08769124746322632,
-0.05261530727148056,
-0.15429195761680603,
0.05331861227750778,
0.0322747603058815,
0.024440553039312363,
0.025582144036889076,
-0.2916682958602905,
-0.1259143203496933,
0.03227440267801285,
0.07367674261331558,
0.07864171266555786,
-0.12236765772104263,
-0.04297617822885513,
-0.06424818187952042,
-0.059631507843732834,
0.039480406790971756,
0.05338597670197487,
0.13713937997817993,
-0.03568823263049126,
0.04230176657438278,
0.029937313869595528,
-0.030223926529288292,
0.106387197971344,
-0.000007136495241866214,
0.09246576577425003,
-0.02019367553293705,
0.03690359741449356,
0.05287295579910278,
-0.06634368002414703,
0.16427701711654663,
-0.18738271296024323,
0.09020069986581802,
-0.20611436665058136,
-0.05687443166971207,
-0.009974027052521706,
-0.01653282716870308,
-0.0316108837723732,
-0.056407488882541656,
-0.12512513995170593,
0.008435389026999474,
0.032464221119880676,
-0.01799919083714485,
0.08218446373939514,
-0.017942920327186584,
-0.05574432760477066,
0.058979280292987823,
0.06306619197130203,
-0.04856930300593376,
-0.13707105815410614,
0.019726576283574104,
0.03502952307462692,
0.08849193155765533,
-0.22233866155147552,
0.01607084646821022,
0.11777440458536148,
0.0038208540063351393,
0.11365686357021332,
0.008659233339130878,
-0.08992908149957657,
0.04267718642950058,
0.07203260064125061,
-0.04535961151123047,
-0.08199279010295868,
-0.01703774370253086,
-0.013747165910899639,
-0.0852365717291832,
0.04994253069162369,
0.09942500293254852,
-0.05242592841386795,
-0.01570807211101055,
-0.017179908230900764,
0.012341697700321674,
-0.0631214827299118,
0.2128879725933075,
0.030281612649559975,
0.08802187442779541,
-0.06799125671386719,
0.0751892700791359,
0.09750419855117798,
-0.10041478276252747,
0.018165504559874535,
0.16213974356651306,
-0.07176126539707184,
-0.02653469704091549,
0.03265216574072838,
0.06580215692520142,
-0.047112204134464264,
-0.062314022332429886,
-0.09960862249135971,
-0.07029502838850021,
0.021611163392663002,
0.01465024147182703,
0.06458491086959839,
0.0752895176410675,
-0.02721223048865795,
0.025371694937348366,
-0.09517078846693039,
0.09125100076198578,
0.06391613930463791,
0.05541478842496872,
-0.1506703495979309,
0.14145880937576294,
0.045709628611803055,
0.10682492703199387,
0.0006173608126118779,
0.04356866702437401,
-0.10022561252117157,
0.046605952084064484,
-0.024971453472971916,
0.04129880294203758,
-0.005041245371103287,
0.046264104545116425,
-0.02867434173822403,
0.01696479134261608,
-0.03008461929857731,
0.04813703894615173,
-0.040552496910095215,
-0.026126207783818245,
-0.03057413548231125,
0.03612479194998741,
-0.054547395557165146,
-0.022694004699587822,
0.011704966425895691,
-0.08343625068664551,
0.11628412455320358,
-0.07305970042943954,
-0.0076503511518239975,
-0.0019075402524322271,
-0.0014133313670754433,
0.07561463862657547,
0.02798290364444256,
0.05286548659205437,
-0.010325256735086441,
0.004523172974586487,
0.04848114401102066,
0.007412377744913101,
-0.009070892818272114,
-0.004250697325915098,
0.038500308990478516,
-0.13798780739307404,
-0.08953588455915451,
-0.09102866053581238,
-0.05337746813893318,
-0.06599020957946777,
0.08358737081289291,
0.08698699623346329,
0.0648808553814888,
0.08147673308849335,
-0.01857658289372921,
-0.0010266085155308247,
-0.1441553235054016,
-0.034338466823101044,
0.054583318531513214,
-0.029765348881483078,
-0.07747451961040497,
-0.047432806342840195,
0.058875683695077896,
-0.041256170719861984,
0.13355404138565063,
0.0036491488572210073,
0.04713011905550957,
-0.013665059581398964,
-0.016792597249150276,
0.0028088735416531563,
0.00030826826696284115,
0.19234326481819153,
-0.09331950545310974,
0.023469656705856323,
-0.006944069638848305,
-0.007796797435730696,
0.056526605039834976,
0.10851637274026871,
0.08599069714546204,
0.11104471981525421,
0.06730348616838455,
0.11363401263952255,
-0.04642075300216675,
-0.034048110246658325,
-0.17826758325099945,
0.0409996323287487,
-0.016956087201833725,
0.03880434110760689,
-0.025256097316741943,
0.10293779522180557,
0.14405730366706848,
-0.12598304450511932,
0.09190116077661514,
0.03214409202337265,
-0.10582578182220459,
-0.046484507620334625,
-0.056730665266513824,
-0.04507036134600639,
-0.10707131773233414,
0.026161815971136093,
-0.1130945011973381,
0.0272993016988039,
0.08478661626577377,
0.04716666787862778,
-0.022798219695687294,
0.153005912899971,
0.0015478492714464664,
-0.055048465728759766,
0.017243681475520134,
0.026446079835295677,
0.03986665979027748,
0.1118529662489891,
-0.01166615728288889,
0.08430351316928864,
-0.04939129203557968,
0.08996547013521194,
0.013433706946671009,
0.020808173343539238,
0.03217407688498497,
0.003483450273051858,
-0.009946024045348167,
-0.05017831549048424,
-0.00471455929800868,
0.08350160717964172,
0.16266584396362305,
0.025932790711522102,
-0.042503610253334045,
-0.05017905682325363,
0.1626047044992447,
-0.05173856019973755,
-0.05348173528909683,
-0.10851506143808365,
0.1459912210702896,
0.04894041270017624,
0.025952167809009552,
0.005008126609027386,
-0.08260248601436615,
-0.0653158500790596,
0.25061795115470886,
0.016953356564044952,
-0.04022516682744026,
-0.04857473075389862,
-0.01771792396903038,
-0.011216146871447563,
-0.03474150970578194,
0.15264391899108887,
0.024213187396526337,
0.19848798215389252,
0.009758582338690758,
-0.00389384594745934,
-0.03813609853386879,
-0.031633518636226654,
-0.029149573296308517,
0.18168562650680542,
-0.03619374334812164,
0.03919489309191704,
-0.09380234032869339,
-0.018116645514965057,
0.04028679057955742,
-0.12292192131280899,
0.09412115067243576,
-0.09628890454769135,
-0.07375392317771912,
0.034244436770677567,
0.09760402888059616,
-0.025664012879133224,
0.0463777594268322,
-0.010201185941696167,
0.052723489701747894,
0.015713997185230255,
-0.036153990775346756,
-0.0965440645813942,
-0.12478747963905334,
0.0496186725795269,
-0.004240135196596384,
0.1613435298204422,
0.01925339177250862,
0.09142950922250748,
0.09370901435613632,
0.004913369193673134,
-0.07719443738460541,
0.10091181099414825,
0.027971019968390465,
-0.006461385637521744,
0.06667032837867737,
0.13147170841693878,
-0.03605685383081436,
0.11722907423973083,
0.0012372060446068645,
-0.04517819359898567,
-0.04030769318342209,
-0.02469765394926071,
0.0003331718617118895,
-0.14766007661819458,
0.0035777920857071877,
-0.06807160377502441,
0.12193238735198975,
0.17851921916007996,
-0.04415157809853554,
-0.0245277788490057,
-0.03696943074464798,
0.06768987327814102,
-0.02654753439128399,
0.10297446697950363,
-0.003243825165554881,
-0.1754770576953888,
0.0075624724850058556,
-0.003580532968044281,
0.019465994089841843,
-0.1946420967578888,
-0.035571638494729996,
-0.0326235331594944,
-0.02930489182472229,
-0.08719902485609055,
0.14496837556362152,
0.06976918876171112,
0.024937454611063004,
-0.04030909016728401,
-0.18699373304843903,
-0.02173035405576229,
0.04096850007772446,
-0.1423964500427246,
-0.12533362209796906
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commits: it works best with tokenized git commits.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain and was then fine-tuned on the git commit message generation task for Java commit changes.
## Intended uses & limitations
The model could be used to generate a git commit message for given commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_commit_generation_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/commit%20generation/base_model.ipynb).
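The card does not publish the exact pre-processing used to tokenize the commit diffs. As a rough approximation, splitting punctuation away from words reproduces the space-separated style of the `tokenized_code` example above; the helper below is an illustrative sketch under that assumption, not the original CodeTrans pipeline:

```python
import re

def rough_tokenize(diff: str) -> str:
    # Separate punctuation from words so a raw diff resembles the
    # space-separated style of the training data (an approximation,
    # not the original CodeTrans preprocessing).
    return " ".join(re.findall(r"\w+|[^\w\s]", diff))

print(rough_tokenize("new file mode 100644 index 000000000..892fda21b"))
# -> new file mode 100644 index 000000000 . . 892fda21b
```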
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
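For intuition, the inverse square root schedule holds the learning rate constant through warm-up and then decays it as 1/√step. A minimal sketch, assuming the T5 default of 10,000 warm-up steps (the warm-up length is not stated in this card):

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant 1/sqrt(warmup_steps) during warm-up, then decays
    # proportionally to the inverse square root of the step count.
    return 1.0 / max(step, warmup_steps) ** 0.5

for step in (1, 10_000, 100_000, 500_000):
    print(step, inverse_sqrt_lr(step))
```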
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing commit changes.
## Evaluation results
For the git commit message generation task, different models achieve the following results on Java commit changes (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_base_commit_generation_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized git commits: it works best with tokenized git commits.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain and was then fine-tuned on the git commit message generation task for Java commit changes.
Intended uses & limitations
---------------------------
The model could be used to generate a git commit message for given commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results on Java commit changes (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
110
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.07995710521936417,
0.06923852115869522,
-0.00198376108892262,
0.1090051606297493,
0.05143113061785698,
0.02076282538473606,
0.043814510107040405,
0.11671967804431915,
-0.025290755555033684,
0.06960590183734894,
0.06365779042243958,
-0.06801668554544449,
0.06831049174070358,
0.18427622318267822,
0.029806572943925858,
-0.18325957655906677,
-0.02558937482535839,
0.01867530308663845,
-0.06431472301483154,
0.10750452429056168,
0.08501055091619492,
-0.06911011040210724,
0.07413533329963684,
-0.04097892343997955,
-0.11107675731182098,
0.04411061853170395,
-0.02280462719500065,
-0.024200137704610825,
0.09123358130455017,
0.06371410191059113,
0.10614346712827682,
-0.018039649352431297,
0.05076274275779724,
-0.19632172584533691,
0.0028181851375848055,
0.044981829822063446,
0.05509640648961067,
0.03331751748919487,
0.04217367246747017,
0.061569564044475555,
0.12941384315490723,
-0.029485823586583138,
0.03243125230073929,
0.05168239399790764,
-0.0678316056728363,
-0.03750720992684364,
-0.06211131066083908,
0.0580458790063858,
0.09256855398416519,
0.09372449666261673,
-0.014486835338175297,
-0.007008223794400692,
-0.09085845947265625,
0.07628225535154343,
0.14150112867355347,
-0.2175551801919937,
-0.018315710127353668,
0.11447369307279587,
0.08369293808937073,
0.07645322382450104,
-0.0801468938589096,
-0.02317523956298828,
0.09641299396753311,
0.039663802832365036,
0.057930734008550644,
-0.07617093622684479,
-0.016488224267959595,
0.0005860932287760079,
-0.06490316987037659,
-0.060714710503816605,
0.11777627468109131,
0.024200642481446266,
-0.05327770859003067,
-0.11920589953660965,
-0.05939552187919617,
-0.21330852806568146,
0.042815517634153366,
0.02287336066365242,
0.011653794907033443,
0.00791393406689167,
-0.01461187843233347,
-0.039781227707862854,
-0.09304669499397278,
-0.1138211190700531,
0.005242749582976103,
0.02685193531215191,
0.0644448846578598,
0.03903115913271904,
-0.0149724455550313,
0.09214644879102707,
-0.019487326964735985,
-0.04661387950181961,
-0.03320963680744171,
0.015620597638189793,
-0.12308330088853836,
0.028297321870923042,
-0.020234618335962296,
-0.06324940174818039,
0.008535665459930897,
0.08394937217235565,
-0.08940622210502625,
0.07449769228696823,
0.09616111218929291,
0.015769807621836662,
0.008119773119688034,
0.22543765604496002,
0.02364964224398136,
-0.13657864928245544,
0.03205646947026253,
0.02687971666455269,
0.001673191785812378,
-0.005365470424294472,
-0.07216974347829819,
-0.04207982122898102,
0.026154739782214165,
0.07273346185684204,
-0.11624486744403839,
0.018051467835903168,
-0.05123615637421608,
-0.0075182728469371796,
0.0705941766500473,
-0.11894624680280685,
0.03233812376856804,
0.0073217893950641155,
-0.069553904235363,
-0.0357954241335392,
0.07263585925102234,
-0.11625177413225174,
-0.11184027045965195,
0.023831330239772797,
-0.04604528099298477,
-0.039991751313209534,
-0.11302927136421204,
-0.11827674508094788,
-0.006489831022918224,
-0.058933157473802567,
-0.005087214056402445,
-0.10447555780410767,
-0.10228487849235535,
-0.012150761671364307,
0.028229108080267906,
-0.008705493994057178,
-0.025005793198943138,
-0.04754949361085892,
0.007178104016929865,
0.0015599681064486504,
-0.023267284035682678,
0.005952870938926935,
-0.043683551251888275,
0.09262814372777939,
0.05301406979560852,
0.0556732639670372,
0.004139578901231289,
0.01499022077769041,
-0.08374479413032532,
0.07788629829883575,
-0.11492760479450226,
0.08456778526306152,
-0.01839039847254753,
0.05393550544977188,
-0.10240266472101212,
-0.07763539999723434,
0.017430441454052925,
0.04247397929430008,
0.06596637517213821,
0.03550807386636734,
-0.11151662468910217,
0.008941346779465675,
0.15002849698066711,
-0.10935530066490173,
-0.09915757179260254,
0.13472241163253784,
0.0016806938219815493,
0.0019090080168098211,
0.08861464262008667,
0.14092083275318146,
0.16236373782157898,
-0.09380678087472916,
-0.03671862185001373,
0.08189224451780319,
0.043386831879615784,
-0.05080018937587738,
0.06710878014564514,
0.023721646517515182,
0.020792370662093163,
0.02574959211051464,
0.05013386532664299,
0.06618943810462952,
-0.002353143412619829,
-0.03082437999546528,
-0.04407278820872307,
-0.09171625226736069,
-0.040972936898469925,
-0.02058788761496544,
0.01661086454987526,
-0.05948615074157715,
-0.07362818717956543,
0.003112024161964655,
0.16887465119361877,
-0.09903653711080551,
0.03399483859539032,
-0.0803893506526947,
-0.042366381734609604,
-0.0575016550719738,
0.03415219858288765,
-0.09535399824380875,
0.0040145814418792725,
0.064399354159832,
-0.03225516155362129,
0.07039953023195267,
0.08121820539236069,
0.011082813143730164,
0.023159276694059372,
-0.05485378950834274,
-0.04456310346722603,
-0.031901028007268906,
-0.075358085334301,
-0.11457622796297073,
-0.016676906496286392,
-0.08024325221776962,
-0.014626836404204369,
-0.02891521528363228,
-0.14303866028785706,
0.011843093670904636,
0.015555715188384056,
0.03425073251128197,
0.016454482451081276,
-0.03857346624135971,
0.026848826557397842,
0.031003328040242195,
-0.04630885273218155,
-0.09809016436338425,
0.006035852245986462,
0.029798638075590134,
-0.10665208101272583,
-0.00844612531363964,
-0.08656084537506104,
-0.05623652786016464,
0.07449179887771606,
0.125686913728714,
-0.08908485621213913,
-0.01627104915678501,
-0.025290533900260925,
-0.049740057438611984,
-0.06046668440103531,
-0.07427211850881577,
0.16812671720981598,
0.01065734215080738,
0.1551390439271927,
-0.13598226010799408,
-0.046795815229415894,
-0.03231658786535263,
-0.006452756933867931,
0.027520623058080673,
0.1422172635793686,
-0.01691126823425293,
-0.1093406155705452,
0.046209461987018585,
-0.051582809537649155,
-0.06577814370393753,
0.1708548218011856,
-0.002061985433101654,
-0.08179296553134918,
-0.00018131147953681648,
0.10711260885000229,
-0.00442036148160696,
0.14742650091648102,
-0.058286696672439575,
0.0024582084733992815,
-0.01046870369464159,
0.023667503148317337,
0.048367757350206375,
-0.13573886454105377,
0.023866472765803337,
0.04261381924152374,
-0.08792655915021896,
-0.030086101964116096,
-0.03508811071515083,
-0.04260828346014023,
0.04463975504040718,
0.02146584913134575,
0.03137190267443657,
-0.01124805212020874,
-0.024690814316272736,
-0.09365752339363098,
0.18985828757286072,
-0.08803898841142654,
-0.24613842368125916,
-0.16757480800151825,
0.05708472058176994,
-0.035385485738515854,
0.001095756539143622,
0.03334374353289604,
-0.1224210187792778,
-0.05434419587254524,
-0.0955900028347969,
0.13173840939998627,
-0.11220080405473709,
0.0215971190482378,
-0.04767066240310669,
0.059001386165618896,
0.041885022073984146,
-0.15896499156951904,
0.028699133545160294,
-0.014540813863277435,
-0.014186778105795383,
-0.0074663786217570305,
-0.07141799479722977,
0.06642375141382217,
0.1347113698720932,
-0.0627460777759552,
0.024678407236933708,
-0.01302583422511816,
0.1748923808336258,
-0.06227812543511391,
0.045787129551172256,
0.18535643815994263,
0.028522783890366554,
0.04113255813717842,
0.048415474593639374,
0.0038328187074512243,
-0.0834999680519104,
0.06990358233451843,
0.05220771208405495,
-0.051906537264585495,
-0.22817645967006683,
-0.007287704851478338,
-0.06572069227695465,
0.08067955821752548,
0.1274554431438446,
0.0581606961786747,
-0.1380620151758194,
0.026922497898340225,
-0.021333202719688416,
0.14767470955848694,
-0.012370840646326542,
0.04906298592686653,
-0.0066785672679543495,
0.002098629716783762,
0.017773358151316643,
-0.07956800609827042,
0.016087282449007034,
0.09000791609287262,
0.1189165934920311,
0.18831104040145874,
-0.10079841315746307,
0.18848173320293427,
0.0022607746068388224,
0.1073460802435875,
0.04706225544214249,
0.07234516739845276,
-0.1269831359386444,
0.007978226989507675,
0.00996134988963604,
-0.013933200389146805,
-0.06949755549430847,
0.05725836381316185,
-0.004659196361899376,
0.05016718804836273,
-0.056359682232141495,
0.029295584186911583,
0.026923364028334618,
0.19638407230377197,
0.08278971165418625,
-0.17098280787467957,
-0.11638835817575455,
0.018778236582875252,
-0.09811142086982727,
-0.10115251690149307,
0.0695524588227272,
0.22914756834506989,
-0.04740443080663681,
0.012349228374660015,
-0.010256413370370865,
0.12980923056602478,
-0.12269014120101929,
-0.024545922875404358,
0.02142762951552868,
0.09326829016208649,
-0.006035180762410164,
0.11815761774778366,
-0.26191091537475586,
0.04911082610487938,
0.015354675240814686,
0.07671855390071869,
-0.018626896664500237,
0.056333042681217194,
-0.04087137058377266,
0.002877121791243553,
0.06110133230686188,
0.011539838276803493,
-0.06561896950006485,
-0.18830606341362,
-0.069451242685318,
0.01852244883775711,
0.060856401920318604,
-0.023548047989606857,
0.08316394686698914,
-0.019775820896029472,
0.04009100794792175,
-0.023965267464518547,
-0.15196402370929718,
-0.05352103337645531,
-0.14048033952713013,
-0.0249957162886858,
0.014342553913593292,
-0.02237115055322647,
-0.04412389174103737,
0.03803519159555435,
0.04742453619837761,
0.2565673887729645,
-0.129991352558136,
-0.10229148715734482,
-0.09520259499549866,
0.08458401262760162,
0.13624203205108643,
-0.09288088232278824,
0.03261334449052811,
0.01684148982167244,
0.06519319862127304,
-0.046337202191352844,
-0.06847614049911499,
0.036564718931913376,
-0.05291038006544113,
-0.08687857538461685,
-0.023399606347084045,
0.12399502843618393,
-0.0033320169895887375,
0.046324193477630615,
0.0018366097938269377,
-0.08673112839460373,
-0.04072188213467598,
-0.12673380970954895,
-0.07570791244506836,
-0.00449036993086338,
0.008632232435047626,
0.032491087913513184,
-0.07801477611064911,
0.08746831864118576,
-0.012864136137068272,
-0.08440011739730835,
0.07626516371965408,
0.17452509701251984,
-0.06289773434400558,
0.030138326808810234,
0.094627745449543,
-0.04655276983976364,
-0.1787334531545639,
-0.04135175049304962,
0.04871658980846405,
0.07795314490795135,
-0.052402544766664505,
-0.15610577166080475,
0.03662845492362976,
0.030992794781923294,
0.01760837249457836,
0.030183792114257812,
-0.31125518679618835,
-0.12878826260566711,
0.0012470970395952463,
0.06390891224145889,
0.07707639783620834,
-0.10122747719287872,
-0.051152702420949936,
-0.06976059824228287,
-0.021543309092521667,
0.06573638319969177,
0.0450790673494339,
0.12604382634162903,
-0.030968952924013138,
0.031029600650072098,
0.0419217124581337,
-0.025965817272663116,
0.07593907415866852,
-0.024203144013881683,
0.08096425235271454,
-0.01925245113670826,
0.038331370800733566,
0.05240525305271149,
-0.07187529653310776,
0.16464021801948547,
-0.1809806525707245,
0.0972168892621994,
-0.15247097611427307,
-0.04889987036585808,
-0.020558765158057213,
-0.003376164473593235,
-0.02035822719335556,
-0.055596377700567245,
-0.13978473842144012,
0.025716528296470642,
0.038753047585487366,
-0.017442695796489716,
0.04416407272219658,
-0.026793818920850754,
-0.061469949781894684,
0.08085977286100388,
0.056645218282938004,
-0.024424225091934204,
-0.12924525141716003,
0.03186885267496109,
0.02318268083035946,
0.08730653673410416,
-0.2152087688446045,
0.024470413103699684,
0.10174234956502914,
0.013920358382165432,
0.10774356871843338,
0.010322623886168003,
-0.09528211504220963,
0.031795673072338104,
0.07264707237482071,
-0.057391174137592316,
-0.0886320024728775,
-0.02337607927620411,
-0.031036224216222763,
-0.08874152600765228,
0.03007751703262329,
0.10951922088861465,
-0.05732273310422897,
-0.008976970799267292,
-0.003248668974265456,
0.022657064720988274,
-0.06877384334802628,
0.197302907705307,
0.021211009472608566,
0.0683070570230484,
-0.06739207357168198,
0.08117525279521942,
0.090835340321064,
-0.10244005173444748,
0.02000431902706623,
0.16626162827014923,
-0.08334505558013916,
-0.022678611800074577,
0.04437241703271866,
0.05722161754965782,
-0.03846652805805206,
-0.043623633682727814,
-0.09086540341377258,
-0.07269932329654694,
0.020268762484192848,
0.019759580492973328,
0.06114808842539787,
0.09192857146263123,
-0.032387036830186844,
0.02332925610244274,
-0.10217761248350143,
0.0933794230222702,
0.07492277771234512,
0.054080937057733536,
-0.13947874307632446,
0.13351930677890778,
0.0461609773337841,
0.08084522932767868,
-0.0021349936723709106,
0.028016895055770874,
-0.11644989252090454,
0.043238427489995956,
-0.022127823904156685,
0.03705284371972084,
-0.0017748342361301184,
0.04098406806588173,
-0.0350240021944046,
0.024185510352253914,
-0.027414174750447273,
0.04335174709558487,
-0.03616700693964958,
-0.03292807191610336,
-0.04794025421142578,
0.019255004823207855,
-0.06365015357732773,
-0.020172661170363426,
0.016239482909440994,
-0.08916232734918594,
0.11523483693599701,
-0.05522724241018295,
-0.0061470018699765205,
-0.0003791640338022262,
0.012689118273556232,
0.05965660512447357,
0.03378396853804588,
0.04402144253253937,
-0.014519975520670414,
0.006344643887132406,
0.03914522007107735,
0.011518611572682858,
-0.011998330242931843,
0.00006015413600835018,
0.06472203135490417,
-0.13737636804580688,
-0.07746745645999908,
-0.08871755748987198,
-0.05248020961880684,
-0.06562842428684235,
0.07728464901447296,
0.08708009123802185,
0.08316037803888321,
0.08376494795084,
-0.03227313980460167,
0.00444044079631567,
-0.15315520763397217,
-0.03687175363302231,
0.053779203444719315,
-0.02128031849861145,
-0.0720122754573822,
-0.03253159299492836,
0.06138232350349426,
-0.03278692066669464,
0.13266339898109436,
-0.0014924780698493123,
0.046726249158382416,
-0.013146918267011642,
-0.012590580619871616,
-0.004138162825256586,
0.014500847086310387,
0.18313074111938477,
-0.0839969739317894,
0.00781328696757555,
-0.007903137244284153,
0.0007965299882926047,
0.05310375988483429,
0.13071608543395996,
0.08942441642284393,
0.09780318289995193,
0.06430185586214066,
0.09647144377231598,
-0.047516487538814545,
-0.014982487075030804,
-0.14958325028419495,
0.07196817547082901,
-0.04348338022828102,
0.037807922810316086,
-0.03959818556904793,
0.13007640838623047,
0.11568743735551834,
-0.13381366431713104,
0.09328712522983551,
0.03548533469438553,
-0.10404494404792786,
-0.04780275747179985,
-0.10251624882221222,
-0.04768530651926994,
-0.12418095022439957,
0.006306982133537531,
-0.10483730584383011,
0.022559333592653275,
0.07374843209981918,
0.03802083060145378,
-0.022694526240229607,
0.14580631256103516,
-0.0034468742087483406,
-0.07126802206039429,
0.03470428287982941,
0.043079596012830734,
0.031067736446857452,
0.12566913664340973,
-0.00394415482878685,
0.07491349428892136,
-0.0503254160284996,
0.081763856112957,
0.02560785785317421,
0.020789075642824173,
0.022827938199043274,
0.010005722753703594,
-0.005885722115635872,
-0.054394789040088654,
-0.0005765488022007048,
0.06692011654376984,
0.14701706171035767,
0.04170829802751541,
-0.05066433176398277,
-0.04620414227247238,
0.20614399015903473,
-0.06349708884954453,
-0.058013204485177994,
-0.12404836714267731,
0.16183589398860931,
0.028872333467006683,
0.021521996706724167,
0.00355891278013587,
-0.08261246979236603,
-0.04471219703555107,
0.24625471234321594,
0.04965110123157501,
-0.05294377729296684,
-0.0352352038025856,
-0.015074544586241245,
-0.015596378594636917,
-0.025293108075857162,
0.15373894572257996,
0.01057824119925499,
0.2254069745540619,
0.011383134871721268,
0.0036891160998493433,
-0.02615421637892723,
-0.04559861496090889,
-0.028077907860279083,
0.19758179783821106,
-0.04619217291474342,
0.04169141501188278,
-0.10477056354284286,
-0.01729990541934967,
0.011120499111711979,
-0.12592750787734985,
0.0958603173494339,
-0.1135282814502716,
-0.07358261197805405,
0.025920456275343895,
0.0814419537782669,
-0.04419785365462303,
0.053191814571619034,
-0.016923628747463226,
0.06517151743173599,
0.026904959231615067,
-0.03256797045469284,
-0.11102356016635895,
-0.14232558012008667,
0.04269883781671524,
0.005124218296259642,
0.12849536538124084,
0.012136180885136127,
0.07170877605676651,
0.08666139841079712,
0.00988058466464281,
-0.08135053515434265,
0.0818830281496048,
0.023291422054171562,
-0.03569859638810158,
0.044487014412879944,
0.13531841337680817,
-0.04167445749044418,
0.14060743153095245,
0.017889322713017464,
-0.03265124931931496,
-0.036804743111133575,
-0.025635292753577232,
0.0068707638420164585,
-0.15998095273971558,
-0.011133951134979725,
-0.06293566524982452,
0.12430207431316376,
0.1804579496383667,
-0.04184411093592644,
-0.010752188973128796,
-0.04629243537783623,
0.0629618912935257,
-0.025694256648421288,
0.09512072056531906,
0.008812782354652882,
-0.16665947437286377,
-0.007207331247627735,
0.01348644308745861,
0.01046727690845728,
-0.18722175061702728,
-0.046427078545093536,
-0.03380231931805611,
-0.026938151568174362,
-0.08878279477357864,
0.1602897197008133,
0.0619325190782547,
0.028837338089942932,
-0.029770998284220695,
-0.16028881072998047,
-0.004669758025556803,
0.0406881719827652,
-0.13463087379932404,
-0.1189807802438736
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on a Lisp-inspired DSL programming language using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the Program Synthesis dataset.
## Intended uses & limitations
The model could be used to generate Lisp-inspired DSL code from human language task descriptions.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_program_synthese"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_program_synthese", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/program%20synthesis/base_model.ipynb).
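Because the pipeline forwards keyword arguments to `model.generate`, decoding can be tuned without changing the setup above. The values below are illustrative only, not the settings used in the reported evaluation:

```python
# Continuing from the snippet above: beam search with a longer
# output budget can make the generated DSL programs more consistent.
pipeline([tokenized_code], max_length=64, num_beams=4)
```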
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_base_program_synthese
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on a Lisp-inspired DSL programming language using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the Program Synthesis dataset.
Intended uses & limitations
---------------------------
The model could be used to generate Lisp-inspired DSL code from human language task descriptions.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
114
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10197243094444275,
0.052408263087272644,
-0.0008475699578411877,
0.07747628539800644,
0.1253296434879303,
0.0003255880146753043,
0.12152953445911407,
0.04511275142431259,
0.05444084852933884,
-0.05780824273824692,
0.11050792038440704,
0.17079880833625793,
-0.019645173102617264,
0.1482776701450348,
-0.027185792103409767,
-0.13108791410923004,
0.01266967412084341,
0.04069361090660095,
-0.14536556601524353,
0.12656627595424652,
0.1282953917980194,
-0.04141490161418915,
0.09692001342773438,
-0.03897906467318535,
-0.17773506045341492,
0.061040058732032776,
-0.02040889859199524,
-0.039826277643442154,
0.09927734732627869,
0.06610997021198273,
0.12230369448661804,
0.019010573625564575,
0.04727138578891754,
-0.2565366327762604,
0.018060119822621346,
-0.03919696807861328,
0.011944363825023174,
0.0250169076025486,
0.015064563602209091,
-0.0789656788110733,
0.14202935993671417,
-0.025330569595098495,
0.01100156269967556,
0.035046618431806564,
-0.07802321016788483,
-0.019268272444605827,
-0.038531847298145294,
-0.0380454920232296,
0.10299495607614517,
0.10869420319795609,
0.02419857867062092,
0.08626067638397217,
-0.1634482890367508,
0.145771324634552,
0.1363670825958252,
-0.17612147331237793,
-0.013410902582108974,
0.13485229015350342,
0.06834079325199127,
-0.03916085511445999,
-0.002011894015595317,
0.031909458339214325,
0.07825463265180588,
-0.010129458270967007,
-0.02607346698641777,
-0.11439294368028641,
-0.1276097595691681,
0.10080505162477493,
-0.09198480099439621,
-0.0371447391808033,
0.30969637632369995,
-0.006383324041962624,
-0.022124918177723885,
-0.049324147403240204,
-0.018734024837613106,
-0.003863545134663582,
-0.015873586758971214,
0.003010186366736889,
0.02470802329480648,
0.020994314923882484,
-0.013490252196788788,
-0.03829893097281456,
-0.08563075959682465,
-0.140803262591362,
0.005951819010078907,
0.1296750158071518,
0.00312270806171,
0.008569779805839062,
-0.14156359434127808,
0.10797200351953506,
0.15156155824661255,
-0.06517653912305832,
0.005838689394295216,
-0.05981644615530968,
-0.022410040721297264,
0.027636896818876266,
-0.09191755950450897,
-0.14533385634422302,
0.08579141646623611,
0.09138021618127823,
0.07797209918498993,
0.0498286634683609,
0.021464744582772255,
0.050120528787374496,
-0.0013990483712404966,
0.1754143238067627,
0.003032058710232377,
-0.06348174065351486,
0.045996565371751785,
-0.01634620502591133,
0.01493498869240284,
0.00037166595575399697,
-0.11040442436933517,
0.0043921698816120625,
-0.027445532381534576,
0.13297349214553833,
-0.08764024823904037,
0.09052393585443497,
-0.09865985065698624,
-0.004623937886208296,
-0.06520423293113708,
-0.17077411711215973,
0.005518765188753605,
-0.010281041264533997,
-0.06588198244571686,
-0.03328533470630646,
0.11219573765993118,
-0.05447700619697571,
-0.08251484483480453,
-0.021448086947202682,
-0.06478255987167358,
0.010386989451944828,
-0.11901751905679703,
-0.09427035599946976,
0.015622269362211227,
0.011459133587777615,
0.04744971916079521,
-0.10885083675384521,
-0.25637608766555786,
-0.01006782241165638,
0.07674853503704071,
0.024961352348327637,
0.049995213747024536,
-0.15544050931930542,
-0.06621075421571732,
-0.02437831088900566,
-0.015694936737418175,
-0.01468450017273426,
-0.0708550214767456,
0.08645149320363998,
0.07800460606813431,
0.041108205914497375,
-0.10081292688846588,
0.03892887383699417,
-0.12886470556259155,
0.04439535737037659,
-0.11212771385908127,
0.15730509161949158,
-0.0913442000746727,
0.12435916811227798,
-0.05935702845454216,
-0.04574967175722122,
0.03242708742618561,
0.09016111493110657,
0.07486328482627869,
0.1565067619085312,
-0.23173697292804718,
-0.03993069753050804,
0.1187950074672699,
-0.1168685331940651,
-0.16001427173614502,
0.0483824759721756,
-0.079312264919281,
0.1422954946756363,
0.1109020859003067,
0.20740720629692078,
0.16592885553836823,
-0.042271971702575684,
0.02538086473941803,
0.10786233097314835,
-0.018395163118839264,
-0.08602164685726166,
0.039129480719566345,
0.09537657350301743,
-0.11292758584022522,
0.08744866400957108,
-0.10144074261188507,
0.13592010736465454,
-0.018091244623064995,
-0.0631723701953888,
-0.03372138738632202,
-0.07666319608688354,
0.004444223828613758,
0.010765351355075836,
0.05478523299098015,
-0.03137199580669403,
-0.03696350380778313,
0.11295577138662338,
0.12403365224599838,
-0.12640461325645447,
0.0339149609208107,
-0.14328017830848694,
0.03899872303009033,
-0.04135704040527344,
0.027792086824774742,
-0.18247660994529724,
-0.03398760035634041,
-0.027753813192248344,
0.0879887044429779,
0.0769856795668602,
0.03627989813685417,
0.005133970640599728,
-0.040356945246458054,
0.016710743308067322,
-0.0058478424325585365,
0.013616083189845085,
-0.02611788548529148,
-0.03324611112475395,
-0.1107526496052742,
-0.046368494629859924,
-0.05707653611898422,
0.05121150240302086,
-0.15661777555942535,
-0.0003541907062754035,
0.03305904567241669,
0.09197414666414261,
0.002301138360053301,
0.024273119866847992,
0.020840546116232872,
0.08154038339853287,
-0.06970921158790588,
-0.02156679891049862,
0.05641176551580429,
-0.007618376985192299,
-0.15806855261325836,
0.030058180913329124,
-0.020100802183151245,
0.01072128489613533,
0.13407869637012482,
-0.1796996295452118,
-0.0219466183334589,
-0.008615524508059025,
-0.0036713408771902323,
0.010075420141220093,
0.0555516742169857,
-0.05590241774916649,
0.17373301088809967,
0.0013977307826280594,
0.13309480249881744,
-0.10639460384845734,
-0.05150628089904785,
-0.08927128463983536,
-0.02796814776957035,
0.007084365468472242,
0.15629664063453674,
0.08168385177850723,
-0.11759497225284576,
0.06663627177476883,
0.06925886124372482,
-0.04088655114173889,
0.1374179571866989,
-0.06645732372999191,
-0.015543279238045216,
-0.01202527154237032,
0.0927605852484703,
-0.04196268320083618,
0.10328168421983719,
-0.1598808616399765,
-0.06325890868902206,
-0.00108517671469599,
-0.015301755629479885,
0.09473638236522675,
-0.14856204390525818,
-0.0449524000287056,
0.017503894865512848,
-0.07646927982568741,
-0.08379952609539032,
0.030866919085383415,
-0.008089267648756504,
0.0457417368888855,
-0.03797382116317749,
-0.08149941265583038,
0.05498448386788368,
-0.0508771650493145,
-0.10683105885982513,
0.20699791610240936,
-0.12859393656253815,
-0.25635334849357605,
-0.1695123016834259,
-0.015389503911137581,
-0.04778007045388222,
0.011227809824049473,
0.04801603779196739,
-0.12533575296401978,
-0.05950907990336418,
-0.07312387973070145,
0.08811221271753311,
-0.03378482908010483,
-0.04928421229124069,
0.013856424018740654,
0.049982331693172455,
-0.01732848398387432,
-0.2123628407716751,
-0.0008907285518944263,
-0.0026342396158725023,
0.05923747271299362,
-0.007909467443823814,
-0.12362751364707947,
0.10857321321964264,
0.12512587010860443,
-0.007135148625820875,
0.04666110500693321,
-0.018486754968762398,
0.18555693328380585,
-0.04408169165253639,
-0.09573926776647568,
0.15024474263191223,
-0.08737480640411377,
-0.0004103745159227401,
0.061759669333696365,
0.017391400411725044,
-0.16445422172546387,
0.04608149081468582,
-0.009712813422083855,
-0.06816675513982773,
-0.24986611306667328,
-0.13600021600723267,
-0.09490014612674713,
0.05311083793640137,
-0.025729242712259293,
0.04038788750767708,
-0.13025671243667603,
0.046638187021017075,
0.06914054602384567,
0.08670233190059662,
-0.0016955804312601686,
0.05550495162606239,
0.1354140192270279,
-0.03533070534467697,
0.03275904059410095,
-0.11511999368667603,
-0.08222530782222748,
0.025858262553811073,
0.057777415961027145,
0.16374611854553223,
-0.018701422959566116,
0.15635572373867035,
0.06974513828754425,
0.020909497514367104,
0.008458435535430908,
0.12894085049629211,
-0.0662231594324112,
0.009237287566065788,
0.011704185977578163,
-0.03914977237582207,
-0.13555003702640533,
0.05263596028089523,
-0.08089753985404968,
0.02884642407298088,
-0.08261779695749283,
0.006605192087590694,
0.06763255596160889,
0.039448678493499756,
0.01097510289400816,
-0.2607683539390564,
-0.023683497682213783,
0.06192915514111519,
-0.05438767373561859,
-0.02656654827296734,
0.08554151654243469,
0.11243930459022522,
-0.04026053473353386,
0.03151124715805054,
-0.0003430916403885931,
0.14614629745483398,
-0.06050194799900055,
0.00730432802811265,
-0.03483608365058899,
-0.02479305863380432,
0.043128352612257004,
0.15735647082328796,
-0.25901830196380615,
0.19615206122398376,
0.00368595402687788,
0.027927985414862633,
-0.047392670065164566,
0.02793155424296856,
-0.0031717848032712936,
0.06214221566915512,
0.12785594165325165,
-0.01693674363195896,
0.004450258333235979,
-0.14879870414733887,
-0.024025801569223404,
0.06708026677370071,
0.09864993393421173,
-0.031540174037218094,
0.06519901752471924,
-0.026578368619084358,
0.01539810374379158,
0.009934237226843834,
-0.07631414383649826,
-0.0735655277967453,
-0.14148831367492676,
-0.008794756606221199,
0.03341185301542282,
0.035031482577323914,
-0.037862129509449005,
0.00243186904117465,
-0.016989804804325104,
0.24331818521022797,
-0.06602924317121506,
-0.08821844309568405,
-0.08665511757135391,
0.028588445857167244,
0.11111630499362946,
-0.05906679108738899,
0.003492946969345212,
0.04273160919547081,
-0.017346318811178207,
0.01705705188214779,
-0.14770086109638214,
0.03008114919066429,
-0.042512767016887665,
0.01994442567229271,
-0.042993731796741486,
0.13908495008945465,
-0.0039034434594213963,
-0.0027746609412133694,
0.05009292811155319,
-0.0616571381688118,
-0.030462082475423813,
-0.14343813061714172,
-0.136999249458313,
0.03953826427459717,
-0.008757945150136948,
0.09080782532691956,
-0.1442836970090866,
0.09390941262245178,
-0.02140532061457634,
-0.02187572792172432,
0.20965741574764252,
0.14467096328735352,
-0.04322554171085358,
0.04114757850766182,
0.05901169031858444,
-0.11194127798080444,
-0.2711361050605774,
0.0004447541432455182,
-0.027758464217185974,
0.05712663382291794,
-0.009584333747625351,
-0.16833461821079254,
0.14305497705936432,
-0.05865904688835144,
0.03624743968248367,
0.015608653426170349,
-0.29508712887763977,
-0.09234601259231567,
0.13854126632213593,
0.09086595475673676,
0.10687554627656937,
-0.12011826783418655,
-0.06777264922857285,
-0.11218960583209991,
-0.16235864162445068,
0.16550643742084503,
-0.0031450947280973196,
0.08291763067245483,
0.012506026774644852,
0.08699990808963776,
0.035920482128858566,
-0.025492243468761444,
0.10692226886749268,
0.03777278587222099,
0.13838104903697968,
-0.050515513867139816,
-0.0874079242348671,
0.05728909373283386,
-0.029163165017962456,
0.15835069119930267,
-0.14096644520759583,
0.08286964893341064,
-0.184068500995636,
-0.03315519541501999,
-0.008597043342888355,
0.04535728320479393,
0.008510200306773186,
-0.050143346190452576,
-0.11108126491308212,
-0.009060479700565338,
0.019504647701978683,
0.025747127830982208,
0.13587945699691772,
-0.03833925351500511,
-0.035332195460796356,
0.016482064500451088,
0.19673019647598267,
-0.040642380714416504,
0.008118872530758381,
0.0448707677423954,
0.011631980538368225,
0.08622443675994873,
-0.1825338453054428,
0.050514791160821915,
0.13160361349582672,
0.042131535708904266,
0.11230851709842682,
0.0803757756948471,
-0.020584065467119217,
-0.0035434223245829344,
0.10809065401554108,
-0.1389201134443283,
-0.05402104929089546,
-0.07009914517402649,
-0.058827389031648636,
0.009431853890419006,
0.08290817588567734,
0.16295398771762848,
-0.04016874358057976,
0.007245820481330156,
0.013051774352788925,
-0.008008076809346676,
-0.13035504519939423,
0.12483188509941101,
0.053820930421352386,
0.03329365327954292,
-0.11941368132829666,
0.08117605000734329,
-0.004625940695405006,
-0.07370613515377045,
0.007058634422719479,
0.09430256485939026,
-0.12076801061630249,
-0.07085883617401123,
-0.0473967045545578,
0.27106279134750366,
-0.0838240385055542,
-0.07864529639482498,
-0.13255976140499115,
-0.08041581511497498,
0.006529126316308975,
0.20609739422798157,
0.1188986599445343,
0.12334427237510681,
-0.05537223815917969,
0.0021504175383597612,
-0.06442450731992722,
0.06625306606292725,
0.10409357398748398,
0.021915771067142487,
-0.15778064727783203,
0.09556083381175995,
-0.022583147510886192,
0.10671664774417877,
-0.061286695301532745,
-0.016187211498618126,
-0.16115480661392212,
0.10083941370248795,
-0.12014411389827728,
0.04169413819909096,
-0.04675066098570824,
0.04090399295091629,
0.029208514839410782,
-0.004866926930844784,
-0.04088929668068886,
0.03521238639950752,
-0.10048376768827438,
0.0022348077036440372,
0.0195778738707304,
0.05440254509449005,
-0.11296883225440979,
-0.03383845090866089,
0.05579494312405586,
-0.05520040914416313,
0.06943847239017487,
0.046635303646326065,
-0.06726484000682831,
0.09916603565216064,
-0.2313963770866394,
-0.012264705263078213,
0.0935153067111969,
0.058173686265945435,
0.02197861112654209,
-0.008072096854448318,
0.014838019385933876,
0.07343710958957672,
0.03485919535160065,
0.005209240596741438,
0.10351873934268951,
-0.11608358472585678,
-0.07642608880996704,
-0.0497170127928257,
-0.08639748394489288,
-0.0288887657225132,
0.03400222957134247,
0.11523039638996124,
0.12857235968112946,
0.13156287372112274,
-0.005415842402726412,
0.006243838928639889,
-0.06541134417057037,
-0.006029303651303053,
0.034586112946271896,
-0.05670313537120819,
-0.10617318749427795,
-0.09038481116294861,
0.029347913339734077,
-0.07310518622398376,
0.21806177496910095,
0.006932105869054794,
0.012860793620347977,
-0.03020663745701313,
0.014995740726590157,
0.11369605362415314,
0.04432497173547745,
0.25085484981536865,
-0.013540809042751789,
0.06856530904769897,
-0.0524868443608284,
0.0358741320669651,
0.05230825021862984,
0.04725231975317001,
0.07009942829608917,
0.1255354881286621,
-0.01932981237769127,
0.13328178226947784,
0.013123364187777042,
0.022074181586503983,
-0.027415426447987556,
-0.09323026239871979,
0.03749207407236099,
0.04320749267935753,
-0.04970518499612808,
0.20338140428066254,
0.043025903403759,
-0.07681310921907425,
0.07950281351804733,
0.037588492035865784,
-0.10256728529930115,
-0.08256936818361282,
-0.06935223191976547,
-0.050860900431871414,
-0.1524939090013504,
0.018674882128834724,
-0.13458721339702606,
-0.04580722376704216,
0.0350465402007103,
0.020323842763900757,
-0.047875452786684036,
0.16603900492191315,
-0.0689837783575058,
-0.05053047835826874,
0.03106723539531231,
-0.028366032987833023,
-0.01750113256275654,
-0.0032647172920405865,
0.0430578775703907,
-0.0011571983341127634,
-0.005359356757253408,
-0.016736097633838654,
0.05691055953502655,
-0.00883474387228489,
0.009235511533915997,
-0.06655067950487137,
-0.005006698425859213,
-0.05706116184592247,
0.061824552714824677,
0.04027272015810013,
0.0839872658252716,
0.035225220024585724,
-0.07189159840345383,
-0.00456170504912734,
0.20008207857608795,
-0.06489595770835876,
-0.08344591408967972,
-0.09145213663578033,
0.18612858653068542,
0.012102172710001469,
0.082295261323452,
-0.03969011455774307,
-0.06856589764356613,
-0.04355476051568985,
0.2825552523136139,
0.19163374602794647,
-0.08958663791418076,
0.0015211064601317048,
-0.014673097990453243,
0.019053669646382332,
-0.041346438229084015,
0.1186487153172493,
0.03864984214305878,
0.23552769422531128,
-0.010652695782482624,
-0.10219451040029526,
-0.08240645378828049,
-0.04094749689102173,
0.012977763079106808,
0.10186095535755157,
0.002361572813242674,
-0.07467856258153915,
-0.05607173964381218,
0.08497972786426544,
-0.18731757998466492,
-0.03410477191209793,
0.03545243293046951,
-0.1314840018749237,
-0.0592392235994339,
-0.040641192346811295,
0.07580254971981049,
-0.049782559275627136,
0.02825554460287094,
-0.048431478440761566,
0.0090985968708992,
0.04135182872414589,
-0.00047955120680853724,
-0.1742834895849228,
-0.08761770278215408,
0.03646211326122284,
-0.05053464323282242,
0.08978638797998428,
-0.02088901400566101,
0.13620267808437347,
0.08675394207239151,
0.04930444061756134,
-0.02658168599009514,
0.07340103387832642,
0.061765048652887344,
-0.008376398123800755,
0.06630877405405045,
0.01970832236111164,
-0.048169828951358795,
0.1737658530473709,
-0.012114029377698898,
-0.036259256303310394,
0.04677837714552879,
0.034324776381254196,
-0.045176904648542404,
-0.15930160880088806,
-0.04314899444580078,
-0.14656077325344086,
0.0937579795718193,
0.14941558241844177,
-0.04021914675831795,
0.014652356505393982,
-0.0638158991932869,
0.10716073215007782,
-0.03579201176762581,
-0.03805633261799812,
-0.031066307798027992,
-0.16093981266021729,
-0.045237742364406586,
0.11294134706258774,
-0.011064646765589714,
-0.1615082174539566,
-0.013740132562816143,
-0.06120913103222847,
-0.009671958163380623,
-0.030894817784428596,
0.11232427507638931,
0.15480583906173706,
0.041804637759923935,
-0.020118655636906624,
-0.1642126739025116,
0.02128474786877632,
0.0668124333024025,
-0.10598921030759811,
-0.13764208555221558
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on programming language lisp inspired DSL using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/program%20synthesis/base_model.ipynb).
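Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. A minimal sketch of the same call with the maintained seq2seq auto class (`device=-1` for CPU is an assumption for machines without a GPU; keep `device=0` on GPU as above):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask", skip_special_tokens=True),
    device=-1,  # CPU; use device=0 for the first GPU as in the snippet above
)
print(pipeline(["you are given an array of numbers a and a number b , compute the difference of elements in a and b"]))
```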
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
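The inverse square root schedule referenced above follows the standard T5 recipe. A minimal sketch, assuming a 10,000-step warmup (the card does not state the warmup length):
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1/sqrt(warmup_steps) during warmup, then decays as 1/sqrt(step).
    # warmup_steps = 10,000 is an assumption; the card does not state it.
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1))        # 0.01 throughout warmup
print(inverse_sqrt_lr(360_000))  # ~0.00167 at the end of pre-training
```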
## Evaluation results
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_base_program_synthese_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on programming language lisp inspired DSL using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.13456664979457855,
-0.007191256154328585,
-0.0006199516355991364,
0.14608000218868256,
0.10420586913824081,
0.01699300855398178,
0.0920940563082695,
0.0523594468832016,
-0.01570509746670723,
0.010495971888303757,
0.057281773537397385,
0.03522452712059021,
0.00617870781570673,
0.18224474787712097,
0.0043756612576544285,
-0.11733466386795044,
-0.002398913959041238,
0.024785595014691353,
-0.06173918396234512,
0.12446107715368271,
0.0964890718460083,
-0.07284894585609436,
0.04759537801146507,
-0.06544472277164459,
-0.2093069702386856,
0.05077285319566727,
-0.013985035941004753,
-0.04350816085934639,
0.09752117097377777,
0.04567253962159157,
0.1281091570854187,
-0.008603371679782867,
0.05116443708539009,
-0.15779192745685577,
0.0026396820321679115,
0.009429767727851868,
0.03569260984659195,
0.00777116883546114,
0.047834500670433044,
0.02928060293197632,
0.13934975862503052,
-0.0011867958819493651,
0.046997588127851486,
0.050841789692640305,
-0.05743628740310669,
-0.09699534624814987,
-0.027151115238666534,
0.015785453841090202,
0.05806148052215576,
0.1176188513636589,
-0.006940316874533892,
0.11438919603824615,
-0.16156935691833496,
0.13433851301670074,
0.1229969710111618,
-0.23062053322792053,
-0.009971392340958118,
0.10595984756946564,
0.07355636358261108,
0.08703984320163727,
-0.035122159868478775,
-0.04932288080453873,
0.0951969102025032,
0.039855461567640305,
0.015662122517824173,
-0.08392516523599625,
-0.0766746923327446,
0.0405559316277504,
-0.10524983704090118,
-0.051774654537439346,
0.228317990899086,
0.005431030411273241,
-0.06805417686700821,
-0.07377641648054123,
-0.02573682926595211,
-0.1238715797662735,
0.0273640938103199,
0.02782776579260826,
0.01241381000727415,
-0.023879779502749443,
0.02413294091820717,
0.02012128010392189,
-0.08275188505649567,
-0.1509687453508377,
0.028220128268003464,
0.12728355824947357,
0.05863650515675545,
0.020917965099215508,
-0.08043736964464188,
0.1140846237540245,
0.08375067263841629,
-0.04711665213108063,
-0.03255140781402588,
-0.023327190428972244,
-0.11304763704538345,
0.05067028850317001,
-0.07478957623243332,
-0.16888432204723358,
0.004576494451612234,
0.03861164674162865,
0.02311522327363491,
0.047993410378694534,
0.032853174954652786,
0.017578547820448875,
0.009355791844427586,
0.19876258075237274,
0.03876504674553871,
-0.09297295659780502,
0.051977287977933884,
0.04726047068834305,
-0.01584176905453205,
-0.022327207028865814,
-0.08259157091379166,
-0.07797057926654816,
0.06269541382789612,
0.1012682393193245,
-0.12981471419334412,
0.053068578243255615,
-0.07643868774175644,
-0.03733622282743454,
-0.027656884863972664,
-0.17784501612186432,
0.010819006711244583,
0.01850234903395176,
-0.06095121055841446,
-0.03366825729608536,
0.10049501061439514,
-0.15843957662582397,
-0.13872994482517242,
-0.02863571047782898,
-0.07265765219926834,
-0.035732366144657135,
-0.1615576595067978,
-0.15039201080799103,
-0.012094276025891304,
-0.04645480588078499,
0.008932994678616524,
-0.07548857480287552,
-0.1819656938314438,
-0.023667076602578163,
0.02744327299296856,
0.016923170536756516,
0.006780892610549927,
-0.10495580732822418,
-0.02635594643652439,
-0.01970265619456768,
-0.03156306594610214,
-0.006910468451678753,
-0.045666441321372986,
0.11904596537351608,
0.10748810321092606,
0.03411756083369255,
-0.03362797573208809,
0.05053771287202835,
-0.07514876127243042,
0.04766650125384331,
-0.08092440664768219,
0.12721918523311615,
-0.09153168648481369,
0.06849022209644318,
-0.010863261297345161,
-0.10631908476352692,
0.07265684008598328,
0.06964123249053955,
0.06793496012687683,
0.055418554693460464,
-0.1663421094417572,
-0.029864192008972168,
0.16765733063220978,
-0.12885645031929016,
-0.11315455287694931,
0.09987054765224457,
-0.051378872245550156,
0.05306757614016533,
0.09980566054582596,
0.13902726769447327,
0.16458094120025635,
-0.042597897350788116,
0.00940462201833725,
0.06299088895320892,
0.05931074544787407,
-0.12463750690221786,
0.06895051896572113,
0.07558798044919968,
-0.07796056568622589,
0.07604645192623138,
-0.06655892729759216,
0.11072733253240585,
-0.004479775670915842,
-0.037380121648311615,
-0.05828871950507164,
-0.07879412174224854,
-0.003080896334722638,
0.012198020704090595,
0.05697259679436684,
-0.08559636026620865,
-0.0800083726644516,
0.10364829003810883,
0.17675654590129852,
-0.13584432005882263,
0.012198113836348057,
-0.09438583254814148,
0.03971001133322716,
-0.04309924319386482,
0.01922665722668171,
-0.16136473417282104,
0.00625877920538187,
0.05518433451652527,
0.01582203432917595,
0.0799170657992363,
0.1285219043493271,
0.012876283377408981,
0.03298410773277283,
0.0040708682499825954,
-0.014620883390307426,
-0.11750531941652298,
-0.0598110556602478,
-0.061668287962675095,
-0.06955473870038986,
-0.09035556763410568,
-0.06768839061260223,
-0.0005177783314138651,
-0.18169307708740234,
0.006809660233557224,
0.0030575927812606096,
0.017823083326220512,
0.013692001812160015,
-0.016550522297620773,
0.012051665224134922,
0.07997997850179672,
-0.06479240953922272,
-0.03657270967960358,
0.02330806851387024,
0.011382217518985271,
-0.0728437677025795,
-0.06351817399263382,
-0.07383383065462112,
-0.00375765236094594,
0.11912967264652252,
0.017748722806572914,
-0.06050065532326698,
0.023555034771561623,
-0.012137450277805328,
-0.029007617384195328,
0.032730281352996826,
-0.06850718706846237,
0.15548396110534668,
0.006576015148311853,
0.17837369441986084,
-0.1584976613521576,
-0.04274630919098854,
-0.050588544458150864,
0.016868703067302704,
0.043262653052806854,
0.14460304379463196,
-0.0017363205552101135,
-0.06089627370238304,
0.06151844188570976,
0.013117248192429543,
-0.09167694300413132,
0.1948029100894928,
-0.0630667507648468,
-0.08886035531759262,
0.023339681327342987,
0.10888900607824326,
-0.018283456563949585,
0.14198532700538635,
-0.1649007946252823,
-0.03475594520568848,
0.013113726861774921,
0.0053505077958106995,
0.06443733721971512,
-0.1399994194507599,
-0.01074288971722126,
0.015239689499139786,
-0.08291222155094147,
-0.07117799669504166,
-0.010658099316060543,
-0.010049900971353054,
0.042433999478816986,
-0.023112768307328224,
-0.0453108474612236,
0.0236212108284235,
-0.04450305923819542,
-0.10035252571105957,
0.21269211173057556,
-0.12137924879789352,
-0.21779029071331024,
-0.19640031456947327,
0.06827358156442642,
-0.07928279042243958,
-0.001967580523341894,
0.026061885058879852,
-0.11467849463224411,
-0.06649000942707062,
-0.06882230937480927,
0.16866432130336761,
-0.07108107954263687,
-0.018110809847712517,
0.004124469589442015,
0.06392373144626617,
0.0008817054331302643,
-0.22573399543762207,
0.03712141513824463,
-0.019703716039657593,
-0.009517760947346687,
-0.00829650741070509,
-0.0828307494521141,
0.08743701875209808,
0.16769179701805115,
-0.053671516478061676,
0.028150349855422974,
0.007127732038497925,
0.18351303040981293,
-0.029351316392421722,
-0.07220721989870071,
0.14291170239448547,
-0.024723419919610023,
-0.004108701832592487,
0.018772931769490242,
0.00012059866276104003,
-0.11814778298139572,
0.059323105961084366,
0.007741468492895365,
-0.026460373774170876,
-0.2759576737880707,
-0.03829753398895264,
-0.08628573268651962,
0.03157963976264,
0.016252949833869934,
0.048179008066654205,
-0.10480286926031113,
0.023763330653309822,
0.04674057289958,
0.11772433668375015,
0.002807255368679762,
0.0391690619289875,
0.10423300415277481,
-0.01605362445116043,
0.02334575168788433,
-0.10125433653593063,
-0.01341644860804081,
0.06718048453330994,
0.08189549297094345,
0.258888840675354,
-0.10788584500551224,
0.1952972561120987,
0.04503326117992401,
0.048135895282030106,
0.02652142010629177,
0.13860797882080078,
-0.10406896471977234,
0.026439746841788292,
0.010708716697990894,
-0.013674955815076828,
-0.11477870494127274,
0.027058877050876617,
-0.06201846897602081,
0.07106950879096985,
-0.09595733135938644,
-0.0242453645914793,
0.00866161659359932,
0.12094521522521973,
0.03783602640032768,
-0.22169752418994904,
-0.08822465687990189,
0.02827947773039341,
-0.0792447179555893,
-0.08767662942409515,
0.07854819297790527,
0.2086639553308487,
-0.04099290445446968,
-0.016625218093395233,
-0.002501708921045065,
0.12393627315759659,
-0.04717943072319031,
-0.030908502638339996,
-0.034952860325574875,
0.06547635793685913,
0.030168719589710236,
0.1389293074607849,
-0.2898772656917572,
0.12688688933849335,
-0.00877261720597744,
0.05979544296860695,
-0.03930359706282616,
0.059009213000535965,
-0.04120656102895737,
0.05716261640191078,
0.051810551434755325,
-0.013716591522097588,
0.04559982568025589,
-0.16043898463249207,
-0.012056428007781506,
0.03366631641983986,
0.05640709772706032,
0.04527425020933151,
0.07186421751976013,
-0.010745142586529255,
0.05058002099394798,
0.004927703645080328,
-0.12422257661819458,
-0.06911949813365936,
-0.11207716912031174,
-0.007581177167594433,
-0.02612835355103016,
-0.0408293791115284,
-0.05418625473976135,
-0.027510184794664383,
0.01275599841028452,
0.21850666403770447,
-0.07851653546094894,
-0.08849029242992401,
-0.0771162286400795,
0.04378728196024895,
0.12727926671504974,
-0.07474871724843979,
0.03336961567401886,
0.015311469323933125,
0.015072766691446304,
-0.0066225859336555,
-0.09355060756206512,
0.03888764604926109,
-0.026936687529087067,
-0.06749899685382843,
-0.01407704595476389,
0.08745981007814407,
0.015693524852395058,
0.021218575537204742,
-0.0009643891826272011,
-0.08627928048372269,
-0.033656079322099686,
-0.12635564804077148,
-0.1332942545413971,
-0.009207617491483688,
-0.010678845457732677,
0.0663459524512291,
-0.1445085108280182,
-0.010114041157066822,
-0.01414887048304081,
-0.02660692296922207,
0.13048622012138367,
0.1795252114534378,
-0.058450039476156235,
0.04424862191081047,
0.10575209558010101,
-0.06417298316955566,
-0.19365209341049194,
0.01979241333901882,
0.052350472658872604,
0.1057952269911766,
-0.041451577097177505,
-0.18074551224708557,
0.0759415328502655,
-0.003062970470637083,
0.037573494017124176,
0.05132525786757469,
-0.3153302073478699,
-0.12153391540050507,
0.10419946163892746,
0.1420249044895172,
0.0955050140619278,
-0.11006872355937958,
-0.03345013037323952,
-0.06803207099437714,
-0.11509036272764206,
0.10252086073160172,
0.009791014716029167,
0.12680067121982574,
-0.05095939710736275,
0.046481214463710785,
0.03973780572414398,
-0.033938635140657425,
0.07074432075023651,
0.03670544549822807,
0.13759243488311768,
-0.05010705441236496,
0.02108566090464592,
0.10845252126455307,
-0.03379160910844803,
0.17756107449531555,
-0.15277841687202454,
0.09892378002405167,
-0.2064133882522583,
-0.06998364627361298,
-0.05961025506258011,
0.012968582101166248,
-0.029084263369441032,
-0.04650741070508957,
-0.0961485281586647,
0.01851830445230007,
-0.010818497277796268,
-0.007940009236335754,
0.04938822239637375,
-0.023357955738902092,
-0.0447831004858017,
0.06304518133401871,
0.13056039810180664,
-0.006805849261581898,
-0.050356701016426086,
0.043946173042058945,
0.047488465905189514,
0.09423907101154327,
-0.16582618653774261,
0.016242176294326782,
0.11605623364448547,
0.02539222501218319,
0.11604227870702744,
0.05245106667280197,
-0.1027093231678009,
0.017856240272521973,
0.08989857882261276,
-0.08657368272542953,
-0.0747642070055008,
-0.026843858882784843,
-0.09927214682102203,
-0.060123372822999954,
0.061412371695041656,
0.11063096672296524,
-0.03885024040937424,
-0.008560167625546455,
-0.02003789320588112,
-0.021737249568104744,
-0.10709141939878464,
0.18563567101955414,
0.07732291519641876,
0.06481561809778214,
-0.07894496619701385,
0.06139132007956505,
0.0584745854139328,
-0.06402850896120071,
0.02000207081437111,
0.16845129430294037,
-0.09637965261936188,
-0.04543892294168472,
0.050941601395606995,
0.19613783061504364,
-0.018429379910230637,
-0.04450828954577446,
-0.1302744299173355,
-0.07723451405763626,
0.039798278361558914,
0.1595718413591385,
0.10762196034193039,
0.10466574877500534,
-0.05440249666571617,
0.013956807553768158,
-0.08928117901086807,
0.0915234386920929,
0.076499804854393,
0.04966343194246292,
-0.14486919343471527,
0.13293568789958954,
0.022881347686052322,
0.10008145123720169,
-0.03508032485842705,
-0.006022102199494839,
-0.12164484709501266,
0.07074373960494995,
-0.09576719254255295,
0.024219445884227753,
-0.0038325327914208174,
0.06423600763082504,
-0.014525331556797028,
-0.004826871212571859,
-0.022192727774381638,
0.06406215578317642,
-0.09545068442821503,
-0.00299659906886518,
0.013420728035271168,
0.040585536509752274,
-0.0745001807808876,
-0.033229272812604904,
0.02743067406117916,
-0.0940341055393219,
0.1221097856760025,
-0.010534416884183884,
-0.04309634491801262,
0.08546154201030731,
-0.08112376183271408,
0.04695885255932808,
0.036331336945295334,
0.07655510306358337,
0.0020888291765004396,
0.03292374685406685,
0.06384102255105972,
0.05245576426386833,
0.05842231214046478,
0.026193344965577126,
0.1162404790520668,
-0.13376091420650482,
-0.08815933018922806,
-0.05859161540865898,
-0.10436175763607025,
-0.05211472511291504,
0.0974702313542366,
0.08223049342632294,
0.1177746132016182,
0.1114315390586853,
-0.020573068410158157,
0.007827975787222385,
-0.12644989788532257,
-0.054816871881484985,
0.029115252196788788,
-0.024740496650338173,
-0.1016441211104393,
-0.06560496985912323,
0.0441291369497776,
-0.03530194237828255,
0.14862608909606934,
0.0017777836183086038,
0.0034941763151437044,
-0.035546690225601196,
-0.02831602282822132,
0.04845970496535301,
0.020634761080145836,
0.23393605649471283,
-0.0722682997584343,
0.04888933151960373,
0.006443624850362539,
-0.004139772150665522,
0.016756633296608925,
0.1064532995223999,
0.11327299475669861,
0.15539948642253876,
-0.03252706676721573,
0.10550776869058609,
0.023921281099319458,
-0.010728077962994576,
-0.0664728656411171,
-0.006053828168660402,
0.01401799451559782,
0.053786784410476685,
-0.05795758217573166,
0.21888196468353271,
0.04907657951116562,
-0.09612054377794266,
0.10326340049505234,
0.04182076454162598,
-0.13282868266105652,
-0.05206849426031113,
-0.015064803883433342,
-0.032412394881248474,
-0.15155476331710815,
0.035108305513858795,
-0.13472631573677063,
-0.018728934228420258,
0.04146914556622505,
0.056886881589889526,
-0.07165025919675827,
0.16183601319789886,
0.0023359956685453653,
-0.04717744514346123,
0.04068432375788689,
-0.009229474700987339,
0.006744795013219118,
0.031900741159915924,
0.01716822385787964,
0.03455119952559471,
-0.022387610748410225,
0.030254734680056572,
0.023097390308976173,
-0.02127675525844097,
-0.008207105100154877,
-0.017905253916978836,
0.012673189863562584,
-0.02424827218055725,
0.03103085421025753,
0.07205895334482193,
0.19353051483631134,
0.03339528664946556,
-0.09507738053798676,
-0.030303167179226875,
0.1578601896762848,
-0.04877380281686783,
-0.10527972131967545,
-0.1069193035364151,
0.13652372360229492,
0.030530638992786407,
0.030459707602858543,
0.00422641122713685,
-0.07937349379062653,
-0.057154636830091476,
0.20991162955760956,
0.07078614085912704,
-0.040150657296180725,
-0.02166276052594185,
0.0021549290977418423,
-0.00016443799540866166,
-0.06000461056828499,
0.19083146750926971,
0.023434586822986603,
0.2462959736585617,
0.026216629892587662,
-0.03517293184995651,
-0.0788743868470192,
-0.029853561893105507,
-0.0036315680481493473,
0.1259911209344864,
-0.04441870003938675,
-0.04958800971508026,
-0.08446070551872253,
0.0015344879357144237,
-0.01699850894510746,
-0.050533078610897064,
0.08553465455770493,
-0.11629894375801086,
-0.09200379997491837,
-0.03278293460607529,
0.05509257689118385,
-0.07132399082183838,
0.01229003444314003,
-0.03068923018872738,
0.044691506773233414,
0.0746365487575531,
-0.03594662621617317,
-0.12620887160301208,
-0.139256089925766,
0.08888211846351624,
-0.04851995036005974,
0.12254888564348221,
-0.005366608500480652,
0.1623135358095169,
0.08204635232686996,
0.04400712624192238,
-0.05481716990470886,
0.1145511046051979,
0.03177579119801521,
0.029270196333527565,
0.05428583547472954,
0.09971751272678375,
-0.056439027190208435,
0.14359165728092194,
-0.038339052349328995,
-0.012805215083062649,
-0.007569316774606705,
-0.052640605717897415,
-0.03575115278363228,
-0.17635639011859894,
-0.02430296503007412,
-0.12441997230052948,
0.10051890462636948,
0.1814233809709549,
-0.03400217741727829,
-0.02997766248881817,
-0.08828228712081909,
0.0965811014175415,
-0.0342867337167263,
0.05403931066393852,
-0.01987680234014988,
-0.18562427163124084,
-0.015952179208397865,
0.05422009155154228,
0.015628576278686523,
-0.2405683696269989,
-0.019794775173068047,
-0.035113319754600525,
-0.02820410206913948,
-0.06965690106153488,
0.15552181005477905,
0.09953438490629196,
0.04134758189320564,
-0.032967034727334976,
-0.14818502962589264,
-0.023864131420850754,
0.05992252007126808,
-0.1420575976371765,
-0.12712594866752625
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on programming language lisp inspired DSL using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the program synthesis task for the lisp inspired DSL code.
## Intended uses & limitations
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/program%20synthesis/base_model.ipynb).
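Since the pipeline accepts a list, several descriptions can be synthesized in one call. A short usage sketch, assuming `pipeline` was constructed as above and that each output carries the pipeline's standard `summary_text` key:
```python
# Assumes `pipeline` was constructed as in the snippet above.
descriptions = [
    "you are given an array of numbers a and a number b , compute the difference of elements in a and b",
    "you are given an array of numbers a , compute the sum of elements in a",
]
for desc, out in zip(descriptions, pipeline(descriptions)):
    print(desc, "->", out["summary_text"])
```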
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 30,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp inspired DSL data.
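`transformers` ships an `Adafactor` implementation, so a fine-tuning setup in the spirit of this card can be sketched as below; the exact hyperparameters are not published, so the values shown are assumptions rather than the authors' configuration:
```python
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor

model = AutoModelForSeq2SeqLM.from_pretrained(
    "SEBIS/code_trans_t5_base_program_synthese_multitask_finetune"
)
# Hyperparameter values below are assumptions, not the authors' configuration.
optimizer = Adafactor(
    model.parameters(),
    lr=None,             # with relative_step=True the rate is derived per step
    relative_step=True,  # yields an inverse square root style schedule
    warmup_init=True,
    scale_parameter=True,
)
```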
## Evaluation results
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_base_program_synthese_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on programming language lisp inspired DSL using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the program synthesis task for the lisp inspired DSL code.
Intended uses & limitations
---------------------------
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 30,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp inspired DSL data.
Evaluation results
------------------
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 30,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 30,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
88,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 30,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10718099772930145,
0.06920895725488663,
-0.001527317683212459,
0.10269445925951004,
0.060810040682554245,
0.013246600516140461,
0.10274489223957062,
0.07606281340122223,
-0.03749335557222366,
0.05015675723552704,
0.08078916370868683,
0.024942101910710335,
0.044152263551950455,
0.19125477969646454,
0.04308801144361496,
-0.14722536504268646,
0.002787339501082897,
0.030317699536681175,
-0.05621819198131561,
0.11729216575622559,
0.09298983216285706,
-0.08797021210193634,
0.06667208671569824,
-0.04810154438018799,
-0.13922268152236938,
0.05455891042947769,
-0.06210194155573845,
-0.02594584785401821,
0.06948014348745346,
0.05350455269217491,
0.09620825201272964,
-0.01970553956925869,
0.0996728241443634,
-0.18762177228927612,
-0.007379595655947924,
0.022550804540514946,
0.03420233726501465,
0.022964870557188988,
0.08983873575925827,
0.05499764904379845,
0.15444523096084595,
-0.03058151714503765,
0.02513067238032818,
0.05186513811349869,
-0.05857416242361069,
-0.07525306195020676,
-0.0576920323073864,
0.07395590096712112,
0.1403525322675705,
0.10099131613969803,
-0.018552325665950775,
0.029723692685365677,
-0.06901618838310242,
0.1019488051533699,
0.13468998670578003,
-0.1985086053609848,
-0.027434760704636574,
0.06385248154401779,
0.06805096566677094,
0.04971839860081673,
-0.0312331710010767,
-0.036052003502845764,
0.09569405764341354,
0.02341141365468502,
0.04287160933017731,
-0.08078093826770782,
-0.03582668676972389,
-0.009640871547162533,
-0.0895998552441597,
-0.0565979890525341,
0.1455971598625183,
0.03462590277194977,
-0.05484132841229439,
-0.12352779507637024,
-0.038736436516046524,
-0.16693362593650818,
0.03919548541307449,
0.027842046692967415,
0.02559751644730568,
-0.0157006848603487,
0.07034932076931,
-0.009012283757328987,
-0.1086927205324173,
-0.12109412997961044,
0.061440180987119675,
0.08189640939235687,
0.07787033915519714,
0.01672292873263359,
-0.0038212649524211884,
0.1008778065443039,
0.06497988849878311,
-0.03621019050478935,
-0.011932204477488995,
0.00478019705042243,
-0.12709328532218933,
0.04866912588477135,
-0.05341847613453865,
-0.09441147744655609,
-0.043098073452711105,
0.0675487369298935,
0.007790741976350546,
0.05077182501554489,
0.15016032755374908,
0.008673145435750484,
-0.006401500664651394,
0.19097667932510376,
0.004742079880088568,
-0.08324167132377625,
-0.009733829647302628,
0.02988203801214695,
-0.0065331836231052876,
-0.010672342032194138,
-0.07713247090578079,
-0.0322556272149086,
-0.01252810563892126,
0.0718742087483406,
-0.12719029188156128,
0.024501092731952667,
-0.04261501878499985,
-0.029117921367287636,
0.052237801253795624,
-0.1479962170124054,
0.01865217462182045,
-0.0015603582141920924,
-0.046056680381298065,
-0.06280823051929474,
0.07912591844797134,
-0.09746149927377701,
-0.12592190504074097,
0.009228435344994068,
-0.03376922011375427,
-0.02572922222316265,
-0.13344472646713257,
-0.12103070318698883,
-0.01045198179781437,
-0.09676630049943924,
0.006565867457538843,
-0.08766396343708038,
-0.14491678774356842,
-0.04355315864086151,
0.05708879604935646,
-0.0020562075078487396,
-0.00947880744934082,
-0.06115914136171341,
0.01644892431795597,
-0.011618834920227528,
-0.018668431788682938,
0.03635872155427933,
-0.01725129596889019,
0.1006668359041214,
0.08053656667470932,
0.04247996211051941,
-0.007071377709507942,
0.04400579631328583,
-0.0663757175207138,
0.05926596373319626,
-0.03241617977619171,
0.08678063750267029,
-0.016367023810744286,
0.0686282366514206,
-0.07793743908405304,
-0.07804536819458008,
0.060019440948963165,
0.059981416910886765,
0.0627375915646553,
0.03394804149866104,
-0.13367564976215363,
0.008918097242712975,
0.13222195208072662,
-0.09656152874231339,
-0.1106930673122406,
0.07523588836193085,
-0.010594222694635391,
0.05114830285310745,
0.06948976963758469,
0.1340477019548416,
0.14456498622894287,
-0.06595886498689651,
-0.028546007350087166,
0.05717078968882561,
0.0820242390036583,
-0.10056173801422119,
0.08282937854528427,
0.034235529601573944,
0.038319677114486694,
0.0350579097867012,
-0.009229635819792747,
0.07724149525165558,
-0.006835663225501776,
-0.050452519208192825,
-0.017168156802654266,
-0.10060330480337143,
-0.07798134535551071,
-0.007508307695388794,
0.04274158552289009,
-0.0390925407409668,
-0.0723237618803978,
0.03289595991373062,
0.17011070251464844,
-0.11517303436994553,
0.04227364435791969,
-0.07904335111379623,
-0.03700144216418266,
-0.09678672254085541,
0.01884518936276436,
-0.13658015429973602,
0.0288778617978096,
0.04187193140387535,
-0.04404658451676369,
0.08235788345336914,
0.07917752861976624,
-0.0018023605225607753,
0.03715810924768448,
-0.02192990481853485,
-0.045785922557115555,
-0.05707823485136032,
-0.06839245557785034,
-0.1275428682565689,
-0.03708495944738388,
-0.12614226341247559,
-0.049033381044864655,
-0.05954311415553093,
-0.15035493671894073,
0.01576905883848667,
-0.04363274201750755,
0.03456107899546623,
0.006870870944112539,
-0.044220924377441406,
0.04022872820496559,
0.06507173180580139,
-0.049227651208639145,
-0.07992249727249146,
0.025633037090301514,
0.008410228416323662,
-0.11819980293512344,
-0.024955185130238533,
-0.11995861679315567,
-0.06177406758069992,
0.06752462685108185,
0.04582582041621208,
-0.07608085870742798,
0.0018086433410644531,
-0.030748002231121063,
-0.050816621631383896,
0.01575099490582943,
-0.07096637785434723,
0.12478295713663101,
0.00707769300788641,
0.151512011885643,
-0.13888320326805115,
-0.07224761694669724,
-0.03692349046468735,
0.015406319871544838,
-0.002855353057384491,
0.16904133558273315,
0.04460820555686951,
-0.05904685705900192,
0.03831059858202934,
0.025226660072803497,
-0.02082415483891964,
0.13989660143852234,
-0.018042780458927155,
-0.11535188555717468,
0.015130938030779362,
0.10077851265668869,
-0.015060300938785076,
0.10492159426212311,
-0.057049915194511414,
-0.027808330953121185,
0.007508230395615101,
0.0033060512505471706,
0.04050754755735397,
-0.1344652771949768,
0.016206011176109314,
0.05326562374830246,
-0.06321797519922256,
-0.026609506458044052,
-0.012706732377409935,
-0.05913873016834259,
0.040599722415208817,
-0.019127674400806427,
0.0011087623424828053,
0.0010470086708664894,
-0.03616558015346527,
-0.08608445525169373,
0.17981810867786407,
-0.0785205289721489,
-0.18382105231285095,
-0.18433792889118195,
0.015308702364563942,
-0.05977852642536163,
0.015041816979646683,
0.044454965740442276,
-0.12389489263296127,
-0.06420126557350159,
-0.1011272594332695,
0.13048429787158966,
-0.10525668412446976,
-0.0001635810185689479,
0.0007544108084402978,
0.03904898837208748,
0.034843239933252335,
-0.18133610486984253,
0.046131718903779984,
-0.020531823858618736,
0.006558890454471111,
-0.008242988027632236,
-0.03518020361661911,
0.09619449824094772,
0.11579296737909317,
-0.0562751330435276,
0.023066624999046326,
0.0031611216254532337,
0.20025749504566193,
-0.04064932465553284,
0.014925586991012096,
0.19732855260372162,
-0.0006903987959958613,
0.03128224238753319,
0.044136714190244675,
0.021501069888472557,
-0.13282057642936707,
0.05367303639650345,
0.04790421202778816,
-0.037914521992206573,
-0.2563672959804535,
-0.015822192654013634,
-0.06597588956356049,
0.013040182180702686,
0.11031991988420486,
0.06270049512386322,
-0.14516015350818634,
0.043006714433431625,
-0.013065442442893982,
0.14824335277080536,
-0.0494534969329834,
0.05064323544502258,
-0.0009898816933855414,
0.0008592160884290934,
0.00017264466441702098,
-0.09280407428741455,
-0.003950503654778004,
0.06030416488647461,
0.09244655817747116,
0.22374626994132996,
-0.06488233804702759,
0.2183564305305481,
0.027165502309799194,
0.07179220020771027,
-0.0069792564027011395,
0.14602023363113403,
-0.09927649796009064,
0.021909397095441818,
0.011646296828985214,
-0.030126065015792847,
-0.07855924218893051,
0.0877690389752388,
-0.028884612023830414,
0.03180578351020813,
-0.06345482170581818,
0.022742565721273422,
-0.006954728160053492,
0.18628722429275513,
0.0454285591840744,
-0.1936974823474884,
-0.06394173204898834,
0.027284177020192146,
-0.08193756639957428,
-0.11448122560977936,
0.0619988776743412,
0.17190542817115784,
-0.005940910894423723,
0.0015721558593213558,
-0.007145076524466276,
0.1373065561056137,
-0.07604844868183136,
-0.03459345921874046,
0.03803512454032898,
0.04814722761511803,
0.04073093459010124,
0.12794312834739685,
-0.23555542528629303,
0.1070643737912178,
0.018229912966489792,
0.0777655690908432,
-0.04214799404144287,
0.07965488731861115,
-0.06968129426240921,
-0.017557989805936813,
0.11850282549858093,
0.0036521432921290398,
-0.038234829902648926,
-0.1964503973722458,
-0.07833170145750046,
0.009095081128180027,
0.08015427738428116,
-0.043559495359659195,
0.08457498997449875,
0.009402263909578323,
0.0519489161670208,
-0.022737858816981316,
-0.09628293663263321,
-0.08696623891592026,
-0.16876760125160217,
0.00592779042199254,
0.017232265323400497,
-0.029727395623922348,
-0.04040193557739258,
0.010654843412339687,
-0.05015568062663078,
0.243587926030159,
-0.12771621346473694,
-0.10681002587080002,
-0.09173484891653061,
0.0458722859621048,
0.14494588971138,
-0.08534134924411774,
0.013807049952447414,
0.030450360849499702,
0.025697678327560425,
-0.03728713467717171,
-0.07623770833015442,
0.03565695136785507,
-0.04796798154711723,
-0.08109498023986816,
-0.036703310906887054,
0.08777692168951035,
0.010003725066781044,
0.02901884727180004,
-0.018411504104733467,
-0.08291786909103394,
-0.0397317036986351,
-0.11670390516519547,
-0.06760711967945099,
0.012341408990323544,
0.034342411905527115,
0.0004231167840771377,
-0.12486571073532104,
0.1094730868935585,
-0.019247358664870262,
-0.0968102440237999,
0.059448953717947006,
0.21437956392765045,
-0.05752807483077049,
0.05011148378252983,
0.11803589761257172,
-0.07279719412326813,
-0.15907029807567596,
-0.06517568975687027,
0.05885983258485794,
0.0817534402012825,
-0.023168662562966347,
-0.18114854395389557,
0.06216494366526604,
-0.006567396689206362,
0.02967906929552555,
0.026762235909700394,
-0.28332850337028503,
-0.1419486403465271,
0.07437802106142044,
0.09559483826160431,
0.006120610516518354,
-0.1241069957613945,
-0.04141310602426529,
-0.07096344232559204,
-0.054949160665273666,
0.05888349562883377,
0.10976886004209518,
0.12745186686515808,
-0.04008724167943001,
0.017824850976467133,
0.03025193139910698,
-0.027411209419369698,
0.09797839820384979,
0.031092235818505287,
0.12128996849060059,
-0.03524408116936684,
0.035418201237916946,
0.06894925981760025,
-0.06038748472929001,
0.1590283066034317,
-0.15004302561283112,
0.07212527841329575,
-0.22783976793289185,
-0.06738615781068802,
-0.018913771957159042,
-0.01755402237176895,
-0.04120554402470589,
-0.06947876513004303,
-0.09056300669908524,
0.007248809561133385,
0.05224944278597832,
-0.018687767907977104,
0.047902993857860565,
-0.03250216692686081,
-0.06705296039581299,
0.0862206444144249,
0.0859294980764389,
0.0013602186227217317,
-0.07624367624521255,
-0.0010985025437548757,
0.029535071924328804,
0.08855931460857391,
-0.16005828976631165,
-0.005456083919852972,
0.12232010811567307,
0.0022088242694735527,
0.09476836770772934,
0.009247922338545322,
-0.0855896845459938,
0.016672788187861443,
0.07697224617004395,
-0.06158952787518501,
-0.08846300840377808,
-0.02016768418252468,
-0.031079202890396118,
-0.07348895817995071,
0.03403061255812645,
0.08642909675836563,
-0.05550330504775047,
-0.021707506850361824,
-0.01940213516354561,
-0.0019737386610358953,
-0.07711956650018692,
0.18246136605739594,
0.023072589188814163,
0.06683415919542313,
-0.07013599574565887,
0.08136769384145737,
0.09760166704654694,
-0.10353187471628189,
0.024690790101885796,
0.14364123344421387,
-0.07384345680475235,
-0.038109540939331055,
0.058886609971523285,
0.08195540308952332,
-0.04030947387218475,
-0.059208475053310394,
-0.0787610337138176,
-0.06207674741744995,
0.046606581658124924,
0.02450532279908657,
0.07581046968698502,
0.08893188089132309,
-0.0320003442466259,
0.012141420505940914,
-0.09548748284578323,
0.07754556834697723,
0.07640606164932251,
0.048704106360673904,
-0.15922671556472778,
0.14468812942504883,
0.017326299101114273,
0.09413286298513412,
-0.0031167338602244854,
0.041323740035295486,
-0.06807797402143478,
0.049739353358745575,
-0.0314960852265358,
0.004890576004981995,
-0.019704194739460945,
0.04051881283521652,
-0.019023951143026352,
0.04189540445804596,
-0.006031909957528114,
0.054707031697034836,
-0.07007329910993576,
-0.030877957120537758,
-0.030106298625469208,
0.03336028382182121,
-0.059917982667684555,
-0.034418508410453796,
-0.009510372765362263,
-0.080459825694561,
0.09964799135923386,
-0.04690088331699371,
-0.032191675156354904,
-0.011672299355268478,
-0.02565395087003708,
0.09077006578445435,
0.020641852170228958,
0.054922476410865784,
-0.032678257673978806,
0.002551526529714465,
0.02138538286089897,
0.027581049129366875,
-0.0173674114048481,
-0.022613074630498886,
0.047997552901506424,
-0.15169353783130646,
-0.061180468648672104,
-0.06608527153730392,
-0.05956491455435753,
-0.06270361691713333,
0.0841144546866417,
0.07872504740953445,
0.08124332129955292,
0.07131215184926987,
-0.0009530861279927194,
0.0016931119607761502,
-0.13110384345054626,
-0.0226614847779274,
0.06425852328538895,
0.004312086850404739,
-0.11799412965774536,
-0.07750271260738373,
0.05008568614721298,
-0.02315339632332325,
0.14076094329357147,
-0.028732527047395706,
-0.007143286522477865,
-0.03185861557722092,
-0.04957468435168266,
0.01380356214940548,
0.00946865975856781,
0.2078883945941925,
-0.10119398683309555,
0.03096914477646351,
0.008766318671405315,
-0.015726257115602493,
0.0408879853785038,
0.16563640534877777,
0.07579876482486725,
0.131309375166893,
0.08361779153347015,
0.10166553407907486,
-0.041154470294713974,
-0.03154546022415161,
-0.14715856313705444,
0.02958892099559307,
0.02738594450056553,
0.058745358139276505,
-0.05394905433058739,
0.14277522265911102,
0.082333043217659,
-0.10618919879198074,
0.07589377462863922,
0.041298847645521164,
-0.09346236288547516,
-0.043605417013168335,
-0.1000654324889183,
-0.03332608938217163,
-0.07005148380994797,
0.018077295273542404,
-0.0959138572216034,
0.021528834477066994,
0.024625025689601898,
0.044507864862680435,
-0.016031330451369286,
0.16478601098060608,
-0.035675957798957825,
-0.0379101037979126,
0.021170735359191895,
0.009703266434371471,
0.0186949260532856,
0.07068681716918945,
0.003811425296589732,
0.06176821142435074,
-0.043784599751234055,
0.05159169062972069,
0.031280215829610825,
-0.010741530917584896,
0.021216563880443573,
0.016327116638422012,
0.007541388273239136,
-0.043626151978969574,
0.016265086829662323,
0.09862569719552994,
0.19567738473415375,
0.021138178184628487,
-0.06899502128362656,
-0.05992062762379646,
0.13381518423557281,
-0.06412927806377411,
-0.04803832247853279,
-0.07741731405258179,
0.11327487230300903,
0.047320447862148285,
0.02475186623632908,
0.0074713085778057575,
-0.07634799182415009,
-0.051905225962400436,
0.22833983600139618,
0.04884001240134239,
-0.024755042046308517,
-0.03125188872218132,
0.0061275227926671505,
0.0003944240161217749,
-0.08188620954751968,
0.1411786824464798,
0.00877867080271244,
0.2065083533525467,
-0.004395736381411552,
0.002049580914899707,
-0.04756510630249977,
-0.027585361152887344,
-0.026113495230674744,
0.1938346028327942,
-0.03837418183684349,
0.004592781886458397,
-0.07554066181182861,
-0.009445800445973873,
0.01848508231341839,
-0.11121385544538498,
0.12486566603183746,
-0.05737825855612755,
-0.06731108576059341,
0.030405569821596146,
0.06491678953170776,
-0.03313055261969566,
0.0323781818151474,
-0.03856924548745155,
0.050815507769584656,
0.07386240363121033,
-0.02390783093869686,
-0.11094767600297928,
-0.11123024672269821,
0.05936826393008232,
-0.045637693256139755,
0.1512846201658249,
0.02373495325446129,
0.08966952562332153,
0.0635933205485344,
0.002649475820362568,
-0.09993524104356766,
0.10559067130088806,
0.03579990193247795,
0.01817927323281765,
0.07480175793170929,
0.14126387238502502,
-0.04455547407269478,
0.14653097093105316,
-0.03698481619358063,
-0.023235497996211052,
-0.004671914968639612,
0.00575588084757328,
-0.011383583769202232,
-0.13604101538658142,
-0.005518511403352022,
-0.09038043022155762,
0.1379588097333908,
0.16388444602489471,
-0.04151817783713341,
-0.031120367348194122,
-0.04569725692272186,
0.08361537754535675,
-0.025842316448688507,
0.06439745426177979,
0.007117204833775759,
-0.16809287667274475,
0.017859289422631264,
0.03860838711261749,
0.04012661799788475,
-0.17295847833156586,
-0.049243051558732986,
-0.039296749979257584,
-0.05530346930027008,
-0.08701087534427643,
0.14207518100738525,
0.07338408380746841,
0.02091570384800434,
-0.03621888905763626,
-0.17862217128276825,
-0.020244212821125984,
0.0451166033744812,
-0.15189795196056366,
-0.1176399514079094
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on programming language lisp inspired DSL using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the program synthesis task for the lisp inspired DSL code.
## Intended uses & limitations
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_program_synthese_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/transfer%20learning%20fine-tuning/base_model.ipynb).
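For reference, the same generation can be run without the pipeline wrapper; this is roughly what `SummarizationPipeline` does under the hood (the `max_length` value is an assumption):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

name = "SEBIS/code_trans_t5_base_program_synthese_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

inputs = tokenizer(
    "you are given an array of numbers a and a number b , compute the difference of elements in a and b",
    return_tensors="pt",
)
output_ids = model.generate(**inputs, max_length=128)  # max_length is an assumption
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```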
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
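As a back-of-the-envelope check, pre-training at these settings touches on the order of a trillion tokens, assuming the batch size counts sequences and every sequence fills the 512-token window (both assumptions):
```python
# Both assumptions: batch size counts sequences; every sequence is 512 tokens.
steps, batch_size, seq_len = 500_000, 4096, 512
print(f"{steps * batch_size * seq_len:.2e} tokens")  # ~1.05e+12
```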
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 45,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp inspired DSL data.
## Evaluation results
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_base_program_synthese_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on a lisp-inspired DSL programming language using the t5 base model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain, and was then fine-tuned on the program synthesis task for lisp-inspired DSL code.
Intended uses & limitations
---------------------------
The model can be used to generate lisp-inspired DSL code from a human-language task description.
### How to use
Here is how to use this model to generate lisp-inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 45,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing lisp-inspired DSL data.
Evaluation results
------------------
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 45,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 45,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
87,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 45,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10402420163154602,
0.07946319878101349,
-0.0018934241961687803,
0.1104278564453125,
0.05785100534558296,
0.012503650970757008,
0.0850929319858551,
0.08524813503026962,
-0.04353322833776474,
0.049997441470623016,
0.07311587035655975,
0.01261199451982975,
0.05094972997903824,
0.18612520396709442,
0.02717028558254242,
-0.167301207780838,
-0.006906399969011545,
0.028244849294424057,
-0.05800195410847664,
0.11289936304092407,
0.09469958394765854,
-0.09314092993736267,
0.06081872805953026,
-0.04940017685294151,
-0.11962240934371948,
0.05329860746860504,
-0.05174383893609047,
-0.03298918902873993,
0.06912757456302643,
0.0558101087808609,
0.10141092538833618,
-0.01650541089475155,
0.09769228845834732,
-0.19688531756401062,
-0.004802639130502939,
0.021467018872499466,
0.03688662871718407,
0.026270374655723572,
0.07197803258895874,
0.06509462743997574,
0.15221573412418365,
-0.03666979447007179,
0.023708462715148926,
0.048830535262823105,
-0.050202153623104095,
-0.08051185309886932,
-0.053882304579019547,
0.07956819981336594,
0.1373983472585678,
0.0988660603761673,
-0.014851916581392288,
0.015021820552647114,
-0.0812433734536171,
0.0923115611076355,
0.14254997670650482,
-0.2152785211801529,
-0.022137822583317757,
0.05872715264558792,
0.07162037491798401,
0.06733351945877075,
-0.055835213512182236,
-0.027128437533974648,
0.10186385363340378,
0.021111920475959778,
0.04506292939186096,
-0.07916665077209473,
-0.032684482634067535,
0.007342623081058264,
-0.08470623940229416,
-0.05529458448290825,
0.1474704146385193,
0.046548992395401,
-0.05454578250646591,
-0.11268840730190277,
-0.04325774684548378,
-0.1804218590259552,
0.03905513137578964,
0.014270966872572899,
0.027525896206498146,
-0.012262758798897266,
0.06722040474414825,
-0.01316559873521328,
-0.10216052085161209,
-0.11381496489048004,
0.026499593630433083,
0.0744071677327156,
0.07558900117874146,
0.015788286924362183,
0.001320616458542645,
0.09408649802207947,
0.0684044137597084,
-0.03607945144176483,
-0.022117044776678085,
0.008002339862287045,
-0.11587531864643097,
0.048166029155254364,
-0.039827510714530945,
-0.07361556589603424,
-0.04230080172419548,
0.08751026540994644,
0.016589278355240822,
0.05191495269536972,
0.1430472731590271,
0.010209531523287296,
-0.02137622982263565,
0.20340339839458466,
0.0061417738907039165,
-0.09909147769212723,
-0.0006971113034524024,
0.03576775640249252,
0.0005156865809112787,
-0.00551651231944561,
-0.07828288525342941,
-0.04055740684270859,
-0.009771514683961868,
0.06859534233808517,
-0.13273422420024872,
0.023079637438058853,
-0.04551190882921219,
-0.022454187273979187,
0.034610405564308167,
-0.1428649127483368,
0.025972489267587662,
0.001322665368206799,
-0.07045331597328186,
-0.05224112048745155,
0.08269747346639633,
-0.1032809466123581,
-0.119805246591568,
0.012193717062473297,
-0.04038718715310097,
-0.028186336159706116,
-0.1332237422466278,
-0.13346248865127563,
-0.0101193618029356,
-0.0823935940861702,
0.012252023443579674,
-0.1043357104063034,
-0.13095201551914215,
-0.024694664403796196,
0.04641798511147499,
0.00670751603320241,
-0.0010621504625305533,
-0.07415735721588135,
0.010978537611663342,
-0.009778821840882301,
-0.030177928507328033,
0.02767302840948105,
-0.026772717013955116,
0.10230956226587296,
0.08426360040903091,
0.05478402599692345,
0.015480195172131062,
0.03559079021215439,
-0.05712272226810455,
0.056250493973493576,
-0.04645572602748871,
0.0809900164604187,
-0.03146032989025116,
0.05789096653461456,
-0.08141174167394638,
-0.08380008488893509,
0.07057919353246689,
0.056800879538059235,
0.06883193552494049,
0.03488659858703613,
-0.12691645324230194,
0.008347580209374428,
0.11804497987031937,
-0.08952023833990097,
-0.1111813560128212,
0.09234178811311722,
-0.006148499436676502,
0.023943772539496422,
0.06880764663219452,
0.13398884236812592,
0.1641378551721573,
-0.09172943234443665,
-0.044213637709617615,
0.0783257782459259,
0.07291363179683685,
-0.08994042873382568,
0.08386555314064026,
0.04397071152925491,
0.03838494420051575,
0.03242664784193039,
-0.005993045400828123,
0.08681129664182663,
-0.009647632017731667,
-0.047239623963832855,
-0.01506457943469286,
-0.0981946662068367,
-0.0433804951608181,
-0.0017875494668260217,
0.03425147384405136,
-0.0505511611700058,
-0.0772777572274208,
0.05406107008457184,
0.17397181689739227,
-0.11175323277711868,
0.036964282393455505,
-0.0880458727478981,
-0.03646291047334671,
-0.09061135351657867,
0.011026683263480663,
-0.11499623954296112,
0.028041334822773933,
0.03422917425632477,
-0.03368717059493065,
0.07871630042791367,
0.08966591209173203,
0.007423073053359985,
0.03477458283305168,
-0.027304314076900482,
-0.05222872272133827,
-0.05739649012684822,
-0.060389500111341476,
-0.13561876118183136,
-0.026414040476083755,
-0.11459201574325562,
-0.034558579325675964,
-0.06104224547743797,
-0.1589754968881607,
0.015979932621121407,
-0.04368169605731964,
0.03785213083028793,
0.004629122093319893,
-0.02609667368233204,
0.038868896663188934,
0.05373947322368622,
-0.04922177642583847,
-0.08215977251529694,
0.020162707194685936,
-0.001593816326931119,
-0.11994274705648422,
-0.03764081001281738,
-0.11214348673820496,
-0.03994624689221382,
0.07876180857419968,
0.04673821106553078,
-0.05649996176362038,
-0.004257652442902327,
-0.031222909688949585,
-0.05212977156043053,
0.008352603763341904,
-0.07776898145675659,
0.14920562505722046,
0.004489716608077288,
0.1570453643798828,
-0.13978418707847595,
-0.0670875832438469,
-0.03605535998940468,
0.004018046893179417,
-0.00004351614916231483,
0.17130134999752045,
0.027205627411603928,
-0.04833811894059181,
0.04320929944515228,
0.02663593553006649,
-0.04295658320188522,
0.13087177276611328,
-0.019039813429117203,
-0.1066000908613205,
0.01578713208436966,
0.09963525831699371,
-0.016742683947086334,
0.11269120126962662,
-0.06163675710558891,
-0.02652302384376526,
0.002662103157490492,
0.014821760356426239,
0.03839763626456261,
-0.1393323838710785,
0.012446276843547821,
0.059420980513095856,
-0.07253050059080124,
-0.030782289803028107,
-0.025111444294452667,
-0.05903593450784683,
0.04507666081190109,
-0.01631048507988453,
-0.016729686409235,
-0.003836849005892873,
-0.029084626585245132,
-0.08273277431726456,
0.18316632509231567,
-0.09500707685947418,
-0.1935729831457138,
-0.17099745571613312,
0.024288330227136612,
-0.052307289093732834,
0.006014264188706875,
0.0423765704035759,
-0.12938234210014343,
-0.06548473238945007,
-0.09760519117116928,
0.12100960314273834,
-0.10939537733793259,
0.005674741230905056,
-0.006563575007021427,
0.034540094435214996,
0.027142014354467392,
-0.18784190714359283,
0.04641463980078697,
-0.018933122977614403,
0.00996414851397276,
-0.015319375321269035,
-0.0418362133204937,
0.09557389467954636,
0.11768952757120132,
-0.07348094880580902,
0.023585403338074684,
-0.0034374359529465437,
0.1834421008825302,
-0.0405597947537899,
0.022608837112784386,
0.2010117471218109,
0.02185996249318123,
0.03048865497112274,
0.04635842889547348,
0.0151714738458395,
-0.11836865544319153,
0.06217959150671959,
0.04959045350551605,
-0.017213137820363045,
-0.25271546840667725,
-0.0057875425554811954,
-0.06569790840148926,
0.024642808362841606,
0.1020345538854599,
0.053858060389757156,
-0.14627337455749512,
0.03599067032337189,
-0.009018834680318832,
0.146747887134552,
-0.04055098816752434,
0.044012077152729034,
0.016943397000432014,
-0.007589047309011221,
0.000771430553868413,
-0.08502181619405746,
-0.0016051732236519456,
0.06347781419754028,
0.08876758813858032,
0.22219781577587128,
-0.07050489634275436,
0.22010233998298645,
0.017380038276314735,
0.08550018072128296,
-0.00014163332525640726,
0.13599291443824768,
-0.10579778254032135,
0.015009264461696148,
0.011533021926879883,
-0.017350422218441963,
-0.07942622154951096,
0.07445080578327179,
-0.022442562505602837,
0.05413854122161865,
-0.07041055709123611,
0.028035743162035942,
0.004538404289633036,
0.1589886099100113,
0.0524359866976738,
-0.186951145529747,
-0.07470786571502686,
0.02547525055706501,
-0.09455472230911255,
-0.10754751414060593,
0.06660809367895126,
0.19261638820171356,
-0.005897861905395985,
-0.009731526486575603,
-0.006961768493056297,
0.13561932742595673,
-0.0723785012960434,
-0.03840029239654541,
0.019503841176629066,
0.06828891485929489,
0.029485272243618965,
0.13506023585796356,
-0.27139419317245483,
0.09845032542943954,
0.01804119534790516,
0.09245039522647858,
-0.026119766756892204,
0.06669531762599945,
-0.04954170063138008,
-0.014060374349355698,
0.09784144908189774,
0.006044432520866394,
-0.053842175751924515,
-0.20577195286750793,
-0.07760641723871231,
0.015934966504573822,
0.07886603474617004,
-0.03182416409254074,
0.0852261334657669,
-0.0009196783066727221,
0.04941874369978905,
-0.015550372190773487,
-0.10939260572195053,
-0.0787363275885582,
-0.1474606841802597,
-0.008319544605910778,
0.017232799902558327,
-0.012045995332300663,
-0.03972582519054413,
0.007775849662721157,
-0.05055411532521248,
0.22595059871673584,
-0.16700837016105652,
-0.11065749078989029,
-0.09088655561208725,
0.0646827444434166,
0.13262441754341125,
-0.08935220539569855,
0.020238744094967842,
0.0428546778857708,
0.018898868933320045,
-0.03308713808655739,
-0.07312270998954773,
0.03596221283078194,
-0.0448029488325119,
-0.07737097889184952,
-0.03160056471824646,
0.07733874022960663,
0.0012085435446351767,
0.04260745644569397,
-0.011090059764683247,
-0.09499744325876236,
-0.04188615456223488,
-0.12243059277534485,
-0.08385713398456573,
0.017085621133446693,
0.0403425395488739,
0.005198521539568901,
-0.11194059252738953,
0.10663502663373947,
-0.023623036220669746,
-0.08989593386650085,
0.06308311969041824,
0.20261874794960022,
-0.06240304559469223,
0.03211773559451103,
0.11873641610145569,
-0.07142583280801773,
-0.15645726025104523,
-0.058577924966812134,
0.05316874757409096,
0.08782926946878433,
-0.03764287382364273,
-0.16050203144550323,
0.06986569613218307,
0.007955269888043404,
0.032617393881082535,
0.027082005515694618,
-0.28721675276756287,
-0.13642224669456482,
0.06956522911787033,
0.09717462956905365,
0.0389753021299839,
-0.11143876612186432,
-0.033857956528663635,
-0.05828418582677841,
-0.0641549676656723,
0.044562727212905884,
0.10313057899475098,
0.1298593282699585,
-0.03548983857035637,
0.02092571370303631,
0.030031954869627953,
-0.023948363959789276,
0.09951241314411163,
0.014249821193516254,
0.12371688336133957,
-0.03337610885500908,
0.029875533655285835,
0.0522107295691967,
-0.05883790925145149,
0.16633373498916626,
-0.1725507378578186,
0.07607924193143845,
-0.23547816276550293,
-0.06480927020311356,
-0.01275580283254385,
-0.012425987049937248,
-0.03506810963153839,
-0.06701190024614334,
-0.10275015980005264,
0.00688599981367588,
0.04787642881274223,
-0.01334418449550867,
0.04223889857530594,
-0.026854483410716057,
-0.08812165260314941,
0.05869394168257713,
0.104336217045784,
-0.00397146912291646,
-0.09541884064674377,
0.006513587664812803,
0.03295334056019783,
0.09349343925714493,
-0.1778748780488968,
-0.002867911010980606,
0.14134207367897034,
0.004834767431020737,
0.10699030011892319,
0.010182679630815983,
-0.0738697275519371,
0.03022652305662632,
0.06967037171125412,
-0.049346912652254105,
-0.1004924476146698,
-0.020896131172776222,
-0.03738146275281906,
-0.07954415678977966,
0.030541973188519478,
0.08651354163885117,
-0.06926082819700241,
-0.014858808368444443,
-0.017009655013680458,
-0.0014976118691265583,
-0.0727447047829628,
0.19751675426959991,
0.03823366016149521,
0.07196851819753647,
-0.061413612216711044,
0.08921757340431213,
0.08596676588058472,
-0.11920122057199478,
0.025029022246599197,
0.14552666246891022,
-0.07647501677274704,
-0.034252237528562546,
0.0601140558719635,
0.08792989701032639,
-0.029044821858406067,
-0.05917935445904732,
-0.08740086853504181,
-0.06307590752840042,
0.036593370139598846,
0.03343996778130531,
0.07037513703107834,
0.08115337044000626,
-0.036796148866415024,
0.022606613114476204,
-0.09420990198850632,
0.09101233631372452,
0.08403142541646957,
0.047196026891469955,
-0.16058248281478882,
0.1661110669374466,
0.020347004756331444,
0.09038154780864716,
-0.0039766812697052956,
0.04755968227982521,
-0.06590256839990616,
0.049658775329589844,
-0.03973528742790222,
0.017037855461239815,
-0.008039349690079689,
0.0487012043595314,
-0.021958893164992332,
0.02736310288310051,
-0.011209322139620781,
0.05440065264701843,
-0.061094824224710464,
-0.02646923065185547,
-0.017732540145516396,
0.030042581260204315,
-0.0614757277071476,
-0.048058345913887024,
-0.006092829164117575,
-0.07791995257139206,
0.10058744996786118,
-0.04643559455871582,
-0.018323011696338654,
-0.006233485881239176,
-0.029244927689433098,
0.08569920808076859,
0.024525919929146767,
0.06425339728593826,
-0.01933697611093521,
0.005772777833044529,
0.03673885017633438,
0.022080395370721817,
-0.010703366249799728,
-0.0200523491948843,
0.04304478317499161,
-0.14697466790676117,
-0.08154573291540146,
-0.08897325396537781,
-0.047597311437129974,
-0.07562149316072464,
0.08617731183767319,
0.0859263688325882,
0.07455248385667801,
0.08389333635568619,
-0.006627530325204134,
-0.005394398234784603,
-0.13322579860687256,
-0.023186828941106796,
0.055908817797899246,
-0.014214898459613323,
-0.097542904317379,
-0.07071264833211899,
0.05501779168844223,
-0.03805426135659218,
0.11566366255283356,
-0.01629822701215744,
-0.0018117401050403714,
-0.02987341769039631,
-0.06646191328763962,
0.014433551579713821,
0.005364048294723034,
0.21990883350372314,
-0.09424292296171188,
0.030962321907281876,
0.0002495992521289736,
-0.009745785966515541,
0.04875199869275093,
0.1600320041179657,
0.06582803279161453,
0.14437881112098694,
0.05399945750832558,
0.11317194998264313,
-0.05323753505945206,
-0.04004989564418793,
-0.16981704533100128,
0.03739078715443611,
0.012284303084015846,
0.057398777455091476,
-0.04929143190383911,
0.13773277401924133,
0.1055121123790741,
-0.11090471595525742,
0.07558686286211014,
0.04440077021718025,
-0.09954962134361267,
-0.04639348015189171,
-0.08934244513511658,
-0.033200204372406006,
-0.08705690503120422,
0.02075698971748352,
-0.10321087390184402,
0.018991593271493912,
0.030078373849391937,
0.04554758220911026,
-0.019756777212023735,
0.14681608974933624,
-0.024885213002562523,
-0.04494635760784149,
0.026973072439432144,
0.009012941271066666,
0.021895168349146843,
0.07328247278928757,
-0.0064552417024970055,
0.07113508135080338,
-0.0660519003868103,
0.05527646467089653,
0.020661814138293266,
-0.0048165773041546345,
0.01764126494526863,
0.01488906517624855,
0.003717227606102824,
-0.05028725415468216,
0.012982271611690521,
0.09681033343076706,
0.2013479620218277,
0.03316548839211464,
-0.06458856165409088,
-0.0570145808160305,
0.14223501086235046,
-0.06625685840845108,
-0.04765778034925461,
-0.08461746573448181,
0.13260586559772491,
0.04906615987420082,
0.03320041298866272,
0.006248502526432276,
-0.07811678200960159,
-0.05235251039266586,
0.24587857723236084,
0.018127815797924995,
-0.03191655874252319,
-0.04131756350398064,
-0.0015886715846136212,
-0.004807597491890192,
-0.07082062214612961,
0.14658556878566742,
0.026747066527605057,
0.20064906775951385,
-0.003241173690184951,
-0.01318606361746788,
-0.04584743455052376,
-0.028395330533385277,
-0.027069441974163055,
0.18378272652626038,
-0.039195749908685684,
0.013553078286349773,
-0.08006878197193146,
-0.018106238916516304,
0.028855346143245697,
-0.11292676627635956,
0.12102149426937103,
-0.059245914220809937,
-0.0633070096373558,
0.026634087786078453,
0.08098123967647552,
-0.022606471553444862,
0.027213096618652344,
-0.029915019869804382,
0.057323917746543884,
0.06854899227619171,
-0.02617764100432396,
-0.11437525600194931,
-0.10452926158905029,
0.048697974532842636,
-0.041625138372182846,
0.14587484300136566,
0.02108198218047619,
0.08953097462654114,
0.07563721388578415,
0.010619753040373325,
-0.09353071451187134,
0.1120714470744133,
0.03718680888414383,
0.004291464574635029,
0.08223393559455872,
0.1354796439409256,
-0.044197581708431244,
0.14873191714286804,
-0.026623297482728958,
-0.018898431211709976,
-0.012076916173100471,
-0.0125404242426157,
-0.022485459223389626,
-0.14089855551719666,
0.002099973615258932,
-0.07286962866783142,
0.13419295847415924,
0.16201439499855042,
-0.041380804032087326,
-0.025892224162817,
-0.0468086376786232,
0.06967306137084961,
-0.02651679515838623,
0.07364165782928467,
0.013117696158587933,
-0.14963048696517944,
0.0021393517963588238,
0.031457770615816116,
0.028113052248954773,
-0.16224922239780426,
-0.04406580328941345,
-0.04822903871536255,
-0.05521560087800026,
-0.08128796517848969,
0.14357662200927734,
0.08328995108604431,
0.02801022306084633,
-0.03684637323021889,
-0.18864116072654724,
-0.022282186895608902,
0.05285957083106041,
-0.15837064385414124,
-0.11289709806442261
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization csharp dataset.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/csharp/base_model.ipynb).
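The model expects CodeTrans-style pre-tokenized input: space-separated tokens with numeric literals replaced by placeholders such as `CODE_INTEGER`, as in the example above. The exact preprocessing used to build the training data is not published in this card, so the heuristic sketch below is only an approximation:
```python
import re

def rough_tokenize_csharp(source: str) -> str:
    """Approximate CodeTrans-style tokenization: split identifiers,
    numbers, and punctuation with single spaces, and replace integer
    literals with a CODE_INTEGER placeholder. This is a heuristic,
    not the exact preprocessing used for the training data."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|[^\sA-Za-z_\d]", source)
    tokens = ["CODE_INTEGER" if t.isdigit() else t for t in tokens]
    return " ".join(tokens)

print(rough_tokenize_csharp("int x = 42;"))  # -> int x = CODE_INTEGER ;
```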
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_csharp
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization csharp dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
116
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.13123203814029694,
-0.008956597186625004,
0.00020162582222837955,
0.037732165306806564,
0.13945803046226501,
0.018874723464250565,
0.10457854717969894,
0.0261867456138134,
-0.022193020209670067,
-0.020739253610372543,
0.09670757502317429,
0.10125600546598434,
0.028636222705245018,
0.16784468293190002,
-0.02935093827545643,
-0.17355330288410187,
0.008417794480919838,
0.06609396636486053,
-0.1789551079273224,
0.13523977994918823,
0.11265981942415237,
-0.06236467510461807,
0.09623076766729355,
-0.003968840464949608,
-0.23009300231933594,
0.04039261117577553,
0.0031363123562186956,
-0.08276820927858353,
0.1303713172674179,
0.09641513973474503,
0.14144468307495117,
0.05152898281812668,
0.021603241562843323,
-0.20230989158153534,
0.04325537011027336,
-0.047463346272706985,
-0.01463493425399065,
0.060126710683107376,
0.014784903265535831,
-0.07332537323236465,
0.17133700847625732,
-0.008990704081952572,
0.024205265566706657,
0.06082039698958397,
-0.11177872866392136,
-0.04121687635779381,
-0.02060530334711075,
0.04331837594509125,
0.08951739221811295,
0.07351015508174896,
0.017166588455438614,
0.11220233142375946,
-0.15452758967876434,
0.10990844666957855,
0.121117003262043,
-0.19233538210391998,
-0.018852418288588524,
0.12731221318244934,
0.07105888426303864,
-0.06293974071741104,
-0.046443574130535126,
0.020158056169748306,
0.07198718935251236,
-0.012033877894282341,
0.033333729952573776,
-0.12511542439460754,
-0.15923812985420227,
0.06512438505887985,
-0.08091646432876587,
-0.06917539983987808,
0.28465816378593445,
-0.0016199630917981267,
-0.04149272292852402,
-0.045294057577848434,
-0.0465354360640049,
-0.004027132876217365,
-0.01719321869313717,
0.01745600253343582,
-0.004193696193397045,
-0.01550307311117649,
-0.05809586122632027,
0.00010702516738092527,
-0.11291486769914627,
-0.13162855803966522,
-0.002948643174022436,
0.10380139201879501,
-0.013520038686692715,
0.03077540546655655,
-0.12845033407211304,
0.10414222627878189,
0.08420919626951218,
-0.06221863627433777,
0.014226243831217289,
-0.0664677694439888,
-0.07944709807634354,
-0.023278208449482918,
-0.06963761150836945,
-0.1507025957107544,
0.07618466019630432,
0.12372492253780365,
-0.024889793246984482,
0.052999675273895264,
0.054256416857242584,
0.07178044319152832,
0.043758612126111984,
0.21571312844753265,
-0.004146764520555735,
-0.045659247785806656,
0.050198864191770554,
-0.017638027667999268,
-0.041040703654289246,
-0.009566240012645721,
-0.08823888748884201,
-0.045232344418764114,
0.0036057166289538145,
0.1271134614944458,
-0.12038273364305496,
0.08089222759008408,
-0.08244302123785019,
-0.03654423728585243,
0.04542718455195427,
-0.1214352548122406,
-0.03076154924929142,
0.036837443709373474,
-0.04019305109977722,
-0.01478677336126566,
0.12265938520431519,
-0.06742089986801147,
-0.07732218503952026,
0.003918325994163752,
-0.08457840979099274,
-0.008914501406252384,
-0.08853184431791306,
-0.0719878301024437,
0.010636093094944954,
0.04106469824910164,
0.0518663115799427,
-0.13056527078151703,
-0.1307675987482071,
0.004022459499537945,
0.06043970584869385,
0.0278252512216568,
0.041521064937114716,
-0.08630545437335968,
-0.020289793610572815,
-0.02257663942873478,
-0.00799438077956438,
0.0032045424450188875,
-0.07488317042589188,
0.07562487572431564,
0.05860411375761032,
0.04752984270453453,
-0.091040700674057,
0.03855659067630768,
-0.11415112763643265,
0.07316187769174576,
-0.18699833750724792,
0.08504616469144821,
-0.06132514774799347,
0.11700348556041718,
-0.09698625653982162,
-0.052647292613983154,
0.034735988825559616,
0.048883333802223206,
0.05019489675760269,
0.14076697826385498,
-0.1252911537885666,
-0.06724702566862106,
0.1455478072166443,
-0.12401778250932693,
-0.20878256857395172,
0.06365574151277542,
-0.06767822802066803,
0.14891469478607178,
0.05922044441103935,
0.14538449048995972,
0.10685122013092041,
-0.07666298002004623,
0.03149011731147766,
0.08308225870132446,
-0.04327763244509697,
-0.06364164501428604,
0.07858297973871231,
0.05627429485321045,
-0.13091415166854858,
0.04518222063779831,
-0.02610265277326107,
0.11235949397087097,
-0.04516289010643959,
-0.0535501129925251,
-0.00836742389947176,
-0.07254233211278915,
0.041959356516599655,
-0.012740490958094597,
0.08010826259851456,
0.006673901807516813,
-0.009549591690301895,
0.0441821925342083,
0.10775227099657059,
-0.1286570131778717,
0.0033758420031517744,
-0.10995881259441376,
0.07639767974615097,
-0.10026267915964127,
0.026968643069267273,
-0.19485126435756683,
-0.0636608824133873,
0.008580470457673073,
0.03689424693584442,
0.06493890285491943,
0.002264447743073106,
-0.005637839436531067,
0.015278877690434456,
-0.004199759569019079,
0.001299548428505659,
-0.005445033311843872,
-0.016563190147280693,
-0.038507673889398575,
-0.10585558414459229,
-0.06163530424237251,
-0.03834415227174759,
0.08589842915534973,
-0.1831098049879074,
0.0022936961613595486,
0.08342964947223663,
0.06822273135185242,
0.007395375054329634,
0.029701806604862213,
0.03331851586699486,
0.060742560774087906,
-0.05073525011539459,
-0.015284999273717403,
0.05637101083993912,
0.01706780306994915,
-0.16559089720249176,
0.06411978602409363,
-0.08366049081087112,
0.02968372404575348,
0.12438499182462692,
-0.1429111361503601,
-0.08760049194097519,
-0.07992863655090332,
-0.035817455500364304,
-0.02691299095749855,
0.017198186367750168,
-0.0385514572262764,
0.19451797008514404,
0.00035416774335317314,
0.1716557741165161,
-0.11050571501255035,
-0.03452304005622864,
-0.037205904722213745,
0.0013118014903739095,
0.038696832954883575,
0.12340717762708664,
0.10736952722072601,
-0.19430525600910187,
0.05343209207057953,
0.10319674015045166,
-0.010857881046831608,
0.17761708796024323,
-0.05718233063817024,
-0.06565245985984802,
-0.012484605424106121,
0.062444355338811874,
-0.015765199437737465,
0.1764148324728012,
-0.15241515636444092,
-0.04032861813902855,
0.018015189096331596,
-0.039236776530742645,
0.09135834127664566,
-0.1328551471233368,
-0.0012394689256325364,
0.02591458521783352,
-0.036575574427843094,
-0.14946697652339935,
0.045893143862485886,
0.00453641451895237,
0.030141733586788177,
0.0003600831551011652,
-0.022327378392219543,
0.04468446969985962,
-0.03178735077381134,
-0.12557096779346466,
0.2417794018983841,
-0.07763394713401794,
-0.24457862973213196,
-0.18827946484088898,
0.09298067539930344,
-0.011194185353815556,
0.010438009165227413,
0.03458482027053833,
-0.05003215745091438,
-0.05416709929704666,
-0.032220859080553055,
0.14316105842590332,
-0.02439987286925316,
-0.022071275860071182,
-0.02912469021975994,
0.051619309931993484,
-0.016815420240163803,
-0.17592032253742218,
-0.0009147602249868214,
-0.0032090824097394943,
0.02165476232767105,
0.01869140937924385,
-0.16299943625926971,
0.11902669817209244,
0.13587261736392975,
-0.050365373492240906,
0.032253433018922806,
-0.050930969417095184,
0.2244417518377304,
-0.07741253823041916,
-0.08181952685117722,
0.11455787718296051,
-0.12301001697778702,
0.004729738458991051,
0.03171410784125328,
0.012896325439214706,
-0.11343178153038025,
0.03762403130531311,
-0.05487211048603058,
-0.0793062150478363,
-0.2513212263584137,
-0.10861719399690628,
-0.08298659324645996,
0.08492682129144669,
0.029914436861872673,
0.024953268468379974,
-0.04652306064963341,
0.0803573951125145,
0.05229179933667183,
0.11668484658002853,
-0.005035310052335262,
0.07002533972263336,
0.06261711567640305,
0.015954548493027687,
0.013550273142755032,
-0.11150378733873367,
-0.048948053270578384,
0.03948980197310448,
0.06576565653085709,
0.1961737424135208,
0.012177766300737858,
0.20979191362857819,
0.05779226869344711,
0.03837276250123978,
0.06351422518491745,
0.1725957840681076,
-0.07381229102611542,
0.0018642153590917587,
-0.002225287724286318,
-0.031080706045031548,
-0.1436709761619568,
0.04682239517569542,
0.00037397030973806977,
0.03827211260795593,
-0.13782820105552673,
-0.05525169149041176,
0.06587815284729004,
0.13412578403949738,
-0.011927900835871696,
-0.2633565366268158,
-0.12768355011940002,
0.017533788457512856,
-0.05598778277635574,
-0.042832836508750916,
0.051647480577230453,
0.11993594467639923,
-0.13262319564819336,
-0.0022232704795897007,
-0.03861941397190094,
0.15743213891983032,
-0.07157079875469208,
0.009694796986877918,
-0.05949211120605469,
-0.0576402023434639,
0.005766133312135935,
0.13331928849220276,
-0.17785578966140747,
0.22520506381988525,
0.015352297574281693,
0.0053343623876571655,
-0.05876089632511139,
0.016627099364995956,
0.007904344238340855,
0.10965391993522644,
0.09921040385961533,
-0.02337172068655491,
-0.061257608234882355,
-0.18262311816215515,
0.008963484317064285,
0.0798078179359436,
0.08689610660076141,
-0.05400550365447998,
0.060697197914123535,
-0.022633284330368042,
0.013415521942079067,
-0.003459843108430505,
-0.07995136082172394,
-0.08075843006372452,
-0.1123850867152214,
-0.0013790351804345846,
-0.032469287514686584,
0.08457957953214645,
-0.03659914433956146,
0.0030011851340532303,
0.043397143483161926,
0.1715821921825409,
-0.07351231575012207,
-0.06292938441038132,
-0.11383823305368423,
0.016762780025601387,
0.12365713715553284,
-0.08161772787570953,
0.049303922802209854,
-0.0022671690676361322,
0.04244229570031166,
0.0015013479860499501,
-0.1427748054265976,
0.07262444496154785,
-0.07005642354488373,
-0.04064960032701492,
-0.027790963649749756,
0.1432553231716156,
-0.011645873077213764,
0.011579630896449089,
0.05319203436374664,
-0.04870441555976868,
-0.04383443295955658,
-0.14384087920188904,
-0.1161133423447609,
-0.04881985858082771,
0.033696502447128296,
0.08961587399244308,
-0.11715870350599289,
0.029529159888625145,
-0.0052720955573022366,
-0.008179075084626675,
0.22145691514015198,
0.12941068410873413,
-0.04822239652276039,
0.032253071665763855,
0.11485449969768524,
-0.07899985462427139,
-0.2672637403011322,
0.00554982665926218,
-0.027966562658548355,
0.08542685955762863,
0.004609330091625452,
-0.14212585985660553,
0.10311803221702576,
-0.019678859040141106,
0.03752434626221657,
0.0248886626213789,
-0.27122822403907776,
-0.11012888699769974,
0.11492665112018585,
0.11522994935512543,
0.08460487425327301,
-0.12322203069925308,
-0.05659342557191849,
-0.09564956277608871,
-0.19379974901676178,
0.16790954768657684,
-0.11451821029186249,
0.10256178677082062,
-0.0052902293391525745,
0.04399357736110687,
0.018239352852106094,
-0.0466342531144619,
0.12278670817613602,
0.04201126843690872,
0.08341233432292938,
-0.008543005213141441,
-0.10032416135072708,
0.17040584981441498,
-0.01867067441344261,
0.11220073699951172,
-0.08824165910482407,
0.08504272997379303,
-0.17839470505714417,
-0.035222869366407394,
-0.021953612565994263,
0.045664578676223755,
-0.017060687765479088,
-0.0458071306347847,
-0.0899902731180191,
-0.004116120282560587,
0.02494785375893116,
0.015931516885757446,
0.11334220319986343,
-0.048291075974702835,
0.008058705367147923,
0.075996033847332,
0.16127347946166992,
-0.03099261224269867,
-0.059409335255622864,
0.047708481550216675,
0.007134800776839256,
0.11628848314285278,
-0.22146719694137573,
0.0920521542429924,
0.1318332701921463,
0.03741365671157837,
0.10753737390041351,
0.09304569661617279,
-0.023421760648489,
0.04363929107785225,
0.08725660294294357,
-0.1357865333557129,
-0.05474546551704407,
-0.04738108441233635,
-0.0857178196310997,
-0.006271661259233952,
0.08465132862329483,
0.14221243560314178,
-0.046860672533512115,
-0.008055498823523521,
0.0024764672853052616,
-0.05325205996632576,
-0.12805134057998657,
0.1257031410932541,
0.04701004549860954,
0.06731367856264114,
-0.10047090798616409,
0.06846020370721817,
0.04896087571978569,
-0.1326725035905838,
-0.04175238683819771,
0.08562297374010086,
-0.14532500505447388,
-0.07317754626274109,
-0.03527069091796875,
0.22880540788173676,
-0.13740025460720062,
-0.08184788376092911,
-0.14144861698150635,
-0.08186229318380356,
-0.003932823892682791,
0.2584522068500519,
0.10829239338636398,
0.10322859138250351,
-0.03385571390390396,
0.0019451250554993749,
-0.09132213145494461,
0.042857758700847626,
0.07591713219881058,
0.023432858288288116,
-0.07887592166662216,
0.09357251226902008,
0.00319359521381557,
0.1446934938430786,
-0.06400734186172485,
-0.03318054601550102,
-0.1857101321220398,
0.07843168824911118,
-0.12199167162179947,
0.0625651478767395,
-0.06233898922801018,
0.01578798145055771,
0.015040429309010506,
0.00789504125714302,
-0.03267901763319969,
0.04242558777332306,
-0.09054365754127502,
0.023289857432246208,
0.00795282144099474,
0.0683976262807846,
-0.08418512344360352,
0.005268055479973555,
0.08498978614807129,
-0.06469041854143143,
0.11222489178180695,
0.045508116483688354,
-0.05197535455226898,
0.11751148849725723,
-0.21254323422908783,
-0.021524062380194664,
0.04473503306508064,
0.025797277688980103,
0.03429370000958443,
-0.02742066979408264,
0.037714213132858276,
0.02402804233133793,
0.03409062698483467,
0.002848838921636343,
0.09705730527639389,
-0.10640370845794678,
-0.10301754623651505,
-0.05190474912524223,
-0.09168237447738647,
-0.040336158126592636,
0.012187041342258453,
0.03838389739394188,
0.09586066752672195,
0.08777391910552979,
-0.02548164874315262,
0.030919259414076805,
-0.06417835503816605,
-0.02023305743932724,
0.033302366733551025,
-0.06298302114009857,
-0.09378224611282349,
-0.10228195786476135,
0.03519894555211067,
-0.06515844166278839,
0.22104088962078094,
-0.05954306200146675,
0.14334720373153687,
-0.0032816773746162653,
0.021011536940932274,
0.06729423999786377,
0.06942413002252579,
0.25972533226013184,
0.003210689639672637,
0.04785655066370964,
-0.05112087354063988,
0.0737200677394867,
0.032204076647758484,
0.040531326085329056,
0.10899107158184052,
0.07629990577697754,
-0.04868442565202713,
0.1256798654794693,
0.013728396035730839,
0.036458540707826614,
-0.01961296610534191,
-0.08703368902206421,
0.04004120081663132,
0.03366030752658844,
-0.03384467959403992,
0.13189849257469177,
0.11843791604042053,
-0.09360231459140778,
0.07891932129859924,
0.014241448603570461,
-0.10595059394836426,
-0.0477767214179039,
0.011364496313035488,
-0.05199091136455536,
-0.14163370430469513,
0.017065810039639473,
-0.10497719049453735,
-0.07921837270259857,
0.09231778979301453,
0.030452562496066093,
-0.03864917531609535,
0.2344799041748047,
-0.015705531463027,
-0.057589076459407806,
0.055594127625226974,
-0.009082184173166752,
0.013506078161299229,
-0.00326861091889441,
0.05332007631659508,
0.00806565210223198,
-0.0376075878739357,
0.006016661413013935,
0.035378217697143555,
-0.02274213172495365,
0.01673859916627407,
-0.05813964456319809,
-0.033100858330726624,
-0.040699802339076996,
0.051609866321086884,
-0.005623572506010532,
0.01865442469716072,
0.01872788369655609,
-0.01674646884202957,
-0.014885497279465199,
0.17298458516597748,
-0.04632297903299332,
-0.07044569402933121,
-0.15173625946044922,
0.16863836348056793,
0.021283699199557304,
0.05570198595523834,
0.005619633477181196,
-0.06654762476682663,
-0.024794938042759895,
0.2670011818408966,
0.19833798706531525,
-0.07275871187448502,
0.01861605793237686,
0.0016477067256346345,
0.02291586995124817,
0.0051046451553702354,
0.13921625912189484,
0.015283497981727123,
0.2089303731918335,
-0.027987966313958168,
-0.10493713617324829,
-0.05486796796321869,
-0.056696806102991104,
0.025577057152986526,
0.11138981580734253,
0.01211540400981903,
-0.05688358098268509,
-0.04853428900241852,
0.08696472644805908,
-0.16312535107135773,
-0.11187069863080978,
0.04942057654261589,
-0.15237054228782654,
-0.06189565360546112,
-0.06392054259777069,
0.0063404967077076435,
-0.012452790513634682,
0.035046182572841644,
-0.04529409483075142,
-0.028907887637615204,
0.07406366616487503,
0.026327837258577347,
-0.16229596734046936,
-0.08163175731897354,
0.05851346254348755,
-0.08085206151008606,
0.1312771737575531,
-0.028510650619864464,
0.1402822881937027,
0.08955686539411545,
0.06442246586084366,
-0.006923523265868425,
0.022352125495672226,
0.08128198236227036,
-0.017734695225954056,
0.05185769498348236,
0.06326262652873993,
-0.026739347726106644,
0.15325099229812622,
-0.042331263422966,
-0.11646414548158646,
0.05835287272930145,
-0.025329411029815674,
-0.006933559197932482,
-0.11488287150859833,
-0.04259075969457626,
-0.09701775014400482,
0.10439196228981018,
0.15575550496578217,
-0.045356858521699905,
0.003151016077026725,
-0.05199483036994934,
0.10708107054233551,
0.02051066979765892,
-0.013226657174527645,
-0.07017267495393753,
-0.153564915060997,
-0.012761110439896584,
0.010145648382604122,
-0.01851622387766838,
-0.19314135611057281,
-0.012671343982219696,
-0.0687597468495369,
0.0052902125753462315,
-0.028680037707090378,
0.12796460092067719,
0.10686971247196198,
0.03130332753062248,
-0.022534204646945,
-0.15449628233909607,
-0.010270664468407631,
0.06629201769828796,
-0.1293625831604004,
-0.1517159342765808
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/csharp/base_model.ipynb).
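Continuing from the pipeline constructed above, it also accepts a list of inputs, so several tokenized functions can be summarized in one batch. The second function below is an illustrative placeholder, not an example from the original dataset:
```python
# Assumes `pipeline` and `tokenized_code` from the snippet above.
another_function = "public static int Add ( int a , int b ) { return a + b ; }"
print(pipeline([tokenized_code, another_function]))
```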
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
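The quoted parameter count matches the standard t5-base configuration and can be sanity-checked after loading the checkpoint; a quick sketch:
```python
from transformers import AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # expected to be roughly 220M
```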
## Evaluation results
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
62,
146
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 160,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14978887140750885,
0.004209029488265514,
-0.0006039707805030048,
0.12856000661849976,
0.11313538998365402,
0.011521239764988422,
0.06561128795146942,
0.04737483710050583,
-0.04509666562080383,
0.03318491205573082,
0.04755651205778122,
0.015230391174554825,
0.04905352741479874,
0.20519571006298065,
0.011930340901017189,
-0.16285689175128937,
-0.00569401029497385,
0.026842571794986725,
-0.09498365968465805,
0.12588879466056824,
0.09661111980676651,
-0.07928673923015594,
0.04500735178589821,
-0.039412081241607666,
-0.2151782065629959,
0.048350755125284195,
-0.0023916626814752817,
-0.0748024433851242,
0.10709132999181747,
0.06733804941177368,
0.13676577806472778,
0.0044248998165130615,
0.0347251258790493,
-0.1237117126584053,
0.010583887808024883,
0.024679886177182198,
0.035542942583560944,
0.02762458100914955,
0.061906035989522934,
0.028046146035194397,
0.15971358120441437,
-0.0021776265930384398,
0.04045866057276726,
0.06373141705989838,
-0.06426970660686493,
-0.10254150629043579,
-0.025611475110054016,
0.040197741240262985,
0.05767524242401123,
0.09340080618858337,
-0.008317718282341957,
0.09655911475419998,
-0.15091297030448914,
0.11434225738048553,
0.09873737394809723,
-0.24859552085399628,
-0.010391592979431152,
0.0925142839550972,
0.06119600310921669,
0.06429797410964966,
-0.04282182455062866,
-0.04710110276937485,
0.08639675378799438,
0.04563082009553909,
0.042038705199956894,
-0.07524891197681427,
-0.08194129168987274,
0.027060389518737793,
-0.1017349362373352,
-0.07473500818014145,
0.21317383646965027,
0.010946962982416153,
-0.07354803383350372,
-0.06954323500394821,
-0.049065910279750824,
-0.13104186952114105,
0.02755635790526867,
0.043282121419906616,
0.0005995309329591691,
-0.022364340722560883,
-0.0013768927892670035,
0.04391017556190491,
-0.09734907746315002,
-0.13617059588432312,
0.016319774091243744,
0.08283838629722595,
0.05080167204141617,
0.03721150383353233,
-0.060534749180078506,
0.11157641559839249,
0.03467999026179314,
-0.039752308279275894,
-0.014926852658390999,
-0.03025195375084877,
-0.12232260406017303,
0.03935291990637779,
-0.06516454368829727,
-0.181411474943161,
-0.009813623502850533,
0.01909252628684044,
-0.0355103574693203,
0.0562235489487648,
0.053185053169727325,
0.02612380124628544,
0.020883502438664436,
0.20528045296669006,
0.008453551679849625,
-0.09278836846351624,
0.052353762090206146,
0.0440993569791317,
-0.043135058134794235,
-0.026466824114322662,
-0.0625477060675621,
-0.07401416450738907,
0.05108233913779259,
0.09945996850728989,
-0.14199738204479218,
0.0486060306429863,
-0.06769997626543045,
-0.041968367993831635,
0.02043069712817669,
-0.16591058671474457,
-0.007086632773280144,
0.02918298728764057,
-0.05707572028040886,
-0.025174066424369812,
0.09273228794336319,
-0.15368422865867615,
-0.14454051852226257,
-0.01371398288756609,
-0.08108843863010406,
-0.04972322657704353,
-0.14888302981853485,
-0.13368642330169678,
-0.008923247456550598,
-0.04342806339263916,
-0.004023718647658825,
-0.07490555197000504,
-0.13155563175678253,
-0.015425021760165691,
0.01948702149093151,
0.01093783788383007,
0.000988240703009069,
-0.07525140047073364,
-0.011216048151254654,
-0.01002748403698206,
-0.03160775080323219,
0.003425981616601348,
-0.045465875416994095,
0.11071360856294632,
0.0896126851439476,
0.03780517727136612,
-0.01604783721268177,
0.049448128789663315,
-0.06937413662672043,
0.07021427899599075,
-0.12779149413108826,
0.11019077897071838,
-0.0764259546995163,
0.0753372386097908,
-0.038245100528001785,
-0.10355883091688156,
0.04097194969654083,
0.050498057156801224,
0.05888470262289047,
0.05864378809928894,
-0.1488509476184845,
-0.03842785954475403,
0.19750431180000305,
-0.12955152988433838,
-0.1075507178902626,
0.10128354281187057,
-0.04270472750067711,
0.03148701786994934,
0.08394808322191238,
0.12156345695257187,
0.13221260905265808,
-0.039566557854413986,
-0.0012476036790758371,
0.060485318303108215,
0.04163029417395592,
-0.11768253147602081,
0.08958841115236282,
0.039900075644254684,
-0.08619167655706406,
0.05820372328162193,
-0.018137428909540176,
0.0995616763830185,
-0.01644315756857395,
-0.03807421028614044,
-0.0433105044066906,
-0.07145810127258301,
0.002009810646995902,
0.0007806318462826312,
0.0636870414018631,
-0.06481321156024933,
-0.07121314853429794,
0.058368463069200516,
0.15760286152362823,
-0.1327419877052307,
-0.0006818257970735431,
-0.09097638726234436,
0.053635552525520325,
-0.07157134264707565,
0.01777629926800728,
-0.17138215899467468,
0.006241961382329464,
0.07031901925802231,
-0.025015713647007942,
0.060918744653463364,
0.09682182967662811,
0.008280377835035324,
0.06395615637302399,
0.002061424544081092,
-0.006859437562525272,
-0.09453171491622925,
-0.05750971660017967,
-0.06189043074846268,
-0.060898859053850174,
-0.09948921948671341,
-0.04268581420183182,
0.025233952328562737,
-0.1792658269405365,
0.011924457736313343,
0.03522895276546478,
0.03125045821070671,
0.004304523579776287,
-0.014954287558794022,
0.018357038497924805,
0.07110121846199036,
-0.049562375992536545,
-0.03962716832756996,
0.027281086891889572,
0.01931440830230713,
-0.07791925966739655,
-0.020637283101677895,
-0.10392549633979797,
-0.010735255666077137,
0.11562058329582214,
0.04430023953318596,
-0.0810108333826065,
0.0064110043458640575,
-0.023685133084654808,
-0.03936183825135231,
0.023521663621068,
-0.06425201892852783,
0.16917255520820618,
0.003659185254946351,
0.20423398911952972,
-0.1584685742855072,
-0.03059418499469757,
-0.01990148425102234,
0.033173494040966034,
0.04522475600242615,
0.12645608186721802,
0.029315773397684097,
-0.10618089884519577,
0.054329898208379745,
0.013365170918405056,
-0.05578184127807617,
0.21412965655326843,
-0.056065209209918976,
-0.10519789159297943,
0.020681550726294518,
0.10285264253616333,
-0.01799943670630455,
0.1491512805223465,
-0.15226121246814728,
-0.0245572067797184,
0.01340559870004654,
0.00021078597637824714,
0.06819584965705872,
-0.1427098512649536,
0.009642967954277992,
0.02649017609655857,
-0.06686282157897949,
-0.10948022454977036,
-0.016724718734622,
-0.004895165096968412,
0.03739221766591072,
0.002297871047630906,
-0.02072649449110031,
0.01917543075978756,
-0.04098097234964371,
-0.1002945676445961,
0.22549520432949066,
-0.11029043793678284,
-0.23659400641918182,
-0.2100221961736679,
0.12450126558542252,
-0.04471205174922943,
0.003481611842289567,
0.013114847242832184,
-0.08991347253322601,
-0.06748700886964798,
-0.06366842985153198,
0.17346839606761932,
-0.07406570017337799,
-0.009716254658997059,
-0.011395944282412529,
0.05783350020647049,
0.0151256388053298,
-0.20725803077220917,
0.034463994204998016,
-0.0142104122787714,
-0.022107064723968506,
-0.0049607520923018456,
-0.09402824193239212,
0.08651286363601685,
0.1759188324213028,
-0.06387772411108017,
0.02172565646469593,
-0.00870414637029171,
0.19206643104553223,
-0.03705098107457161,
-0.06302612274885178,
0.12552420794963837,
-0.011113384738564491,
0.014012055471539497,
0.020197022706270218,
0.0019995681941509247,
-0.08886086940765381,
0.04707399383187294,
-0.0011382492957636714,
-0.03322272375226021,
-0.2717477083206177,
-0.030466344207525253,
-0.08295833319425583,
0.047445498406887054,
0.03616473078727722,
0.0453370064496994,
-0.07644543051719666,
0.031045254319906235,
0.03143887221813202,
0.1257912814617157,
-0.009515264071524143,
0.036768727004528046,
0.07016294449567795,
-0.00117562769446522,
0.016291825100779533,
-0.09956075996160507,
-0.006767901126295328,
0.07728665322065353,
0.08496790379285812,
0.2718698978424072,
-0.09876475483179092,
0.22636352479457855,
0.03717146813869476,
0.06889797002077103,
0.055649518966674805,
0.16159453988075256,
-0.10364282131195068,
0.0244526918977499,
0.0015778116649016738,
-0.0186142735183239,
-0.12550656497478485,
0.02764483354985714,
-0.024832461029291153,
0.05995141714811325,
-0.12071788311004639,
-0.0382624976336956,
0.007837772369384766,
0.18494389951229095,
0.028595808893442154,
-0.22857314348220825,
-0.12150053679943085,
0.010838177055120468,
-0.09530031681060791,
-0.09283889830112457,
0.060858581215143204,
0.22693102061748505,
-0.07997044175863266,
-0.02042592503130436,
-0.01827869936823845,
0.1261567324399948,
-0.04947290197014809,
-0.037724971771240234,
-0.040289152413606644,
0.06381239742040634,
0.019851619377732277,
0.12782804667949677,
-0.21830853819847107,
0.14517498016357422,
-0.005340880714356899,
0.058467332273721695,
-0.04208378493785858,
0.06477566063404083,
-0.035447340458631516,
0.053856056183576584,
0.04512109234929085,
-0.011184200644493103,
-0.02879011444747448,
-0.1929965615272522,
-0.012467315420508385,
0.029756126925349236,
0.056043993681669235,
0.029031913727521896,
0.06829451769590378,
-0.003895072266459465,
0.029487425461411476,
-0.002824228024110198,
-0.1276278793811798,
-0.06078196316957474,
-0.11647588759660721,
-0.0053609865717589855,
-0.03830750286579132,
-0.01645837351679802,
-0.06166629493236542,
-0.024343201890587807,
0.06049654632806778,
0.1811283975839615,
-0.10013725608587265,
-0.09035460650920868,
-0.08804330229759216,
0.03833755850791931,
0.14525499939918518,
-0.07684415578842163,
0.06939690560102463,
-0.005067064426839352,
0.04836619645357132,
-0.0010013668797910213,
-0.09691014885902405,
0.05880937725305557,
-0.03410843759775162,
-0.07906248420476913,
-0.022753577679395676,
0.11872082203626633,
0.001976830419152975,
0.023424340412020683,
-0.006197531707584858,
-0.06995531171560287,
-0.03348013758659363,
-0.12523330748081207,
-0.12398376315832138,
-0.023903360590338707,
0.010521936230361462,
0.06356239318847656,
-0.12159968167543411,
-0.028902383521199226,
-0.0040380340069532394,
-0.02422194927930832,
0.13668321073055267,
0.15421053767204285,
-0.07265716046094894,
0.045348942279815674,
0.12118265777826309,
-0.04521308094263077,
-0.18696729838848114,
0.0057816156186163425,
0.060664355754852295,
0.11441829800605774,
-0.04666832461953163,
-0.17643015086650848,
0.05656037852168083,
0.019080854952335358,
0.039125971496105194,
0.043237123638391495,
-0.3303239345550537,
-0.12362798303365707,
0.05899176001548767,
0.12604226171970367,
0.0548073947429657,
-0.10177920013666153,
-0.04383557289838791,
-0.058403268456459045,
-0.12425561249256134,
0.10779223591089249,
-0.005615636706352234,
0.1358836442232132,
-0.03881426900625229,
0.031590577214956284,
0.0280311219394207,
-0.04819604754447937,
0.07798253744840622,
0.030895480886101723,
0.09279871731996536,
-0.02769545279443264,
0.02572724036872387,
0.14456813037395477,
-0.02496485225856304,
0.15886269509792328,
-0.13216592371463776,
0.10363009572029114,
-0.20307569205760956,
-0.07425690442323685,
-0.06905995309352875,
0.01495751366019249,
-0.03623006120324135,
-0.031717699021101,
-0.07867344468832016,
0.020853262394666672,
0.000011602637641772162,
-0.012856104411184788,
0.02124389074742794,
-0.03339799866080284,
-0.021654944866895676,
0.09074613451957703,
0.10750193148851395,
-0.018628934398293495,
-0.08636215329170227,
0.03656989336013794,
0.04372803121805191,
0.09800425171852112,
-0.20384854078292847,
0.028215181082487106,
0.12130148708820343,
0.01971142552793026,
0.11357644945383072,
0.05401061475276947,
-0.10606393218040466,
0.03784860298037529,
0.09167330712080002,
-0.08164213597774506,
-0.08430236577987671,
-0.021787680685520172,
-0.07046538591384888,
-0.0641506090760231,
0.061004605144262314,
0.08991653472185135,
-0.0434047095477581,
-0.019719094038009644,
-0.0256717000156641,
-0.03767320141196251,
-0.10116228461265564,
0.20196150243282318,
0.060130588710308075,
0.08416029065847397,
-0.07953095436096191,
0.0715678408741951,
0.08005489408969879,
-0.10664073377847672,
-0.00006050320735084824,
0.1823747158050537,
-0.11325319856405258,
-0.04182834178209305,
0.022310525178909302,
0.14844061434268951,
-0.04311058670282364,
-0.05385468527674675,
-0.1234298124909401,
-0.08096915483474731,
0.04017212614417076,
0.17196884751319885,
0.08185779303312302,
0.11578483879566193,
-0.04903062805533409,
-0.0008896724903024733,
-0.08550933003425598,
0.07199268043041229,
0.07579854875802994,
0.033253055065870285,
-0.11313851922750473,
0.155775249004364,
0.0415448397397995,
0.11112434417009354,
-0.031387340277433395,
-0.014496579766273499,
-0.1000150591135025,
0.05280096083879471,
-0.08845742791891098,
0.03213229775428772,
-0.017423076555132866,
0.044937536120414734,
-0.022238925099372864,
0.0048575615510344505,
-0.023676259443163872,
0.06831911951303482,
-0.08799305558204651,
-0.004648057743906975,
-0.010484798811376095,
0.03768710419535637,
-0.05561003088951111,
-0.006212570238858461,
0.037217482924461365,
-0.09166225045919418,
0.1275271624326706,
-0.007923949509859085,
-0.03179321065545082,
0.08615581691265106,
-0.057187050580978394,
0.04180367290973663,
0.011451785452663898,
0.05522293224930763,
0.007525894325226545,
0.04197954759001732,
0.08322260528802872,
0.031785957515239716,
0.04341746121644974,
0.009026406332850456,
0.08539676666259766,
-0.13626724481582642,
-0.11212223023176193,
-0.0435415655374527,
-0.0953274816274643,
-0.06246257200837135,
0.08816574513912201,
0.06907317042350769,
0.0936480239033699,
0.08810213208198547,
-0.027003217488527298,
0.015399311669170856,
-0.13329604268074036,
-0.058299772441387177,
0.023003553971648216,
-0.031600818037986755,
-0.08779974281787872,
-0.05918959528207779,
0.05110256001353264,
-0.03054876998066902,
0.14455930888652802,
-0.01964404806494713,
0.05484901741147041,
-0.02564014308154583,
-0.036673404276371,
0.046065736562013626,
0.03969401866197586,
0.2361191362142563,
-0.052404969930648804,
0.03561147674918175,
0.0021560497116297483,
0.008120658807456493,
0.0027696595061570406,
0.1177711933851242,
0.11215953528881073,
0.13321161270141602,
-0.020278682932257652,
0.10635555535554886,
0.023072948679327965,
-0.009045113809406757,
-0.08943673968315125,
0.01110094878822565,
-0.00556524284183979,
0.06647223979234695,
-0.06379640847444534,
0.17779690027236938,
0.06978578120470047,
-0.10697493702173233,
0.10338813811540604,
0.025333434343338013,
-0.13017144799232483,
-0.04541964828968048,
-0.0035085545387119055,
-0.030525073409080505,
-0.14048588275909424,
0.029570573940873146,
-0.11836884915828705,
-0.02240057848393917,
0.08441333472728729,
0.05214479938149452,
-0.0643157884478569,
0.1896246075630188,
0.019441407173871994,
-0.05637294054031372,
0.059014469385147095,
0.007360091898590326,
0.03436604142189026,
0.044030629098415375,
0.013339069671928883,
0.045596715062856674,
-0.03322548419237137,
0.03977636620402336,
0.01471270713955164,
-0.021717099472880363,
0.00654846103861928,
-0.009884590283036232,
-0.0013532666489481926,
-0.02446100115776062,
0.023531334474682808,
0.05017333850264549,
0.14307664334774017,
0.02741129882633686,
-0.07025685161352158,
-0.04002824425697327,
0.16080236434936523,
-0.04288005456328392,
-0.07775459438562393,
-0.13727621734142303,
0.1596929132938385,
0.04260818660259247,
0.023771371692419052,
0.024113480001688004,
-0.09139692038297653,
-0.05116795003414154,
0.19986240565776825,
0.0796581506729126,
-0.02295604720711708,
-0.02177988365292549,
-0.001487475703470409,
-0.006228206213563681,
-0.04634108021855354,
0.20328931510448456,
0.025350332260131836,
0.2511121928691864,
0.007199855521321297,
-0.02429760806262493,
-0.061327047646045685,
-0.0294597577303648,
-0.012713760137557983,
0.14414945244789124,
-0.05295231565833092,
-0.031963299959897995,
-0.07315191626548767,
0.0057272715494036674,
0.005138433072715998,
-0.09961947798728943,
0.08748842030763626,
-0.13255371153354645,
-0.09234793484210968,
-0.03655770421028137,
0.026041371747851372,
-0.03658087179064751,
0.019740307703614235,
-0.028107861056923866,
0.037091903388500214,
0.08351133018732071,
-0.01606312207877636,
-0.12281597405672073,
-0.14157211780548096,
0.08179759234189987,
-0.07413366436958313,
0.14528478682041168,
-0.008012428879737854,
0.1294432282447815,
0.09411310404539108,
0.056779634207487106,
-0.049868930131196976,
0.09826438874006271,
0.045283496379852295,
0.02427007257938385,
0.04615994542837143,
0.12112060934305191,
-0.04434174299240112,
0.1593870222568512,
-0.05634701997041702,
-0.04113916680216789,
0.004761179443448782,
-0.08083031326532364,
-0.004891946911811829,
-0.15618859231472015,
-0.014494609087705612,
-0.10420254617929459,
0.10478340089321136,
0.19462260603904724,
-0.04164058342576027,
-0.030964668840169907,
-0.08713405579328537,
0.08779007196426392,
-0.021640479564666748,
0.05104872211813927,
-0.03247784078121185,
-0.19375568628311157,
0.0020077780354768038,
-0.0003351690829731524,
0.021798277273774147,
-0.2393992394208908,
-0.015728021040558815,
-0.04169195890426636,
-0.023867402225732803,
-0.06844543665647507,
0.15755854547023773,
0.0800454318523407,
0.04614271596074104,
-0.0302467979490757,
-0.12787070870399475,
-0.032714102417230606,
0.06245645880699158,
-0.1389462947845459,
-0.12596376240253448
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the csharp code snippets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/csharp/base_model.ipynb).
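Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. If the import above fails, the following sketch (assuming a recent `transformers` version) loads the same checkpoint through the task-specific `AutoModelForSeq2SeqLM` class instead:
```python
# Sketch for newer transformers releases, where AutoModelWithLMHead
# has been replaced by task-specific classes such as AutoModelForSeq2SeqLM.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

model_name = "SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask_finetune"
pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained(model_name),
    tokenizer=AutoTokenizer.from_pretrained(model_name, skip_special_tokens=True),
    device=0,  # use device=-1 to run on CPU
)
```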
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
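As a minimal sketch of the schedule named above (the warmup length and base rate are illustrative assumptions, not documented hyperparameters of this model), an inverse square root learning rate can be written as:
```python
def inverse_sqrt_lr(step: int, base_lr: float = 0.01, warmup_steps: int = 10_000) -> float:
    """Inverse square root schedule: hold base_lr during warmup, then decay
    proportionally to 1/sqrt(step). base_lr and warmup_steps are illustrative
    assumptions, not the exact values used for this model."""
    step = max(step, 1)
    if step < warmup_steps:
        return base_lr
    return base_lr * (warmup_steps / step) ** 0.5
```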
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
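To compare a fine-tuned model against this table, BLEU can be computed over generated summaries with a standard scorer. A minimal sketch using the `sacrebleu` package (an assumption — the exact evaluation script behind these numbers may differ):
```python
import sacrebleu  # pip install sacrebleu

# Hypotheses are model outputs; references are the ground-truth summaries.
hypotheses = ["parses a unix timestamp into a local datetime"]
references = [["converts a unix time stamp to a local datetime"]]  # one reference stream

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {score.score:.2f}")
```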
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_csharp_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the csharp code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
62,
88,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10474178194999695,
0.05668235197663307,
-0.001562343561090529,
0.10096552222967148,
0.057778842747211456,
0.0175760667771101,
0.07194986194372177,
0.07982325553894043,
-0.055481743067502975,
0.06762532144784927,
0.076026551425457,
-0.026202989742159843,
0.06981603801250458,
0.20513884723186493,
0.03702585771679878,
-0.1617477685213089,
-0.012491590343415737,
0.03269600868225098,
-0.04779019579291344,
0.11121316999197006,
0.0919022485613823,
-0.09909455478191376,
0.06036289036273956,
-0.04008405655622482,
-0.1363920420408249,
0.055575381964445114,
-0.04858116805553436,
-0.04432998225092888,
0.08105643838644028,
0.05930967628955841,
0.12015784531831741,
-0.01341877318918705,
0.08938194811344147,
-0.1943499594926834,
0.0007348547806032002,
0.023006822913885117,
0.03398904949426651,
0.028597788885235786,
0.07936225086450577,
0.0627308264374733,
0.14481745660305023,
-0.030953967943787575,
0.02883295901119709,
0.05320855602622032,
-0.06302103400230408,
-0.06001884117722511,
-0.04149601608514786,
0.08802751451730728,
0.1424146443605423,
0.08468984812498093,
-0.015394781716167927,
0.015128633938729763,
-0.07694631069898605,
0.08691750466823578,
0.11731897294521332,
-0.2231566160917282,
-0.027001483365893364,
0.06900546699762344,
0.07676646113395691,
0.06510443985462189,
-0.05867316946387291,
-0.03815045952796936,
0.10052178800106049,
0.03202301263809204,
0.05283662676811218,
-0.0829893946647644,
-0.05549386516213417,
-0.010387959890067577,
-0.0766570046544075,
-0.0691680759191513,
0.13994896411895752,
0.04191271960735321,
-0.05497496947646141,
-0.09887085109949112,
-0.04833555594086647,
-0.20696847140789032,
0.046611636877059937,
0.024115486070513725,
0.016463628038764,
-0.01968798041343689,
0.04781758785247803,
0.0004323774774093181,
-0.11836429685354233,
-0.11367548257112503,
0.021391892805695534,
0.057646337896585464,
0.06769327074289322,
0.027311306446790695,
-0.011376740410923958,
0.09059219807386398,
0.02630487270653248,
-0.03823386877775192,
-0.011477732099592686,
0.014821761287748814,
-0.13847778737545013,
0.025668850168585777,
-0.03162872791290283,
-0.08758175373077393,
-0.02808309905230999,
0.0810873806476593,
-0.030749905854463577,
0.06172391399741173,
0.14785389602184296,
0.011491742916405201,
-0.00025734055088832974,
0.21136224269866943,
0.014381554909050465,
-0.10266278684139252,
0.0003019029973074794,
0.03338317945599556,
-0.014939775690436363,
-0.011575938202440739,
-0.0705709308385849,
-0.042035140097141266,
-0.0012567938538268209,
0.07346461713314056,
-0.12470417469739914,
0.008632026612758636,
-0.04601838067173958,
-0.020912444218993187,
0.06385404616594315,
-0.13460597395896912,
0.022093018516898155,
0.021556710824370384,
-0.03928278014063835,
-0.06066164746880531,
0.060978636145591736,
-0.10856082290410995,
-0.12705984711647034,
0.027236994355916977,
-0.04492888227105141,
-0.030125433579087257,
-0.1280168890953064,
-0.11652372777462006,
-0.008284241892397404,
-0.07472509890794754,
0.007649582345038652,
-0.10121019929647446,
-0.07575478404760361,
-0.020739754661917686,
0.02843547612428665,
0.0025380500592291355,
-0.013422212563455105,
-0.054946258664131165,
0.016886861994862556,
-0.010219643823802471,
-0.019710645079612732,
0.028591161593794823,
-0.03000039979815483,
0.09428185969591141,
0.08405160158872604,
0.05421255901455879,
0.014165231958031654,
0.029546812176704407,
-0.05859729275107384,
0.06817121058702469,
-0.042875953018665314,
0.055674731731414795,
-0.017678525298833847,
0.05464668199419975,
-0.0894157811999321,
-0.08182632923126221,
0.05521034076809883,
0.04251839965581894,
0.0554443784058094,
0.02411721460521221,
-0.11236927658319473,
0.01240924559533596,
0.1411365121603012,
-0.094173863530159,
-0.12258756160736084,
0.09990645200014114,
0.00554249994456768,
-0.00016480607155244797,
0.05125196650624275,
0.1153365820646286,
0.14699822664260864,
-0.08991266787052155,
-0.03506355732679367,
0.08336294442415237,
0.062318868935108185,
-0.08380930870771408,
0.1043885126709938,
0.01798464171588421,
0.0404728539288044,
0.021292615681886673,
0.029421046376228333,
0.06745177507400513,
-0.01622619666159153,
-0.04379472881555557,
-0.013259678147733212,
-0.09204255789518356,
-0.0414351187646389,
-0.01827075332403183,
0.03837387263774872,
-0.04630483686923981,
-0.055531084537506104,
-0.008054046891629696,
0.15990017354488373,
-0.10337091237306595,
0.02695826254785061,
-0.08954481035470963,
-0.02802349254488945,
-0.11354918777942657,
0.012123335152864456,
-0.1173885241150856,
0.024882609024643898,
0.05556177347898483,
-0.061848536133766174,
0.06272850930690765,
0.0784287303686142,
-0.0036997192073613405,
0.04512335732579231,
-0.028742078691720963,
-0.046070732176303864,
-0.05868517979979515,
-0.057102005928754807,
-0.13098566234111786,
-0.01647845469415188,
-0.11953578889369965,
-0.030319591984152794,
-0.05300067737698555,
-0.15140384435653687,
0.01932428404688835,
-0.008751115761697292,
0.02992573380470276,
0.002565972041338682,
-0.024082442745566368,
0.034002870321273804,
0.06291763484477997,
-0.04198784753680229,
-0.07933647185564041,
0.01610158197581768,
0.016090495511889458,
-0.12785157561302185,
-0.03906873241066933,
-0.12735137343406677,
-0.04445640370249748,
0.06479280441999435,
0.06542885303497314,
-0.08395062386989594,
0.0074014803394675255,
-0.03407035395503044,
-0.055080950260162354,
-0.021083548665046692,
-0.06966666132211685,
0.15688222646713257,
0.004080366343259811,
0.16988666355609894,
-0.13546711206436157,
-0.05297260731458664,
-0.021273290738463402,
0.00857548601925373,
0.007725054863840342,
0.1519344002008438,
0.03685925528407097,
-0.06819675117731094,
0.03271276876330376,
0.019503438845276833,
-0.04410143196582794,
0.15473589301109314,
-0.010084075853228569,
-0.11691339313983917,
0.013114110566675663,
0.09909076988697052,
-0.013704124838113785,
0.13051268458366394,
-0.07099951803684235,
-0.015605765394866467,
0.00176758982706815,
0.008704797364771366,
0.04265614226460457,
-0.14150124788284302,
0.031297262758016586,
0.06342145055532455,
-0.061655931174755096,
-0.06413695216178894,
-0.03400964289903641,
-0.050190288573503494,
0.03772413730621338,
-0.004904826171696186,
-0.0011638444848358631,
-0.007049728184938431,
-0.03127199783921242,
-0.09738048166036606,
0.1973702758550644,
-0.08048232644796371,
-0.18991512060165405,
-0.18719707429409027,
0.07262259721755981,
-0.04596159979701042,
0.015234441496431828,
0.045569393783807755,
-0.0994352251291275,
-0.05714506283402443,
-0.09652425348758698,
0.1250385046005249,
-0.11323249340057373,
0.016340596601366997,
-0.00432930001989007,
0.03464294597506523,
0.03265047073364258,
-0.16842764616012573,
0.04186328873038292,
-0.004890471696853638,
-0.010481319390237331,
0.00592426722869277,
-0.047523874789476395,
0.10516756027936935,
0.13089805841445923,
-0.08401186764240265,
0.015096643008291721,
-0.005962478928267956,
0.18304462730884552,
-0.051751092076301575,
0.023842478170990944,
0.18356089293956757,
0.00816392246633768,
0.03269340097904205,
0.0371747761964798,
0.012733425945043564,
-0.09782330691814423,
0.055263176560401917,
0.03270719572901726,
-0.02998274192214012,
-0.24800117313861847,
0.0001737290876917541,
-0.06879254430532455,
0.039676833897829056,
0.11574181914329529,
0.04869038239121437,
-0.14151544868946075,
0.038740918040275574,
-0.013694160617887974,
0.1492190659046173,
-0.05115162581205368,
0.04850984737277031,
-0.008182625286281109,
0.009433201514184475,
0.0006193292792886496,
-0.09427926689386368,
0.00015042154700495303,
0.07127421349287033,
0.10070037841796875,
0.2194141000509262,
-0.05097529664635658,
0.2311326265335083,
0.014417600817978382,
0.07817068696022034,
0.0241418294608593,
0.13570722937583923,
-0.102081798017025,
0.006286948453634977,
0.011076078750193119,
-0.014824489131569862,
-0.07958713918924332,
0.06329205632209778,
0.010051077231764793,
0.05112157762050629,
-0.08338861912488937,
0.019354866817593575,
0.007966713979840279,
0.19731740653514862,
0.048712026327848434,
-0.1795097440481186,
-0.09954431653022766,
0.012855572625994682,
-0.09243200719356537,
-0.11057192832231522,
0.06128918379545212,
0.21069052815437317,
-0.038136158138513565,
-0.009615166112780571,
-0.009626710787415504,
0.13354219496250153,
-0.0706983134150505,
-0.030496446415781975,
0.01651306077837944,
0.045098766684532166,
0.02096550166606903,
0.1290617734193802,
-0.2389615774154663,
0.09833990782499313,
0.019787529483437538,
0.08910112082958221,
-0.03557970002293587,
0.07079888135194778,
-0.06373777240514755,
0.0043739620596170425,
0.08761415630578995,
0.002236811676993966,
-0.09081999957561493,
-0.22634272277355194,
-0.05679718777537346,
0.021677719429135323,
0.07708307355642319,
-0.03120412863790989,
0.08561187982559204,
0.0022117767948657274,
0.04997389391064644,
-0.029758622869849205,
-0.1239122673869133,
-0.06757019460201263,
-0.14227727055549622,
0.0027412313502281904,
0.012983485125005245,
-0.014374836347997189,
-0.03843790665268898,
0.01882033608853817,
-0.0010697698453441262,
0.19696229696273804,
-0.1845463663339615,
-0.096930593252182,
-0.09208943694829941,
0.05269697681069374,
0.1344507932662964,
-0.09283367544412613,
0.028116151690483093,
0.02345956861972809,
0.05217685177922249,
-0.038944728672504425,
-0.059598688036203384,
0.03309419006109238,
-0.05616644397377968,
-0.09598769247531891,
-0.028019635006785393,
0.09622547030448914,
-0.013146523386240005,
0.041416801512241364,
-0.008581720292568207,
-0.08445364236831665,
-0.04600534215569496,
-0.13051173090934753,
-0.06680193543434143,
-0.011574063450098038,
0.028419826179742813,
0.0022272304631769657,
-0.09879913926124573,
0.08115275949239731,
-0.007929774932563305,
-0.09069211781024933,
0.07622071355581284,
0.2201681286096573,
-0.06743507087230682,
0.021756689995527267,
0.11294587701559067,
-0.05929480493068695,
-0.16529934108257294,
-0.07084313035011292,
0.053094446659088135,
0.09419165551662445,
-0.025928404182195663,
-0.14953239262104034,
0.05842010676860809,
0.019408967345952988,
0.03202097490429878,
0.02791883796453476,
-0.29612496495246887,
-0.1360919326543808,
0.0475015752017498,
0.08585912734270096,
0.022560954093933105,
-0.11772064864635468,
-0.04280092194676399,
-0.0585460290312767,
-0.07957097142934799,
0.03552504628896713,
0.0669429823756218,
0.1359225958585739,
-0.04201853275299072,
0.003408603835850954,
0.025417793542146683,
-0.03339506685733795,
0.10711482912302017,
0.015599234960973263,
0.09776138514280319,
-0.01436617411673069,
0.027136851102113724,
0.0919620543718338,
-0.05296126380562782,
0.1446535438299179,
-0.15271803736686707,
0.08090768754482269,
-0.22600142657756805,
-0.06052980571985245,
-0.016212204471230507,
-0.012037524953484535,
-0.04526137188076973,
-0.05458109453320503,
-0.08955352008342743,
-0.008982610888779163,
0.053565580397844315,
-0.024844041094183922,
0.04327200725674629,
-0.03572461009025574,
-0.06344306468963623,
0.08481140434741974,
0.08786918967962265,
-0.027208350598812103,
-0.11023776233196259,
0.008327477611601353,
0.027212142944335938,
0.09280535578727722,
-0.17076121270656586,
0.016684051603078842,
0.13283266127109528,
0.008666676469147205,
0.09836445748806,
0.019572898745536804,
-0.06850635260343552,
0.043729688972234726,
0.07535319030284882,
-0.04482864961028099,
-0.10072784870862961,
-0.018093088641762733,
-0.03897927328944206,
-0.09116442501544952,
0.037854861468076706,
0.078360415995121,
-0.05451580137014389,
-0.025174405425786972,
-0.024672409519553185,
-0.011544681154191494,
-0.07434426248073578,
0.19464433193206787,
0.029464097693562508,
0.08754009008407593,
-0.05527940392494202,
0.08213373273611069,
0.10430408269166946,
-0.1286628693342209,
0.0028329817578196526,
0.14579971134662628,
-0.0777539312839508,
-0.037680406123399734,
0.06557717174291611,
0.07725010812282562,
-0.0485394150018692,
-0.0608111210167408,
-0.0996842011809349,
-0.06628486514091492,
0.02827450819313526,
0.033479247242212296,
0.06285644322633743,
0.082338348031044,
-0.0236209649592638,
0.014127478934824467,
-0.09825524687767029,
0.08695048838853836,
0.06267088651657104,
0.04600141569972038,
-0.13585539162158966,
0.17948120832443237,
0.022170310840010643,
0.10287906229496002,
-0.002203438663855195,
0.040521666407585144,
-0.07572993636131287,
0.046636588871479034,
-0.014420419931411743,
0.02460278570652008,
-0.012900701723992825,
0.03835947439074516,
-0.014935438521206379,
0.03691769391298294,
-0.02000707946717739,
0.05890180543065071,
-0.053590357303619385,
-0.026580167934298515,
-0.03564900532364845,
0.038394663482904434,
-0.05338987708091736,
-0.02020321786403656,
0.004707267973572016,
-0.0809771716594696,
0.10417506098747253,
-0.061285361647605896,
-0.013374005444347858,
-0.0021589857060462236,
-0.03125796094536781,
0.08058853447437286,
0.009828777052462101,
0.05239180848002434,
-0.021352186799049377,
0.012780437245965004,
0.04085433483123779,
0.024062199518084526,
-0.01937319152057171,
-0.015023994259536266,
0.03441951051354408,
-0.1482083648443222,
-0.09098353236913681,
-0.08480111509561539,
-0.05992033705115318,
-0.0727231577038765,
0.07799995690584183,
0.07738591730594635,
0.06382612138986588,
0.08027516305446625,
-0.023636505007743835,
0.00825207494199276,
-0.12639403343200684,
-0.03236253559589386,
0.056811925023794174,
0.001105980365537107,
-0.10221108794212341,
-0.0694565698504448,
0.05758509039878845,
-0.04152770712971687,
0.1171421930193901,
-0.03665999695658684,
0.04533892124891281,
-0.015070530585944653,
-0.05996996909379959,
0.004687988664954901,
0.023156611248850822,
0.22161003947257996,
-0.09688948094844818,
0.015044921077787876,
-0.003994880244135857,
-0.00488841300830245,
0.03483569249510765,
0.17552901804447174,
0.08772452920675278,
0.11439064145088196,
0.04728960990905762,
0.10986269265413284,
-0.05158022791147232,
-0.03822619095444679,
-0.15690888464450836,
0.03864220157265663,
0.006307993549853563,
0.05676525458693504,
-0.04144621267914772,
0.0908455029129982,
0.12296857684850693,
-0.1110607236623764,
0.07407944649457932,
0.024345148354768753,
-0.09827056527137756,
-0.03584209084510803,
-0.05039398372173309,
-0.04608019441366196,
-0.07604791224002838,
0.021155476570129395,
-0.09617757052183151,
0.0210137739777565,
0.061127398163080215,
0.03972569480538368,
-0.0218840092420578,
0.17404139041900635,
-0.010690689086914062,
-0.04012681171298027,
0.026540571823716164,
0.018550563603639603,
0.045001547783613205,
0.09265300631523132,
0.00219851010479033,
0.07808894664049149,
-0.07492388039827347,
0.06799733638763428,
0.016129979863762856,
-0.008073379285633564,
0.017281262204051018,
0.004060034640133381,
-0.006541089154779911,
-0.039490312337875366,
-0.00634561013430357,
0.07824411988258362,
0.1698983907699585,
0.028010204434394836,
-0.04902239516377449,
-0.052054960280656815,
0.1504824459552765,
-0.05582641437649727,
-0.05719664320349693,
-0.11079249531030655,
0.12721674144268036,
0.07041025161743164,
0.02966485545039177,
0.015157150104641914,
-0.08629963546991348,
-0.04442138597369194,
0.2120848447084427,
0.024970093742012978,
-0.01978624239563942,
-0.03124195896089077,
-0.0030807924922555685,
-0.0032204273156821728,
-0.04071764647960663,
0.14707453548908234,
0.007916304282844067,
0.19520603120326996,
-0.009007863700389862,
0.002337251091375947,
-0.039281703531742096,
-0.026776673272252083,
-0.03978325054049492,
0.1968614161014557,
-0.036160849034786224,
0.02619243785738945,
-0.08064869046211243,
-0.016409780830144882,
0.04932234808802605,
-0.11023090779781342,
0.12567225098609924,
-0.0779479593038559,
-0.06176622584462166,
0.03231098875403404,
0.07134447246789932,
-0.02280866913497448,
0.032738905400037766,
-0.03029366210103035,
0.04974665865302086,
0.07050222903490067,
-0.023586217314004898,
-0.10205129534006119,
-0.11119191348552704,
0.044820547103881836,
-0.06327603757381439,
0.1590055525302887,
0.02628474123775959,
0.09274828433990479,
0.07964137941598892,
0.01691531017422676,
-0.08256306499242783,
0.11493594199419022,
0.036687374114990234,
0.012246403843164444,
0.07912155985832214,
0.14227057993412018,
-0.03629244118928909,
0.14403237402439117,
-0.030480487272143364,
-0.03966187685728073,
-0.01250448264181614,
-0.028987908735871315,
0.004820543806999922,
-0.13246114552021027,
-0.0005564066814258695,
-0.06255390495061874,
0.131879061460495,
0.1713433414697647,
-0.04811814799904823,
-0.03056156449019909,
-0.03234171122312546,
0.07035607844591141,
-0.01885976269841194,
0.10309044271707535,
-0.006833321880549192,
-0.15963733196258545,
0.024665474891662598,
-0.019118687137961388,
0.03275704011321068,
-0.16022078692913055,
-0.04314849153161049,
-0.04620233178138733,
-0.04721102863550186,
-0.08443775773048401,
0.13795478641986847,
0.07515253126621246,
0.02311689220368862,
-0.039758726954460144,
-0.19324137270450592,
-0.02462863363325596,
0.04558185487985611,
-0.15213270485401154,
-0.12522272765636444
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_csharp_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/csharp/base_model.ipynb).
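Continuing from the snippet above, the pipeline forwards generation keyword arguments to the underlying model, so decoding can be tuned at call time. A small usage sketch (the parameter values below are illustrative assumptions, not tuned settings):
```python
# Beam search with a capped output length; values are illustrative only.
summaries = pipeline(
    [tokenized_code],
    max_length=32,  # upper bound on the generated summary length
    num_beams=4,    # beam search instead of greedy decoding
)
print(summaries[0]["summary_text"])
```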
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_csharp_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing csharp code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
62,
87,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09418269246816635,
0.057598013430833817,
-0.0012863409938290715,
0.10872101783752441,
0.05242162570357323,
0.01839829981327057,
0.06742025911808014,
0.08338817954063416,
-0.057268980890512466,
0.06417118012905121,
0.07211311161518097,
-0.03640841320157051,
0.07059560716152191,
0.1924547404050827,
0.027176011353731155,
-0.1741485893726349,
-0.016606388613581657,
0.031702492386102676,
-0.04812302440404892,
0.11165177822113037,
0.09913554042577744,
-0.09528782963752747,
0.06483669579029083,
-0.03737692907452583,
-0.12278777360916138,
0.05640888959169388,
-0.04581952840089798,
-0.04700149968266487,
0.09146282076835632,
0.061747368425130844,
0.13044780492782593,
-0.01828659139573574,
0.08864271640777588,
-0.2034403681755066,
0.0012755959760397673,
0.01896277628839016,
0.04070396348834038,
0.03227603808045387,
0.06680158525705338,
0.05978132411837578,
0.13428346812725067,
-0.03191506117582321,
0.03678351268172264,
0.04634323716163635,
-0.06263462454080582,
-0.046056509017944336,
-0.04751772806048393,
0.08284115046262741,
0.1293526291847229,
0.09105388820171356,
-0.01018283236771822,
0.0220839511603117,
-0.08613795042037964,
0.08391308784484863,
0.1306297332048416,
-0.23118624091148376,
-0.024657268077135086,
0.08556382358074188,
0.08558184653520584,
0.06739071011543274,
-0.07874476909637451,
-0.03726135939359665,
0.10499240458011627,
0.03589004650712013,
0.05978982523083687,
-0.08164101094007492,
-0.05286204442381859,
0.001004110206849873,
-0.06564044207334518,
-0.057871222496032715,
0.15881627798080444,
0.044700998812913895,
-0.05022431164979935,
-0.10092182457447052,
-0.04738028347492218,
-0.20052744448184967,
0.04883355647325516,
0.014623885974287987,
0.01741335727274418,
-0.01554279774427414,
0.030709199607372284,
0.0018470919458195567,
-0.11454838514328003,
-0.10911781340837479,
-0.002906227484345436,
0.06223747134208679,
0.06200173869729042,
0.026598790660500526,
-0.011129851453006268,
0.0881119817495346,
0.028563259169459343,
-0.04469003155827522,
-0.02335083857178688,
0.00988378468900919,
-0.12595334649085999,
0.017932165414094925,
-0.026787081733345985,
-0.07636643946170807,
-0.030246175825595856,
0.09068997204303741,
-0.047584667801856995,
0.06558826565742493,
0.13326305150985718,
0.011370064690709114,
0.0032777274027466774,
0.22697655856609344,
0.028842704370617867,
-0.11873669922351837,
0.0008458656375296414,
0.03265424072742462,
-0.013655335642397404,
-0.006070231087505817,
-0.06824076920747757,
-0.04327092692255974,
0.007536722347140312,
0.06822940707206726,
-0.1370079666376114,
0.016495637595653534,
-0.04904581978917122,
-0.01745821163058281,
0.054466888308525085,
-0.13754726946353912,
0.026444539427757263,
0.014200426638126373,
-0.05153924599289894,
-0.05079774931073189,
0.06881067901849747,
-0.11409391462802887,
-0.11428442597389221,
0.02787061594426632,
-0.052449434995651245,
-0.03283710405230522,
-0.12822453677654266,
-0.12110234051942825,
-0.01592404395341873,
-0.04949536174535751,
0.009120362810790539,
-0.107938751578331,
-0.08595916628837585,
-0.010824262164533138,
0.025984520092606544,
0.002539431443437934,
-0.004966252949088812,
-0.060513678938150406,
0.010925107635557652,
-0.0061185844242572784,
-0.0213518887758255,
0.01212479267269373,
-0.037371836602687836,
0.09060454368591309,
0.08126678317785263,
0.05227816849946976,
0.01335038524121046,
0.026248397305607796,
-0.06482616066932678,
0.06689181178808212,
-0.06445334851741791,
0.059185806661844254,
-0.02768106386065483,
0.04933995008468628,
-0.08736716210842133,
-0.08578289300203323,
0.04306398332118988,
0.04348282516002655,
0.05395497754216194,
0.02529257908463478,
-0.12897932529449463,
0.016911061480641365,
0.13449962437152863,
-0.09735456854104996,
-0.12319011986255646,
0.10119698196649551,
-0.006957605015486479,
0.0037094596773386,
0.05531780794262886,
0.11665277928113937,
0.14264808595180511,
-0.10087673366069794,
-0.04642840102314949,
0.08891568332910538,
0.05571083351969719,
-0.05802517384290695,
0.09808278828859329,
0.020945493131875992,
0.04213929548859596,
0.022299017757177353,
0.03232590854167938,
0.06141001358628273,
-0.01283298060297966,
-0.038572560995817184,
-0.018376657739281654,
-0.08860562741756439,
-0.03167055547237396,
-0.01943836733698845,
0.030889609828591347,
-0.057694390416145325,
-0.058871809393167496,
0.0060525513254106045,
0.16298113763332367,
-0.10486696660518646,
0.026319174095988274,
-0.0924585610628128,
-0.03872383013367653,
-0.09132415801286697,
0.0031426497735083103,
-0.10862591117620468,
0.015140030533075333,
0.05534999817609787,
-0.03794177994132042,
0.06345779448747635,
0.08237182348966599,
0.0003547201631590724,
0.03542377054691315,
-0.03840826824307442,
-0.04405798763036728,
-0.059832990169525146,
-0.05242016166448593,
-0.12458671629428864,
-0.013865306973457336,
-0.10485884547233582,
-0.026191970333456993,
-0.05440492182970047,
-0.16318954527378082,
0.012521944008767605,
-0.01248076930642128,
0.034338317811489105,
0.005008235108107328,
-0.017002593725919724,
0.020885689184069633,
0.052759528160095215,
-0.052520956844091415,
-0.0850881040096283,
0.003195294179022312,
0.019963141530752182,
-0.1261587142944336,
-0.03815695270895958,
-0.12864333391189575,
-0.052588608115911484,
0.07180290669202805,
0.0763477310538292,
-0.09100649505853653,
0.002037100261077285,
-0.03556408733129501,
-0.052270371466875076,
-0.031769055873155594,
-0.07479964196681976,
0.17286153137683868,
0.004820265807211399,
0.17203424870967865,
-0.13851019740104675,
-0.05211760476231575,
-0.03411451727151871,
-0.001536085968837142,
0.009042308665812016,
0.1472443789243698,
0.020727675408124924,
-0.0816713497042656,
0.03892962634563446,
0.008720621466636658,
-0.05006246268749237,
0.15536761283874512,
-0.006132613867521286,
-0.10548460483551025,
0.008931104093790054,
0.08903397619724274,
-0.011934641748666763,
0.15151016414165497,
-0.04327036440372467,
-0.008865119889378548,
-0.0002731657586991787,
0.017655104398727417,
0.04346274957060814,
-0.13800916075706482,
0.028273217380046844,
0.06703592091798782,
-0.061878643929958344,
-0.05780388042330742,
-0.04126026853919029,
-0.044720448553562164,
0.038409747183322906,
-0.0020786267705261707,
-0.00301058660261333,
-0.013422433286905289,
-0.029613593593239784,
-0.09505148231983185,
0.2037695199251175,
-0.09500246495008469,
-0.20926058292388916,
-0.18312034010887146,
0.07048384845256805,
-0.03533726930618286,
0.012579853646457195,
0.03742257133126259,
-0.09988059103488922,
-0.06904614716768265,
-0.10355187952518463,
0.1250695288181305,
-0.11737421900033951,
0.008774316869676113,
-0.003305440302938223,
0.03646935895085335,
0.024514179676771164,
-0.1777963489294052,
0.036326564848423004,
-0.003961534705013037,
-0.005747820250689983,
0.0001138915540650487,
-0.06694050878286362,
0.09641052037477493,
0.12956716120243073,
-0.08809062838554382,
0.012587056495249271,
-0.011105741374194622,
0.1730833798646927,
-0.05523600056767464,
0.028015034273266792,
0.17357060313224792,
0.009053578600287437,
0.03266856446862221,
0.03982382267713547,
0.009244947694242,
-0.09240899235010147,
0.06601867824792862,
0.03574134036898613,
-0.027231166139245033,
-0.24510711431503296,
-0.003302699886262417,
-0.06720784306526184,
0.04818153381347656,
0.11347617954015732,
0.0461588129401207,
-0.13373713195323944,
0.03324231505393982,
-0.008202980272471905,
0.1428695023059845,
-0.03799783065915108,
0.05383836477994919,
-0.0024583220947533846,
0.008328092284500599,
0.007500647101551294,
-0.09304242581129074,
0.002773155691102147,
0.07294796407222748,
0.10122378170490265,
0.2151034027338028,
-0.05638981610536575,
0.21298891305923462,
0.009575845673680305,
0.08174905925989151,
0.031160129234194756,
0.1265566200017929,
-0.1055726632475853,
0.001982158748432994,
0.006888681091368198,
-0.010013126768171787,
-0.07985005527734756,
0.05796944350004196,
-0.004966385196894407,
0.07156838476657867,
-0.07655956596136093,
0.03267763927578926,
0.014863693155348301,
0.1802385151386261,
0.04934117570519447,
-0.18573828041553497,
-0.11235256493091583,
0.012323124334216118,
-0.1005445346236229,
-0.1059633269906044,
0.06334184110164642,
0.21394017338752747,
-0.045230790972709656,
-0.008620405569672585,
-0.004303709138184786,
0.130828857421875,
-0.069276824593544,
-0.030378557741642,
0.01467202790081501,
0.054563116282224655,
0.008341644890606403,
0.12767070531845093,
-0.25084707140922546,
0.09508480876684189,
0.01757015474140644,
0.0917631983757019,
-0.026871202513575554,
0.06622552871704102,
-0.04995686188340187,
0.005459217354655266,
0.07509171962738037,
0.0018759530503302813,
-0.09679848700761795,
-0.21739374101161957,
-0.050893720239400864,
0.021279996261000633,
0.07827463001012802,
-0.021945586428046227,
0.08952675014734268,
-0.011734111234545708,
0.0567488856613636,
-0.018325619399547577,
-0.12926527857780457,
-0.06512873619794846,
-0.13938036561012268,
-0.008935470134019852,
0.004100384656339884,
-0.006973118521273136,
-0.039928603917360306,
0.019438384100794792,
-0.0076225814409554005,
0.20255091786384583,
-0.18580558896064758,
-0.09145911037921906,
-0.09254425764083862,
0.061111997812986374,
0.13716329634189606,
-0.09213001281023026,
0.03027959167957306,
0.027670521289110184,
0.06093462184071541,
-0.0349528007209301,
-0.06296548992395401,
0.02336622215807438,
-0.055793069303035736,
-0.08611266314983368,
-0.027012135833501816,
0.09858312457799911,
-0.009312739595770836,
0.04656153544783592,
0.005012721288949251,
-0.08477671444416046,
-0.048746258020401,
-0.13172218203544617,
-0.08401776105165482,
-0.015065893530845642,
0.02747061662375927,
0.012198311276733875,
-0.10077416896820068,
0.07477013021707535,
-0.014841025695204735,
-0.08709722757339478,
0.07512795925140381,
0.19324557483196259,
-0.07017562538385391,
0.017908087000250816,
0.09774403274059296,
-0.06109556183218956,
-0.15993312001228333,
-0.06151796132326126,
0.050508107990026474,
0.09628842025995255,
-0.019442472606897354,
-0.13701443374156952,
0.06926146149635315,
0.026949919760227203,
0.03567182272672653,
0.01383216492831707,
-0.2865130305290222,
-0.13396058976650238,
0.044671833515167236,
0.07499336451292038,
0.025768151506781578,
-0.10684619098901749,
-0.040784452110528946,
-0.060207873582839966,
-0.09079543501138687,
0.0416792668402195,
0.07274848967790604,
0.13413475453853607,
-0.03553994372487068,
0.026415031403303146,
0.027125203981995583,
-0.028101177886128426,
0.10579807311296463,
0.006841929629445076,
0.1034354493021965,
-0.018156135454773903,
0.017498528584837914,
0.08871864527463913,
-0.05571923404932022,
0.14880511164665222,
-0.15780629217624664,
0.09063653647899628,
-0.21694904565811157,
-0.056758441030979156,
-0.011918997392058372,
-0.006274611223489046,
-0.04319830611348152,
-0.055629879236221313,
-0.10437020659446716,
0.004625920671969652,
0.05458945408463478,
-0.02356763370335102,
0.04536697641015053,
-0.030623044818639755,
-0.06332145631313324,
0.05879881605505943,
0.10025635361671448,
-0.027072757482528687,
-0.10986799746751785,
0.017501136288046837,
0.030136460438370705,
0.08912058174610138,
-0.17404431104660034,
0.024147983640432358,
0.13487212359905243,
0.01065308228135109,
0.10807323455810547,
0.02596537210047245,
-0.0657651349902153,
0.043197497725486755,
0.0728212371468544,
-0.04491270333528519,
-0.09222277253866196,
-0.010562930256128311,
-0.025921430438756943,
-0.09266982227563858,
0.03261738270521164,
0.0872250497341156,
-0.05419602245092392,
-0.019829239696264267,
-0.016261395066976547,
-0.010662862099707127,
-0.07086653262376785,
0.19071950018405914,
0.027483798563480377,
0.0827856957912445,
-0.04845040664076805,
0.08457187563180923,
0.09459429234266281,
-0.11387482285499573,
0.005573217757046223,
0.14205071330070496,
-0.07895507663488388,
-0.02562410943210125,
0.06852594763040543,
0.08680399507284164,
-0.05822596326470375,
-0.0571734681725502,
-0.10259086638689041,
-0.07302925735712051,
0.01760764792561531,
0.04703223705291748,
0.0644037276506424,
0.0860111191868782,
-0.03143622353672981,
0.02331910841166973,
-0.0998760461807251,
0.09225814789533615,
0.06651332229375839,
0.048916611820459366,
-0.1352946162223816,
0.17699094116687775,
0.02071053721010685,
0.08861846476793289,
-0.004007413983345032,
0.040311768651008606,
-0.08475872129201889,
0.04339735582470894,
-0.015072474256157875,
0.03627213090658188,
-0.00891709141433239,
0.04106053709983826,
-0.015787135809659958,
0.030820028856396675,
-0.026497412472963333,
0.055271536111831665,
-0.04419201612472534,
-0.025652628391981125,
-0.027525510638952255,
0.03771116957068443,
-0.05516757071018219,
-0.022108331322669983,
0.015835441648960114,
-0.08014295995235443,
0.10059642046689987,
-0.05942916125059128,
-0.007839389145374298,
0.0031242682598531246,
-0.03274659067392349,
0.07546698302030563,
0.01201525516808033,
0.05731501802802086,
-0.016625696793198586,
0.008288919925689697,
0.04106666520237923,
0.022600257769227028,
-0.015543434768915176,
-0.011850819922983646,
0.03586636856198311,
-0.14118503034114838,
-0.09722509980201721,
-0.09411261230707169,
-0.05809745192527771,
-0.07686645537614822,
0.08123146742582321,
0.08607561141252518,
0.06437025964260101,
0.08618658781051636,
-0.028463006019592285,
0.00419976282864809,
-0.13236983120441437,
-0.03387372940778732,
0.05513027682900429,
-0.006455840542912483,
-0.09128668159246445,
-0.06258920580148697,
0.058732401579618454,
-0.04732300341129303,
0.12079110741615295,
-0.025129172950983047,
0.045949045568704605,
-0.01290106400847435,
-0.05554641783237457,
-0.0011435456108301878,
0.017780568450689316,
0.23537057638168335,
-0.0940627008676529,
0.0130767235532403,
-0.0001934361644089222,
-0.0028784696478396654,
0.03791118413209915,
0.15096253156661987,
0.0963326171040535,
0.12679080665111542,
0.03566423058509827,
0.10671652853488922,
-0.051348891109228134,
-0.04103589430451393,
-0.1509605348110199,
0.03537018969655037,
-0.0005825982661917806,
0.037484098225831985,
-0.029887448996305466,
0.0978274941444397,
0.12301912158727646,
-0.12409122288227081,
0.08646505326032639,
0.018462564796209335,
-0.10129650682210922,
-0.038517244160175323,
-0.04552832990884781,
-0.04562893509864807,
-0.086546890437603,
0.022017015144228935,
-0.11218655854463577,
0.017637034878134727,
0.07254260033369064,
0.03818615525960922,
-0.023503685370087624,
0.16110189259052277,
-0.030394159257411957,
-0.05300462618470192,
0.02365860715508461,
0.023841481655836105,
0.045224908739328384,
0.08183744549751282,
0.0017363284714519978,
0.07315364480018616,
-0.07223678380250931,
0.07031460106372833,
0.008298493921756744,
0.010120461694896221,
0.007407506462186575,
0.005565659608691931,
-0.0012416623067110777,
-0.04140964895486832,
-0.00812092237174511,
0.07003471255302429,
0.16436459124088287,
0.03253702074289322,
-0.043932896107435226,
-0.0512729212641716,
0.17163729667663574,
-0.057988930493593216,
-0.06773641705513,
-0.1218930333852768,
0.1362149566411972,
0.0619225800037384,
0.03216063976287842,
0.01187776867300272,
-0.08737465739250183,
-0.03974639251828194,
0.22134552896022797,
0.017794830724596977,
-0.034005891531705856,
-0.03217005729675293,
-0.009602821432054043,
-0.003695644671097398,
-0.03827818110585213,
0.1372823417186737,
0.017955753952264786,
0.1979818493127823,
-0.003012015949934721,
-0.011114914901554585,
-0.042719196528196335,
-0.02586274966597557,
-0.03349411115050316,
0.18866078555583954,
-0.03877904638648033,
0.027766326442360878,
-0.08849631994962692,
-0.02315177023410797,
0.048145271837711334,
-0.11948353052139282,
0.1169222965836525,
-0.09492289274930954,
-0.05991072952747345,
0.02986309863626957,
0.0753408893942833,
-0.0321706086397171,
0.032808009535074234,
-0.02416943572461605,
0.0501546785235405,
0.06633702665567398,
-0.02266727201640606,
-0.0968666821718216,
-0.10693731904029846,
0.04501164332032204,
-0.058640073984861374,
0.1575106531381607,
0.020799942314624786,
0.08919886499643326,
0.08555334806442261,
0.02517143078148365,
-0.0724753588438034,
0.1107146143913269,
0.03977714106440544,
0.010683760046958923,
0.07692719995975494,
0.13310720026493073,
-0.03529077395796776,
0.15599793195724487,
-0.013367992825806141,
-0.04464036971330643,
-0.01470552571117878,
-0.020385107025504112,
-0.005833152681589127,
-0.13655753433704376,
-0.0009624514495953918,
-0.05982653796672821,
0.1360355168581009,
0.17562973499298096,
-0.04561583325266838,
-0.026765668764710426,
-0.030389979481697083,
0.06811293959617615,
-0.017789648845791817,
0.10176470875740051,
-0.002474573440849781,
-0.1569414883852005,
0.016114911064505577,
-0.021805161610245705,
0.017220841720700264,
-0.16759365797042847,
-0.04893725737929344,
-0.042156897485256195,
-0.0484750010073185,
-0.07346107810735703,
0.1392500251531601,
0.08088431507349014,
0.03112974390387535,
-0.03976529464125633,
-0.18207907676696777,
-0.021929878741502762,
0.04945503920316696,
-0.14728431403636932,
-0.12050959467887878
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization python dataset.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
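Tokenized input of this kind can be produced in several ways; below is a minimal sketch using Python's standard `tokenize` module. The exact tokenizer used to build the training data is not documented here, so treat this purely as an illustration of the expected space-separated token format.
```python
import io
import tokenize

def space_tokenize(source: str) -> str:
    """Join Python tokens with single spaces, dropping layout tokens."""
    skip = {tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
            tokenize.DEDENT, tokenize.ENDMARKER, tokenize.COMMENT}
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return " ".join(tok.string for tok in tokens if tok.type not in skip)

print(space_tokenize("def add(a, b):\n    return a + b"))
# -> def add ( a , b ) : return a + b
```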
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python", skip_special_tokens=True),
device=0  # run on GPU 0; use device=-1 for CPU-only inference
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
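The pipeline returns one dictionary per input, each carrying the generated description under the `summary_text` key.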
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
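Scores of this kind can be reproduced with any corpus-level BLEU implementation; below is a minimal sketch using the `sacrebleu` package. The prediction and reference strings are illustrative assumptions, and the exact BLEU configuration behind the reported numbers may differ.
```python
# Hypothetical scoring sketch (pip install sacrebleu); not necessarily the
# exact BLEU setup used for the table above.
import sacrebleu

predictions = ["opens two files and copies the marked lines"]
references = [["open two files and copy the marked lines"]]  # one reference stream
print(sacrebleu.corpus_bleu(predictions, references).score)
```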
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_python
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization python dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
115
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08860733360052109,
0.017160693183541298,
-0.0006001046858727932,
0.04349924623966217,
0.16086836159229279,
0.020471887663006783,
0.08898057788610458,
0.05220251902937889,
-0.020472129806876183,
-0.04874681308865547,
0.0795515775680542,
0.15029504895210266,
0.028697984293103218,
0.15368743240833282,
-0.018860433250665665,
-0.22326116263866425,
-0.0019136077025905252,
0.058567918837070465,
-0.17546124756336212,
0.1351144164800644,
0.10726488381624222,
-0.03355778381228447,
0.0886555165052414,
0.005819724407047033,
-0.21633127331733704,
0.06482136249542236,
0.008466817438602448,
-0.08171592652797699,
0.12604518234729767,
0.07849413901567459,
0.13425880670547485,
0.006683136336505413,
0.01804765872657299,
-0.216693714261055,
0.03209909796714783,
-0.03307691588997841,
-0.007675210479646921,
0.03920748457312584,
0.0371880866587162,
-0.08023171871900558,
0.21515198051929474,
-0.016038790345191956,
0.05990903452038765,
0.05447465181350708,
-0.11270024627447128,
-0.11817799508571625,
-0.02072261832654476,
0.0073150829412043095,
0.07860671728849411,
0.08722208440303802,
0.013742159120738506,
0.1410273313522339,
-0.12554284930229187,
0.12932732701301575,
0.083123117685318,
-0.1895856410264969,
-0.017546553164720535,
0.11008241772651672,
0.06202330067753792,
-0.07383744418621063,
-0.034736569970846176,
0.0017029652372002602,
0.05661345273256302,
0.012250999920070171,
0.017544329166412354,
-0.1223830133676529,
-0.13123267889022827,
0.030336715281009674,
-0.0941266268491745,
-0.07532951980829239,
0.2824760377407074,
-0.018633387982845306,
-0.059198979288339615,
-0.03255108743906021,
-0.03956032544374466,
0.03380630537867546,
-0.023355377838015556,
0.03897177800536156,
-0.010274946689605713,
-0.011012970469892025,
-0.050432488322257996,
0.0015826828312128782,
-0.08938825875520706,
-0.09812258183956146,
-0.008633090183138847,
0.14012575149536133,
0.0107431560754776,
0.03126245737075806,
-0.16082540154457092,
0.10232760012149811,
0.07362513244152069,
-0.05751664191484451,
0.030702704563736916,
-0.05743110552430153,
-0.05579058825969696,
-0.022270534187555313,
-0.05157099664211273,
-0.1584625393152237,
0.06943129003047943,
0.13007105886936188,
-0.025406375527381897,
0.05964367091655731,
0.025422757491469383,
0.05015762150287628,
0.059790078550577164,
0.18777750432491302,
-0.00693871034309268,
-0.04548337310552597,
0.05316467955708504,
-0.04250086471438408,
-0.04662277549505234,
0.006334397941827774,
-0.06746023893356323,
-0.0289921872317791,
0.01243736781179905,
0.1249786987900734,
-0.07227800041437149,
0.09438866376876831,
-0.0653599351644516,
-0.04718368500471115,
-0.05966254696249962,
-0.13073180615901947,
-0.02098878100514412,
0.013165566138923168,
-0.06238408014178276,
0.022755172103643417,
0.12839080393314362,
-0.07794041186571121,
-0.10376103967428207,
0.024116959422826767,
-0.07959560304880142,
-0.004284413997083902,
-0.08679584413766861,
-0.10571932047605515,
0.007392845582216978,
0.07449834048748016,
0.0757002905011177,
-0.1310863345861435,
-0.15134213864803314,
0.015720106661319733,
0.0924772173166275,
0.023449905216693878,
0.025925206020474434,
-0.07852624356746674,
-0.00535770645365119,
-0.02133731171488762,
-0.008854633197188377,
0.0019050247501581907,
-0.07941143959760666,
0.09706663340330124,
0.07188963145017624,
0.03705019876360893,
-0.09157878905534744,
0.04741793870925903,
-0.12035613507032394,
0.06442774832248688,
-0.1458013504743576,
0.07575094699859619,
-0.06460627168416977,
0.14339877665042877,
-0.1130077913403511,
-0.07184705883264542,
0.0561845488846302,
0.049657706171274185,
0.05502849817276001,
0.12903976440429688,
-0.09477569907903671,
-0.06839010119438171,
0.1583215594291687,
-0.11593794077634811,
-0.19265897572040558,
0.0784149318933487,
-0.0709158256649971,
0.19416972994804382,
0.0805300772190094,
0.16170109808444977,
0.1514970064163208,
-0.08458569645881653,
0.04867082089185715,
0.08386710286140442,
-0.03652823343873024,
-0.036985062062740326,
0.06791453808546066,
0.05327065661549568,
-0.17166964709758759,
0.04885312169790268,
-0.008832509629428387,
0.1464223563671112,
-0.03925495222210884,
-0.04093072563409805,
-0.006643296219408512,
-0.05381792411208153,
0.0691559910774231,
-0.011064846999943256,
0.08444885909557343,
0.008058233186602592,
-0.03708968684077263,
0.10004451870918274,
0.13638536632061005,
-0.12526825070381165,
-0.0064178104512393475,
-0.1121286079287529,
0.0894700363278389,
-0.09734044224023819,
0.025917084887623787,
-0.19431473314762115,
-0.046134404838085175,
-0.015173333697021008,
0.028875162824988365,
0.0682443305850029,
-0.006542902905493975,
0.006266770884394646,
-0.0009116624132730067,
0.013765418902039528,
0.004295320250093937,
0.004424958024173975,
-0.020287955179810524,
-0.03353036940097809,
-0.08435585349798203,
-0.05483469367027283,
-0.040497247129678726,
0.07663778960704803,
-0.17543925344944,
0.004053147044032812,
0.04662472754716873,
0.06030837446451187,
0.0154345091432333,
0.02114376425743103,
0.03650931641459465,
0.05976090952754021,
-0.04915379732847214,
-0.016418440267443657,
0.05283936485648155,
0.007030125707387924,
-0.12435093522071838,
0.0072959293611347675,
-0.08435618877410889,
0.05037618800997734,
0.1294691115617752,
-0.14035363495349884,
-0.07206306606531143,
-0.030369795858860016,
-0.020620523020625114,
-0.017418358474969864,
0.015024960041046143,
-0.026672478765249252,
0.19297149777412415,
0.004619013983756304,
0.17782364785671234,
-0.0875970870256424,
-0.023878654465079308,
-0.035892002284526825,
-0.01992938667535782,
0.038965240120887756,
0.12645477056503296,
0.0896444171667099,
-0.19380724430084229,
0.057382069528102875,
0.06320872157812119,
-0.021837064996361732,
0.1733696311712265,
-0.0541255958378315,
-0.04692446067929268,
-0.004577385261654854,
0.0754409208893776,
-0.010067467577755451,
0.14957746863365173,
-0.1826048493385315,
-0.019501011818647385,
0.01575937494635582,
-0.02278163470327854,
0.09701230376958847,
-0.11409316956996918,
-0.003626129124313593,
0.03742143139243126,
-0.02127113565802574,
-0.17830781638622284,
0.035465266555547714,
0.020124640315771103,
0.03074602037668228,
0.0026577613316476345,
-0.01352281030267477,
0.02794647589325905,
-0.021136874333024025,
-0.12572194635868073,
0.2491578757762909,
-0.08376973122358322,
-0.2632632255554199,
-0.1874053031206131,
-0.021319737657904625,
-0.00724594434723258,
-0.026840712875127792,
0.04753633216023445,
-0.05369872972369194,
-0.053751517087221146,
-0.006508091930299997,
0.18552179634571075,
-0.07967925816774368,
-0.028225228190422058,
-0.05319402739405632,
0.059541091322898865,
0.007007195148617029,
-0.1842992901802063,
0.005940225441008806,
0.012327049858868122,
0.02335699088871479,
0.02222287282347679,
-0.1712690144777298,
0.09472005069255829,
0.10449052602052689,
-0.06062930077314377,
0.03250214084982872,
-0.04849730804562569,
0.25012338161468506,
-0.07301994413137436,
-0.10496613383293152,
0.12341627478599548,
-0.11269205063581467,
0.011493582278490067,
0.027205221354961395,
0.011213591322302818,
-0.1225871816277504,
0.037817928940057755,
-0.03910817578434944,
-0.06773950904607773,
-0.24173538386821747,
-0.11996512860059738,
-0.085337795317173,
0.10551639646291733,
0.07826950401067734,
0.023556429892778397,
-0.08220475167036057,
0.07234030961990356,
0.05987526848912239,
0.12845364212989807,
-0.003772375173866749,
0.0844821035861969,
0.08061183243989944,
0.015598076395690441,
0.002655432326719165,
-0.10467396676540375,
-0.05649431422352791,
0.033910930156707764,
0.0759735181927681,
0.19486801326274872,
0.017399916425347328,
0.14504337310791016,
0.04781779646873474,
0.0448528416454792,
0.04021444171667099,
0.1881457418203354,
-0.09904874116182327,
0.027088135480880737,
0.004270048812031746,
-0.024826165288686752,
-0.13679499924182892,
0.019532954320311546,
-0.013281641528010368,
0.0327470600605011,
-0.14675423502922058,
-0.07796474546194077,
0.0452897772192955,
0.09102129191160202,
0.012632477097213268,
-0.2618206739425659,
-0.121847003698349,
0.027405355125665665,
-0.1012464314699173,
-0.06588808447122574,
0.047864459455013275,
0.05661370977759361,
-0.1505529135465622,
0.03073257952928543,
-0.06570316106081009,
0.16508564352989197,
-0.0706694945693016,
0.005766936112195253,
-0.07534697651863098,
-0.05815184488892555,
-0.0035223239101469517,
0.14366312325000763,
-0.17939773201942444,
0.22710204124450684,
0.0024475974496454,
0.01675938256084919,
-0.05163611099123955,
0.0336214080452919,
0.00799607764929533,
0.09508217126131058,
0.09882908314466476,
-0.020261544734239578,
-0.023806411772966385,
-0.1755320280790329,
0.004512449260801077,
0.08648830652236938,
0.05450724810361862,
-0.024127047508955002,
0.08684056252241135,
-0.04377390444278717,
0.03425481542944908,
-0.01426686067134142,
-0.10073710232973099,
-0.0513342022895813,
-0.1247512698173523,
-0.011526756919920444,
-0.08892286568880081,
0.07278735190629959,
-0.022165371105074883,
0.010188303887844086,
0.0831073448061943,
0.18051114678382874,
-0.06312538683414459,
-0.08290383219718933,
-0.11326812952756882,
0.06050252541899681,
0.11777324974536896,
-0.08118961751461029,
0.04951071739196777,
-0.007045508828014135,
0.00813821330666542,
-0.0011113190557807684,
-0.17356427013874054,
0.06602536141872406,
-0.059750065207481384,
0.026008568704128265,
-0.01330669317394495,
0.10638748109340668,
-0.01514535490423441,
0.013344939798116684,
0.05369960889220238,
-0.05262547731399536,
-0.06104636192321777,
-0.12308359146118164,
-0.1400429755449295,
-0.05781443044543266,
0.012929280288517475,
0.09109502285718918,
-0.11951463669538498,
0.008990220725536346,
-0.0377374030649662,
0.007569184992462397,
0.2442476749420166,
0.11543837189674377,
-0.032934751361608505,
0.015565309673547745,
0.07849358767271042,
-0.09029877930879593,
-0.24147500097751617,
-0.006027544848620892,
-0.020900998264551163,
0.06906525045633316,
0.013295403681695461,
-0.1768987476825714,
0.10133776813745499,
-0.0061058844439685345,
0.04760036617517471,
0.044040363281965256,
-0.28890058398246765,
-0.09597942233085632,
0.1322752982378006,
0.1284269243478775,
0.09071438759565353,
-0.12671618163585663,
-0.03869784623384476,
-0.0839013159275055,
-0.18745283782482147,
0.16773207485675812,
-0.09316420555114746,
0.0995231494307518,
-0.0010118476347997785,
0.10928241908550262,
0.03057202510535717,
-0.03671339899301529,
0.1119920089840889,
0.0005518021062016487,
0.06948105990886688,
-0.016006460413336754,
-0.09090334177017212,
0.0968228280544281,
-0.019914543256163597,
0.13549986481666565,
-0.09641063958406448,
0.07187813520431519,
-0.25892916321754456,
-0.03866172581911087,
-0.022274304181337357,
0.04808279499411583,
-0.006936606485396624,
-0.05985216051340103,
-0.07422732561826706,
0.0006161804194562137,
0.043333832174539566,
0.015103396959602833,
0.11387009173631668,
-0.0314040333032608,
0.034626513719558716,
0.05472413823008537,
0.13754375278949738,
0.004887157119810581,
-0.1016491949558258,
0.048240143805742264,
0.01841137371957302,
0.1027987077832222,
-0.24218516051769257,
0.07421375066041946,
0.12528111040592194,
0.050972938537597656,
0.10566013306379318,
0.08031799644231796,
-0.03306876868009567,
0.04778425768017769,
0.09015640616416931,
-0.12023556977510452,
-0.09109016507863998,
-0.03429708629846573,
-0.04842875525355339,
-0.022041091695427895,
0.06023596599698067,
0.14939413964748383,
-0.05578511953353882,
-0.022857677191495895,
0.004581274464726448,
-0.035670481622219086,
-0.14026644825935364,
0.13850167393684387,
0.034027792513370514,
0.06344247609376907,
-0.09904337674379349,
0.05818779766559601,
0.05430431291460991,
-0.12186232954263687,
-0.03194190189242363,
0.11109478026628494,
-0.13122810423374176,
-0.0712808147072792,
-0.021015189588069916,
0.20672917366027832,
-0.1298903375864029,
-0.05939750000834465,
-0.10993314534425735,
-0.06593629717826843,
-0.013598504476249218,
0.2245892435312271,
0.10417208820581436,
0.08341331779956818,
-0.05393289029598236,
-0.011439240537583828,
-0.11303990334272385,
0.03956504538655281,
0.10839527100324631,
0.010565546341240406,
-0.09965001791715622,
0.103291355073452,
0.010540535673499107,
0.1300683617591858,
-0.055073611438274384,
-0.04934564232826233,
-0.19019827246665955,
0.08822666853666306,
-0.12401113659143448,
0.05795431509613991,
-0.06799498945474625,
0.03683539479970932,
0.020310508087277412,
0.008551478385925293,
-0.03148127347230911,
0.03134550154209137,
-0.10205160081386566,
0.012378858402371407,
0.01258348673582077,
0.04073790833353996,
-0.06195206940174103,
0.006533414591103792,
0.08948346972465515,
-0.06827909499406815,
0.09641249477863312,
0.067596934735775,
-0.04148447886109352,
0.13509239256381989,
-0.17320875823497772,
-0.045531321316957474,
0.04620177671313286,
0.019546983763575554,
0.05175615847110748,
-0.04934793710708618,
0.044270798563957214,
0.006312513258308172,
0.05011676996946335,
0.010588791221380234,
0.10599978268146515,
-0.12965014576911926,
-0.10282568633556366,
-0.04174760356545448,
-0.10291401296854019,
-0.03103168122470379,
0.01703834906220436,
0.006261806003749371,
0.08486995846033096,
0.12143983691930771,
-0.029914304614067078,
0.04621277004480362,
-0.06165456026792526,
-0.026887070387601852,
0.018320202827453613,
-0.0783783569931984,
-0.03651583194732666,
-0.0900096744298935,
0.02068103663623333,
-0.04926157370209694,
0.1784042865037918,
-0.018963752314448357,
0.13737033307552338,
-0.006873725447803736,
-0.017065400257706642,
0.06523091346025467,
0.04243475943803787,
0.26935556530952454,
0.0004803294432349503,
0.053976014256477356,
-0.061314091086387634,
0.06323787569999695,
0.030852988362312317,
0.06958266347646713,
0.07888567447662354,
0.09836981445550919,
-0.06759703904390335,
0.09070544689893723,
0.011735904030501842,
0.052228350192308426,
-0.06338207423686981,
-0.13303619623184204,
0.07450558245182037,
0.05836762115359306,
-0.023491835221648216,
0.12429329752922058,
0.12008589506149292,
-0.07156959176063538,
0.0875006914138794,
0.026261623948812485,
-0.10057538002729416,
-0.06023230776190758,
0.010142071172595024,
-0.03854146599769592,
-0.14936932921409607,
0.017069857567548752,
-0.11235924810171127,
-0.06877784430980682,
0.09101696312427521,
0.040867798030376434,
-0.04140205308794975,
0.19975696504116058,
-0.0035017330665141344,
-0.09362971782684326,
0.044159941375255585,
-0.011056154035031796,
-0.008124743588268757,
-0.004621696658432484,
0.06805405020713806,
-0.02507658302783966,
-0.031224019825458527,
0.016604207456111908,
0.04407299682497978,
-0.06024627760052681,
0.029014304280281067,
-0.0834360420703888,
-0.032170966267585754,
-0.04420490562915802,
0.06488708406686783,
-0.0137086883187294,
0.06320251524448395,
0.02043900080025196,
-0.03814668953418732,
-0.02997918613255024,
0.21199791133403778,
-0.042288873344659805,
-0.08232685178518295,
-0.15420730412006378,
0.2586973309516907,
0.012401950545608997,
0.05367332696914673,
0.0040296344086527824,
-0.06351668387651443,
-0.03483392670750618,
0.2828465700149536,
0.20360347628593445,
-0.02398342825472355,
0.003981270361691713,
-0.0021211616694927216,
0.018996773287653923,
0.005739368498325348,
0.14716868102550507,
0.032085321843624115,
0.22922013700008392,
-0.03202621266245842,
-0.07951317727565765,
-0.049493011087179184,
-0.046059370040893555,
0.004931811708956957,
0.09212339669466019,
0.03104843571782112,
-0.05976893752813339,
-0.05005800724029541,
0.12649843096733093,
-0.17238354682922363,
-0.09293278306722641,
0.019724639132618904,
-0.1341848224401474,
-0.0879984200000763,
-0.07960796356201172,
0.0004695173993241042,
-0.00675207981839776,
0.05912265181541443,
-0.04915475472807884,
-0.030734507367014885,
0.03470689430832863,
0.038533903658390045,
-0.17329087853431702,
-0.12782210111618042,
0.0919046401977539,
-0.02243560552597046,
0.1275320053100586,
-0.018699005246162415,
0.09960302710533142,
0.09278883785009384,
0.03922632709145546,
-0.013473722152411938,
0.001996937207877636,
0.07286128401756287,
0.019155075773596764,
0.06017227843403816,
0.06770287454128265,
-0.03759608045220375,
0.15057027339935303,
-0.052972469478845596,
-0.09899723529815674,
0.03345179557800293,
-0.017214007675647736,
0.015420802868902683,
-0.09657347202301025,
-0.0441599078476429,
-0.09929464757442474,
0.08386899530887604,
0.1964253932237625,
-0.05659019574522972,
0.02232116274535656,
-0.08100966364145279,
0.10818936675786972,
0.03005637787282467,
-0.03967338800430298,
-0.08678079396486282,
-0.15878920257091522,
-0.05021089315414429,
-0.01930675283074379,
-0.038411352783441544,
-0.23377539217472076,
0.008615624159574509,
-0.06250525265932083,
0.007294429931789637,
-0.05177873373031616,
0.14214551448822021,
0.11666188389062881,
0.02086796797811985,
-0.030802303925156593,
-0.1490253359079361,
-0.020312242209911346,
0.07490073889493942,
-0.11771131306886673,
-0.1360330432653427
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
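As a reference for what that schedule looks like, here is a minimal sketch of an inverse square root learning rate; the warmup length and peak value below are illustrative assumptions, not values taken from this card.
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Assumed warmup: hold at 1/sqrt(warmup_steps), then decay as 1/sqrt(step).
    return 1.0 / max(step, warmup_steps) ** 0.5
```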
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_python_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
146
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12630237638950348,
0.013945790939033031,
-0.0011800321517512202,
0.13230131566524506,
0.1183934137225151,
0.015809549018740654,
0.061605554074048996,
0.05264705419540405,
-0.04278646036982536,
0.020895998924970627,
0.04762960970401764,
0.02429259940981865,
0.04583820328116417,
0.1989128142595291,
0.021388281136751175,
-0.167968288064003,
-0.01840210147202015,
0.024809278547763824,
-0.08206938952207565,
0.12754173576831818,
0.09265416860580444,
-0.06389227509498596,
0.04680930823087692,
-0.045337848365306854,
-0.21089425683021545,
0.05923892557621002,
0.0036885193549096584,
-0.06212279945611954,
0.1046023890376091,
0.05116693675518036,
0.13737285137176514,
-0.023348551243543625,
0.038782112300395966,
-0.12843896448612213,
0.005747179035097361,
0.027340712025761604,
0.039312694221735,
0.01965276710689068,
0.06824447214603424,
0.029096189886331558,
0.16737399995326996,
-0.006788221187889576,
0.056001827120780945,
0.058914218097925186,
-0.06870158761739731,
-0.1239975094795227,
-0.02060619555413723,
0.025444449856877327,
0.05356736108660698,
0.10089746117591858,
-0.013783401809632778,
0.1131686195731163,
-0.13995462656021118,
0.12755078077316284,
0.07711349427700043,
-0.2434522658586502,
-0.012746939435601234,
0.08809095621109009,
0.05047992244362831,
0.062134113162755966,
-0.03882763534784317,
-0.06129439175128937,
0.08301261067390442,
0.05943499878048897,
0.03963528573513031,
-0.08339450508356094,
-0.08246784657239914,
0.010805051773786545,
-0.10302532464265823,
-0.07808712869882584,
0.22181302309036255,
-0.0022522551007568836,
-0.08558207750320435,
-0.04942653328180313,
-0.044880688190460205,
-0.11371350288391113,
0.03135828673839569,
0.053014468401670456,
-0.004176085814833641,
-0.01975400559604168,
-0.0024705345276743174,
0.040773905813694,
-0.08627285063266754,
-0.12796077132225037,
0.016773121431469917,
0.09709006547927856,
0.05772228538990021,
0.037223391234874725,
-0.07538091391324997,
0.11627880483865738,
0.027450084686279297,
-0.04052411764860153,
-0.004744403064250946,
-0.02207348868250847,
-0.11315227299928665,
0.03441484272480011,
-0.04876857250928879,
-0.18102140724658966,
-0.011154284700751305,
0.0222338680177927,
-0.03171267732977867,
0.05754540488123894,
0.03916003555059433,
0.020774642005562782,
0.037121936678886414,
0.18184277415275574,
0.026790454983711243,
-0.09943459928035736,
0.05425654351711273,
0.03998145833611488,
-0.039387382566928864,
-0.014003717340528965,
-0.050112925469875336,
-0.07177789509296417,
0.06717293709516525,
0.09891840815544128,
-0.11895045638084412,
0.049822788685560226,
-0.06423843652009964,
-0.04586886242032051,
-0.04518846422433853,
-0.166238471865654,
-0.0018033473752439022,
0.018561972305178642,
-0.0675368532538414,
-0.023080511018633842,
0.0973312258720398,
-0.16320252418518066,
-0.1548786461353302,
-0.009483670815825462,
-0.08432404696941376,
-0.043452564626932144,
-0.15389886498451233,
-0.15283741056919098,
-0.017948538064956665,
-0.02155126817524433,
0.010403362102806568,
-0.07711409777402878,
-0.16016459465026855,
-0.00910711195319891,
0.035475414246320724,
0.00935984868556261,
-0.006749791093170643,
-0.07258845865726471,
-0.004530640318989754,
-0.013920361176133156,
-0.030266225337982178,
0.0017310967668890953,
-0.0489245280623436,
0.12320540845394135,
0.10247252881526947,
0.028977874666452408,
-0.027560008689761162,
0.04906883090734482,
-0.07974816113710403,
0.06904009729623795,
-0.0985746756196022,
0.09969791024923325,
-0.0687413141131401,
0.08633026480674744,
-0.04518486559391022,
-0.11092016100883484,
0.0485568642616272,
0.05066730082035065,
0.06769424676895142,
0.05405528098344803,
-0.1488618552684784,
-0.0310454573482275,
0.1950415074825287,
-0.12477138638496399,
-0.1014900952577591,
0.11085549741983414,
-0.04516007378697395,
0.06142033636569977,
0.0950445830821991,
0.12857256829738617,
0.14591844379901886,
-0.03336963430047035,
0.0039040606934577227,
0.060178522020578384,
0.04576301947236061,
-0.09602725505828857,
0.08065425604581833,
0.04137787967920303,
-0.10878816246986389,
0.059969738125801086,
-0.006410744972527027,
0.11466673761606216,
-0.011897564865648746,
-0.03039049357175827,
-0.04847527667880058,
-0.06522040069103241,
0.014742196537554264,
-0.00003665557596832514,
0.06031685695052147,
-0.07110828161239624,
-0.08342312276363373,
0.07971829175949097,
0.17539626359939575,
-0.12448416650295258,
-0.013677318580448627,
-0.08883719146251678,
0.053995877504348755,
-0.06689044088125229,
0.016637474298477173,
-0.16459496319293976,
0.017274750396609306,
0.05906733497977257,
-0.02538585104048252,
0.06880659610033035,
0.10726752877235413,
0.013916055671870708,
0.05766013637185097,
0.01241363026201725,
-0.0068663968704640865,
-0.10004106163978577,
-0.061088886111974716,
-0.06408342719078064,
-0.05308331549167633,
-0.09601792693138123,
-0.04334724694490433,
0.010235406458377838,
-0.19210171699523926,
0.01233233418315649,
0.014830218628048897,
0.019645990803837776,
0.008385860361158848,
-0.021234720945358276,
0.02264830470085144,
0.06723780930042267,
-0.05812597647309303,
-0.040237147361040115,
0.022941023111343384,
0.01607169210910797,
-0.05552004650235176,
-0.07035422325134277,
-0.10839097201824188,
-0.004616033751517534,
0.11600387841463089,
0.0482177808880806,
-0.08153356611728668,
0.03483552485704422,
-0.014748490415513515,
-0.03528739511966705,
0.015386185608804226,
-0.0633777305483818,
0.1681731790304184,
0.0020116635132580996,
0.20365776121616364,
-0.14658646285533905,
-0.022458544000983238,
-0.02444579266011715,
0.017469216138124466,
0.058143969625234604,
0.12665271759033203,
0.013182645663619041,
-0.11133864521980286,
0.06054851412773132,
-0.0015797005034983158,
-0.07289145141839981,
0.2152034342288971,
-0.04935077950358391,
-0.09491032361984253,
0.021451570093631744,
0.11087256669998169,
-0.015385436825454235,
0.15596596896648407,
-0.16565099358558655,
-0.017793038859963417,
0.010379795916378498,
0.007188964635133743,
0.06586482375860214,
-0.13258738815784454,
0.009540410712361336,
0.025248456746339798,
-0.05946890264749527,
-0.1266283392906189,
-0.02929459512233734,
-0.00031420213053934276,
0.03664032369852066,
-0.0015989212552085519,
-0.011079098097980022,
0.010610911063849926,
-0.03477859869599342,
-0.10564497113227844,
0.22733238339424133,
-0.10736040025949478,
-0.2394486367702484,
-0.2029944211244583,
0.08784694969654083,
-0.044318512082099915,
-0.011748326942324638,
0.018512137234210968,
-0.08643912523984909,
-0.06523178517818451,
-0.05128396674990654,
0.19862787425518036,
-0.10136720538139343,
-0.011439154855906963,
-0.02700318768620491,
0.06262247264385223,
0.026936298236250877,
-0.20818224549293518,
0.04038620740175247,
-0.004929660819470882,
-0.026086434721946716,
0.00008127163164317608,
-0.10008227825164795,
0.0720025822520256,
0.15457071363925934,
-0.06934704631567001,
0.016728930175304413,
-0.0031827236525714397,
0.20775839686393738,
-0.03235858306288719,
-0.07368405163288116,
0.12421199679374695,
-0.012053625658154488,
0.011869829148054123,
0.017457671463489532,
0.00029016946791671216,
-0.0958920493721962,
0.05453904718160629,
0.0029043699614703655,
-0.030435146763920784,
-0.25933733582496643,
-0.034236013889312744,
-0.08479149639606476,
0.06078444421291351,
0.06122627481818199,
0.04349476471543312,
-0.10288244485855103,
0.03390386700630188,
0.040145136415958405,
0.13815303146839142,
-0.01541947852820158,
0.0488666333258152,
0.06604524701833725,
-0.0028051624540239573,
0.010739397257566452,
-0.09623255580663681,
-0.008862213231623173,
0.07879257947206497,
0.09229514002799988,
0.2735237181186676,
-0.09641201049089432,
0.1897478848695755,
0.02985314093530178,
0.07000628113746643,
0.045057639479637146,
0.162079855799675,
-0.11478318274021149,
0.03555387258529663,
0.005669895093888044,
-0.011054586619138718,
-0.1299157291650772,
0.01754351146519184,
-0.031985774636268616,
0.06930941343307495,
-0.11473669856786728,
-0.04205126687884331,
0.00043032364919781685,
0.16271533071994781,
0.04579271748661995,
-0.23148128390312195,
-0.12132804095745087,
0.01971106417477131,
-0.112895667552948,
-0.10692214220762253,
0.05887216702103615,
0.2062440812587738,
-0.08692433685064316,
-0.015113264322280884,
-0.028837988153100014,
0.13022448122501373,
-0.048186104744672775,
-0.031193377450108528,
-0.05092030391097069,
0.0634617879986763,
0.011685854755342007,
0.13358256220817566,
-0.22818253934383392,
0.14107674360275269,
-0.0063408734276890755,
0.06683757156133652,
-0.034212902188301086,
0.06850168108940125,
-0.035947613418102264,
0.047201596200466156,
0.0417213998734951,
-0.00735145527869463,
-0.00787088368088007,
-0.18448743224143982,
-0.007978860288858414,
0.03623652458190918,
0.04080045223236084,
0.04530610889196396,
0.08089736849069595,
-0.016048526391386986,
0.05129440501332283,
-0.010935354977846146,
-0.14545850455760956,
-0.05639221519231796,
-0.10718121379613876,
-0.0127249201759696,
-0.062364280223846436,
-0.02345796674489975,
-0.05474042892456055,
-0.011940798722207546,
0.09464740008115768,
0.187881201505661,
-0.10469520837068558,
-0.09657072275876999,
-0.08579656481742859,
0.06997358053922653,
0.13605020940303802,
-0.07318390905857086,
0.06233454495668411,
-0.005854398477822542,
0.02815234288573265,
0.0018803643761202693,
-0.10063645243644714,
0.05201372131705284,
-0.03298619017004967,
-0.04928631708025932,
-0.01695292256772518,
0.09871286153793335,
0.0026047108694911003,
0.02801067940890789,
-0.0020500277169048786,
-0.0734955444931984,
-0.04683074355125427,
-0.11686298996210098,
-0.13430708646774292,
-0.04764192923903465,
-0.002939388854429126,
0.06685362011194229,
-0.12358565628528595,
-0.040573492646217346,
-0.01698361337184906,
-0.016085287556052208,
0.14523963630199432,
0.14593511819839478,
-0.06352272629737854,
0.02855144813656807,
0.10392400622367859,
-0.054358117282390594,
-0.1764521300792694,
0.010154741816222668,
0.06480003893375397,
0.11243562400341034,
-0.03914045915007591,
-0.2013891041278839,
0.056506212800741196,
0.026637045666575432,
0.0430036336183548,
0.06683412939310074,
-0.3286319673061371,
-0.11593157052993774,
0.07313422858715057,
0.12998652458190918,
0.06945309042930603,
-0.11019349843263626,
-0.033445242792367935,
-0.05053773522377014,
-0.1104264184832573,
0.10306975245475769,
-0.01631402224302292,
0.13555443286895752,
-0.04033944755792618,
0.07162244617938995,
0.03915278986096382,
-0.04226203262805939,
0.07481670379638672,
0.010112431831657887,
0.09575416892766953,
-0.033024292439222336,
0.02329859882593155,
0.11304613947868347,
-0.025966867804527283,
0.17437142133712769,
-0.14142799377441406,
0.10456793010234833,
-0.24280421435832977,
-0.06744580715894699,
-0.06579912453889847,
0.014848874881863594,
-0.0313543900847435,
-0.04542389139533043,
-0.07304742187261581,
0.021528035402297974,
0.0031660411041229963,
-0.008582225069403648,
0.02341715432703495,
-0.02242865227162838,
-0.013422108255326748,
0.07883839309215546,
0.10023270547389984,
0.0043946984224021435,
-0.10973527282476425,
0.04689881205558777,
0.05092036724090576,
0.0928245261311531,
-0.2095898985862732,
0.02309037558734417,
0.11608447879552841,
0.03472393751144409,
0.1117841899394989,
0.04811183363199234,
-0.10496138781309128,
0.04452789947390556,
0.0906825140118599,
-0.08394704014062881,
-0.09863828122615814,
-0.012153795920312405,
-0.04212256148457527,
-0.08305464684963226,
0.04464137926697731,
0.09905561059713364,
-0.047255173325538635,
-0.023673949763178825,
-0.022216733545064926,
-0.02791915275156498,
-0.11412452906370163,
0.20422764122486115,
0.05229208245873451,
0.0824371948838234,
-0.07179530709981918,
0.06266943365335464,
0.08075465261936188,
-0.07896032929420471,
0.00741643738001585,
0.19029220938682556,
-0.10605517029762268,
-0.04098667576909065,
0.03169678524136543,
0.14725717902183533,
-0.033505722880363464,
-0.04135393351316452,
-0.11249362677335739,
-0.07335494458675385,
0.02627604454755783,
0.14743219316005707,
0.08197802305221558,
0.10388940572738647,
-0.0574665367603302,
-0.000272867560852319,
-0.09669068455696106,
0.07467467337846756,
0.08900252729654312,
0.0316467359662056,
-0.12247245758771896,
0.15928006172180176,
0.04406104236841202,
0.10073579847812653,
-0.026431547477841377,
-0.01818173937499523,
-0.11072660982608795,
0.062071073800325394,
-0.08857843279838562,
0.027243928983807564,
-0.01653030514717102,
0.04969386011362076,
-0.013522550463676453,
0.0033075471874326468,
-0.0258657094091177,
0.060274288058280945,
-0.08957472443580627,
-0.007814121432602406,
-0.0010935005266219378,
0.026998016983270645,
-0.041166771203279495,
-0.004292760975658894,
0.03911120817065239,
-0.09418732672929764,
0.11711173504590988,
-0.0011582843726500869,
-0.024097008630633354,
0.09551656991243362,
-0.03907826170325279,
0.031362492591142654,
0.01184187363833189,
0.052846021950244904,
0.011799158528447151,
0.030470769852399826,
0.07929784059524536,
0.02360164001584053,
0.062193892896175385,
0.01448872871696949,
0.10343070328235626,
-0.14138470590114594,
-0.10551101714372635,
-0.03476253151893616,
-0.10301797091960907,
-0.05824839323759079,
0.0923217236995697,
0.041243188083171844,
0.09790223836898804,
0.10642702877521515,
-0.031159643083810806,
0.02246197499334812,
-0.13040821254253387,
-0.06178969889879227,
0.017780594527721405,
-0.031251680105924606,
-0.06680287420749664,
-0.054162394255399704,
0.0416620671749115,
-0.029206929728388786,
0.11841768771409988,
0.003449004841968417,
0.045974183827638626,
-0.02466440759599209,
-0.05275362730026245,
0.03858601674437523,
0.022252662107348442,
0.23914194107055664,
-0.06227409839630127,
0.03704405575990677,
-0.011064461432397366,
0.00549447163939476,
0.003657330758869648,
0.11858224868774414,
0.10549727827310562,
0.1418120414018631,
-0.036384809762239456,
0.08858145028352737,
0.01622360572218895,
-0.00925953034311533,
-0.10890526324510574,
-0.003988899756222963,
0.01571788266301155,
0.07329916954040527,
-0.05682661384344101,
0.18608711659908295,
0.07670238614082336,
-0.09387428313493729,
0.10498609393835068,
0.0298386812210083,
-0.12953995168209076,
-0.04359458386898041,
0.005693093407899141,
-0.021664246916770935,
-0.13787123560905457,
0.029170332476496696,
-0.12222278118133545,
-0.016529392451047897,
0.08606904000043869,
0.05307496339082718,
-0.06573833525180817,
0.17415639758110046,
0.035929590463638306,
-0.07445555925369263,
0.04915487393736839,
0.0074262237176299095,
0.023765265941619873,
0.03639981523156166,
0.020892266184091568,
0.026821104809641838,
-0.0319216288626194,
0.049069467931985855,
0.018249616026878357,
-0.03469642251729965,
0.004668070934712887,
-0.019536370411515236,
0.0018994379788637161,
-0.027798311784863472,
0.02629968710243702,
0.04259876534342766,
0.167662113904953,
0.030252814292907715,
-0.08238139748573303,
-0.04440899193286896,
0.1752956509590149,
-0.04351768642663956,
-0.08147910237312317,
-0.1357429474592209,
0.19678232073783875,
0.026607664301991463,
0.021314818412065506,
0.02238965407013893,
-0.09061186015605927,
-0.04969240725040436,
0.1955851912498474,
0.08003530651330948,
-0.010162165388464928,
-0.027837010100483894,
-0.008616289123892784,
-0.005680339876562357,
-0.051820870488882065,
0.20517602562904358,
0.024329259991645813,
0.2566232681274414,
0.006513123866170645,
-0.008133361116051674,
-0.06521210819482803,
-0.027133097872138023,
-0.01856246031820774,
0.1261267066001892,
-0.04257075861096382,
-0.026835031807422638,
-0.08440776914358139,
0.019293898716568947,
0.00006685755215585232,
-0.08664009720087051,
0.06859119236469269,
-0.11918944865465164,
-0.10119911283254623,
-0.03845910727977753,
0.023949380964040756,
-0.03806367516517639,
0.03162972629070282,
-0.03847446292638779,
0.03932004049420357,
0.056392498314380646,
-0.019029492512345314,
-0.12279044836759567,
-0.16794070601463318,
0.1048850566148758,
-0.042478639632463455,
0.14667174220085144,
-0.008986355736851692,
0.12145175039768219,
0.09236729145050049,
0.03576464205980301,
-0.05070514976978302,
0.08789833635091782,
0.040244489908218384,
0.05009396746754646,
0.04402754455804825,
0.1223485991358757,
-0.04793626815080643,
0.16442973911762238,
-0.05775558203458786,
-0.02718343399465084,
-0.017720520496368408,
-0.0640610083937645,
0.0029247936327010393,
-0.15453487634658813,
-0.01671643927693367,
-0.1043972447514534,
0.09545664489269257,
0.20869967341423035,
-0.0496600866317749,
-0.021763915196061134,
-0.10167241096496582,
0.09225202351808548,
-0.009449124336242676,
0.048524729907512665,
-0.03797844052314758,
-0.1910392791032791,
-0.014320241287350655,
-0.010758673772215843,
0.005175073631107807,
-0.26635682582855225,
-0.0064634052105247974,
-0.03895251452922821,
-0.021669775247573853,
-0.07590482383966446,
0.165268212556839,
0.08227937668561935,
0.042866114526987076,
-0.0351051464676857,
-0.13304832577705383,
-0.04000761732459068,
0.060607265681028366,
-0.12680849432945251,
-0.12631608545780182
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the python code snippets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
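The card notes that tokenized input tends to work better. As a minimal sketch (not part of the original CodeTrans tooling), raw python source can be space-tokenized with the standard library before calling the pipeline; whether this exactly matches the tokenization used to build the training data is an assumption:
```python
# Hedged sketch: space-tokenize raw python source with the standard library.
# `space_tokenize` is a hypothetical helper, not CodeTrans code, and matching
# the training-time tokenization is an assumption.
import tokenize
from io import BytesIO

def space_tokenize(source: str) -> str:
    skip = {tokenize.ENCODING, tokenize.NEWLINE, tokenize.NL,
            tokenize.INDENT, tokenize.DEDENT, tokenize.ENDMARKER}
    tokens = tokenize.tokenize(BytesIO(source.encode("utf-8")).readline)
    return " ".join(tok.string for tok in tokens if tok.type not in skip)

raw_code = "def add(a, b):\n    return a + b\n"
pipeline([space_tokenize(raw_code)])
```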
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
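For intuition, an inverse square root schedule holds the rate flat through warm-up and then decays it proportionally to 1/sqrt(step). A minimal sketch follows; the peak rate and warm-up length are illustrative assumptions, not hyperparameters reported for this model:
```python
import math

# Hedged sketch of an inverse square root schedule; peak_lr and warmup_steps
# are illustrative assumptions, not the values used to train this checkpoint.
def inverse_sqrt_lr(step: int, peak_lr: float = 1e-2, warmup_steps: int = 10_000) -> float:
    # Constant at peak_lr during warm-up, then decays as 1/sqrt(step).
    return peak_lr * math.sqrt(warmup_steps / max(step, warmup_steps))
```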
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
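Scores like those above are typically computed with corpus-level BLEU. The exact evaluation script and smoothing settings used for CodeTrans are not specified here, so the following is only a rough sketch using NLTK:
```python
# Hedged sketch: corpus-level BLEU with NLTK. The actual CodeTrans evaluation
# script and its smoothing choice are assumptions, not documented here.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

references = [["open the file and write each line".split()]]
hypotheses = ["open a file and write lines".split()]
score = corpus_bleu(references, hypotheses,
                    smoothing_function=SmoothingFunction().method4)
print(f"BLEU: {100 * score:.2f}")
```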
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_python_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the python code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.07775121182203293,
0.07865709066390991,
-0.0013889130204916,
0.10578306764364243,
0.05742204189300537,
0.023086827248334885,
0.04704305902123451,
0.0987197607755661,
-0.04594762623310089,
0.06519899517297745,
0.07809405773878098,
-0.030575037002563477,
0.06460621953010559,
0.17980673909187317,
0.03398878499865532,
-0.18953852355480194,
-0.016877388581633568,
0.026403067633509636,
-0.03869783505797386,
0.1076657772064209,
0.0929504930973053,
-0.0776035338640213,
0.06434310227632523,
-0.03731834888458252,
-0.1144300177693367,
0.055822692811489105,
-0.04345965012907982,
-0.04792356491088867,
0.09194375574588776,
0.044715650379657745,
0.11889467388391495,
-0.0386212095618248,
0.07972236722707748,
-0.20494936406612396,
-0.005468621850013733,
0.024444224312901497,
0.05873097479343414,
0.02047574147582054,
0.06563001871109009,
0.05923500284552574,
0.1360846757888794,
-0.03969186544418335,
0.05641045421361923,
0.046089041978120804,
-0.06380799412727356,
-0.09610255062580109,
-0.0510561428964138,
0.0729348361492157,
0.10799168795347214,
0.08925054967403412,
-0.01682676188647747,
0.026384899392724037,
-0.07524813711643219,
0.09019945561885834,
0.10838207602500916,
-0.21479259431362152,
-0.021549738943576813,
0.0979202389717102,
0.09114880114793777,
0.05699664726853371,
-0.07285437732934952,
-0.041320834308862686,
0.10324373096227646,
0.038007862865924835,
0.04385429993271828,
-0.08289790153503418,
-0.06144491583108902,
-0.013280077837407589,
-0.06234943866729736,
-0.05499663203954697,
0.13807904720306396,
0.03449327498674393,
-0.06187477335333824,
-0.09492365270853043,
-0.04473763704299927,
-0.19230270385742188,
0.04424702003598213,
0.035490017384290695,
0.012414230965077877,
-0.013641556724905968,
0.012102299369871616,
0.007228025235235691,
-0.08953005075454712,
-0.09331180155277252,
-0.0020770940463989973,
0.07791968435049057,
0.07888913154602051,
0.028754930943250656,
-0.01644003950059414,
0.08280576020479202,
-0.028722520917654037,
-0.05145134776830673,
-0.025108158588409424,
0.01591918244957924,
-0.12506750226020813,
0.018960976973176003,
-0.013746865093708038,
-0.060249775648117065,
-0.014335596933960915,
0.11317503452301025,
-0.053249459713697433,
0.08029916882514954,
0.12392830848693848,
-0.002292544348165393,
0.0077073946595191956,
0.23179304599761963,
0.03988126665353775,
-0.1446172297000885,
0.0021487807389348745,
0.022851308807730675,
-0.002340460428968072,
-0.004578819032758474,
-0.057542093098163605,
-0.03484285622835159,
0.012217964045703411,
0.06140339747071266,
-0.11893480271100998,
0.0255214162170887,
-0.03579145297408104,
-0.0059279282577335835,
0.012070751748979092,
-0.1313648819923401,
0.0401146337389946,
0.006982236634939909,
-0.06216743215918541,
-0.029583726078271866,
0.061764493584632874,
-0.11661221086978912,
-0.11586892604827881,
0.04888073727488518,
-0.05330589786171913,
-0.03064698912203312,
-0.12135004997253418,
-0.13426126539707184,
-0.01550909411162138,
-0.05035733804106712,
0.02164609357714653,
-0.10438571125268936,
-0.10222335159778595,
-0.00924505852162838,
0.034049198031425476,
0.0004356396966613829,
-0.01737072318792343,
-0.05027705803513527,
0.012052062898874283,
-0.0023262780159711838,
-0.018411066383123398,
0.013233590871095657,
-0.04157642647624016,
0.09172309190034866,
0.10317142307758331,
0.05284969508647919,
0.01302305143326521,
0.026609687134623528,
-0.07308376580476761,
0.0711757242679596,
-0.05360066890716553,
0.055627960711717606,
-0.0248737633228302,
0.069066122174263,
-0.09578311443328857,
-0.08837007731199265,
0.05167136713862419,
0.04612061008810997,
0.05881451815366745,
0.017316646873950958,
-0.10764490067958832,
0.017833037301898003,
0.1526677906513214,
-0.09246695041656494,
-0.1226080134510994,
0.11665211617946625,
-0.005147205665707588,
0.004279215820133686,
0.0765942931175232,
0.11961399763822556,
0.1469002515077591,
-0.0959477573633194,
-0.04877112805843353,
0.08559176325798035,
0.07239534705877304,
-0.03560003265738487,
0.10128641128540039,
0.011910658329725266,
0.026125524193048477,
0.023056866601109505,
0.04652709141373634,
0.06661120057106018,
-0.004803861491382122,
-0.03272034972906113,
-0.025580869987607002,
-0.07693123072385788,
-0.01654156856238842,
-0.018729733303189278,
0.018853917717933655,
-0.07359135895967484,
-0.08125059306621552,
0.019276101142168045,
0.18200770020484924,
-0.09922318905591965,
0.021928897127509117,
-0.09596199542284012,
-0.03975768759846687,
-0.08974935859441757,
0.011728591285645962,
-0.1003379300236702,
0.013884137384593487,
0.046395402401685715,
-0.0640590488910675,
0.07132840901613235,
0.06040504202246666,
-0.0018783495761454105,
0.034323811531066895,
-0.03432026877999306,
-0.04229487478733063,
-0.05341583117842674,
-0.05377829447388649,
-0.12428917735815048,
-0.004244414623826742,
-0.09021526575088501,
-0.026134992018342018,
-0.047068025916814804,
-0.1720574051141739,
0.01598389446735382,
-0.024921663105487823,
0.017752762883901596,
0.002300976077094674,
-0.023332560434937477,
0.022698983550071716,
0.05556647852063179,
-0.058689069002866745,
-0.09171506017446518,
0.002328475471585989,
0.015644391998648643,
-0.11319100111722946,
-0.06618794798851013,
-0.1291716992855072,
-0.06456746906042099,
0.07774503529071808,
0.0941644012928009,
-0.0889095813035965,
0.035021934658288956,
-0.019229058176279068,
-0.05196673050522804,
-0.04913296177983284,
-0.0759953036904335,
0.17796167731285095,
0.009899882599711418,
0.16903473436832428,
-0.1301777958869934,
-0.0506008006632328,
-0.03428533673286438,
-0.02081519179046154,
0.021634304895997047,
0.14554019272327423,
0.006924157030880451,
-0.08539526909589767,
0.04678231105208397,
-0.019756166264414787,
-0.07232528179883957,
0.14755593240261078,
-0.003467520233243704,
-0.09629261493682861,
0.014951817691326141,
0.09551466256380081,
-0.009494469501078129,
0.1382831484079361,
-0.08244000375270844,
-0.00827891193330288,
-0.0017987136961892247,
0.025759823620319366,
0.040843717753887177,
-0.13057008385658264,
0.032530058175325394,
0.06636853516101837,
-0.05448569357395172,
-0.07963424175977707,
-0.04296162724494934,
-0.0359070710837841,
0.0338020958006382,
0.006929121445864439,
0.0010838633170351386,
-0.018502242863178253,
-0.019582560285925865,
-0.09401142597198486,
0.21251770853996277,
-0.0872294083237648,
-0.21082305908203125,
-0.16982951760292053,
0.0069511886686086655,
-0.04219883680343628,
-0.00812384020537138,
0.046654365956783295,
-0.11282122880220413,
-0.06786070019006729,
-0.08712920546531677,
0.15298065543174744,
-0.14217892289161682,
0.010172774083912373,
-0.010777299292385578,
0.032284386456012726,
0.03335198014974594,
-0.17648877203464508,
0.03262656554579735,
0.002444620244204998,
-0.01030288077890873,
0.006848393939435482,
-0.06477849185466766,
0.0817561000585556,
0.1310827136039734,
-0.08742377161979675,
0.009908760897815228,
-0.00822541769593954,
0.17141368985176086,
-0.05252472683787346,
0.01517362892627716,
0.18322397768497467,
0.011946791782975197,
0.04128023236989975,
0.048000819981098175,
0.012759164907038212,
-0.09348851442337036,
0.06155436858534813,
0.047376781702041626,
-0.036631498485803604,
-0.23869343101978302,
-0.005688754841685295,
-0.06982190161943436,
0.056449975818395615,
0.1341923326253891,
0.045970771461725235,
-0.14588169753551483,
0.040123917162418365,
-0.00877140462398529,
0.14918917417526245,
-0.03483026102185249,
0.06881319731473923,
0.010628240182995796,
0.015308437868952751,
0.009753940626978874,
-0.09448748826980591,
-0.002219999907538295,
0.07135521620512009,
0.11138558387756348,
0.20978564023971558,
-0.05294886976480484,
0.17916454374790192,
0.012897237204015255,
0.07036131620407104,
0.027867663651704788,
0.11452647298574448,
-0.12172035127878189,
0.0029275778215378523,
0.009692143648862839,
-0.004602604079991579,
-0.06485836207866669,
0.04923677444458008,
-0.018348852172493935,
0.06542827934026718,
-0.06602061539888382,
0.02453787252306938,
0.011583473533391953,
0.14840368926525116,
0.07938762754201889,
-0.2022012621164322,
-0.1145474910736084,
0.019810877740383148,
-0.11547896265983582,
-0.11868278682231903,
0.06226380914449692,
0.18029090762138367,
-0.05707602575421333,
0.0289307851344347,
-0.015512033365666866,
0.13527724146842957,
-0.07356707751750946,
-0.02857757918536663,
0.014846386387944221,
0.06476020067930222,
0.003548843553289771,
0.12823821604251862,
-0.24100685119628906,
0.0791773796081543,
0.011511162854731083,
0.09931610524654388,
-0.017793696373701096,
0.07545095682144165,
-0.039965346455574036,
0.0014356112806126475,
0.06787870079278946,
-0.0007232636562548578,
-0.07432500272989273,
-0.2200891524553299,
-0.052385542541742325,
0.025232259184122086,
0.06352512538433075,
-0.008888683281838894,
0.10787052661180496,
-0.021062513813376427,
0.07036679983139038,
-0.030861960723996162,
-0.13070344924926758,
-0.04466405138373375,
-0.13897471129894257,
-0.023396510630846024,
-0.011478628031909466,
-0.021480420604348183,
-0.029064971953630447,
0.023306550458073616,
0.00005873050758964382,
0.2096034735441208,
-0.1635502427816391,
-0.10681858658790588,
-0.09232717007398605,
0.08214017003774643,
0.13739824295043945,
-0.10073421895503998,
0.028255153447389603,
0.015600007958710194,
0.05029180273413658,
-0.039336904883384705,
-0.059812966734170914,
0.01727752946317196,
-0.05476367473602295,
-0.0725068524479866,
-0.022792942821979523,
0.09206194430589676,
-0.019130509346723557,
0.050385959446430206,
0.004822873510420322,
-0.07500922679901123,
-0.05763247609138489,
-0.1275956779718399,
-0.09552177041769028,
-0.016711749136447906,
0.023313116282224655,
0.011078264564275742,
-0.08492693305015564,
0.08149846643209457,
-0.0246726181358099,
-0.08012120425701141,
0.08860191702842712,
0.1951596438884735,
-0.06100618839263916,
0.0014886991120874882,
0.09373541176319122,
-0.055735327303409576,
-0.15795190632343292,
-0.060104742646217346,
0.05431521311402321,
0.08555134385824203,
-0.028562840074300766,
-0.14876297116279602,
0.0769738256931305,
0.03939494490623474,
0.03500140458345413,
0.016546132043004036,
-0.30011269450187683,
-0.12603308260440826,
0.05509943887591362,
0.06550819426774979,
0.04009636491537094,
-0.11691351234912872,
-0.03332207351922989,
-0.06448044627904892,
-0.0665322095155716,
0.037035323679447174,
0.058943599462509155,
0.13334162533283234,
-0.033154748380184174,
0.05645930394530296,
0.03362215682864189,
-0.026073496788740158,
0.10551223158836365,
0.0008902748813852668,
0.08915550261735916,
-0.0202062726020813,
0.015281562693417072,
0.05893857777118683,
-0.06268741190433502,
0.17441107332706451,
-0.16815686225891113,
0.08329594880342484,
-0.23386380076408386,
-0.05290830880403519,
-0.006905007641762495,
-0.007090937811881304,
-0.0377146415412426,
-0.06120387464761734,
-0.10877364128828049,
-0.0020403799135237932,
0.05839087814092636,
-0.02956072986125946,
0.06607743352651596,
-0.017756951972842216,
-0.04727235808968544,
0.03437688201665878,
0.07731153815984726,
-0.013980381190776825,
-0.1513807475566864,
0.02691034786403179,
0.031035667285323143,
0.08054950833320618,
-0.19851794838905334,
0.013887996785342693,
0.12485247850418091,
0.02281068079173565,
0.10675114393234253,
0.017554203048348427,
-0.07559891045093536,
0.044641755521297455,
0.07389302551746368,
-0.031125972047448158,
-0.09522342681884766,
0.0008903517154976726,
-0.021204335615038872,
-0.10522863268852234,
0.02977927029132843,
0.08682127296924591,
-0.05505163595080376,
-0.022966869175434113,
-0.015146065503358841,
0.003964628558605909,
-0.0722397118806839,
0.20326673984527588,
0.02301710471510887,
0.08590301871299744,
-0.06121247261762619,
0.07448042184114456,
0.10233356058597565,
-0.10537297278642654,
0.008653325028717518,
0.16598360240459442,
-0.07507362961769104,
-0.019054213538765907,
0.062165822833776474,
0.08118098229169846,
-0.05354650318622589,
-0.054221637547016144,
-0.08477050065994263,
-0.06806840747594833,
0.010787924751639366,
0.018777282908558846,
0.06735536456108093,
0.07122284919023514,
-0.039590295404195786,
0.016416549682617188,
-0.1092224270105362,
0.09878689795732498,
0.0824136957526207,
0.05366200953722,
-0.14644142985343933,
0.1664385199546814,
0.034044526517391205,
0.08236460387706757,
0.004029769450426102,
0.031966205686330795,
-0.09369168430566788,
0.04586470127105713,
-0.02181876078248024,
0.03872179239988327,
-0.010838079266250134,
0.05011492222547531,
-0.01626467891037464,
0.02698041871190071,
-0.02922016568481922,
0.042856816202402115,
-0.044951196759939194,
-0.03518470376729965,
-0.02659602276980877,
0.028597405180335045,
-0.04784535989165306,
-0.014914215542376041,
0.014507312327623367,
-0.08311234414577484,
0.09500335901975632,
-0.06012585386633873,
-0.001781183760613203,
0.010654681362211704,
0.0029843549709767103,
0.06072637066245079,
0.024140886962413788,
0.054246965795755386,
-0.01090827863663435,
-0.013453684747219086,
0.04447308927774429,
0.008777959272265434,
-0.00011216104030609131,
-0.0022640496026724577,
0.041305337101221085,
-0.14635467529296875,
-0.10199954360723495,
-0.09575647115707397,
-0.06915877014398575,
-0.0663139745593071,
0.07768315821886063,
0.07113230973482132,
0.07159360498189926,
0.10625938326120377,
-0.031062575057148933,
0.011252795346081257,
-0.1291019469499588,
-0.04076692834496498,
0.04367891699075699,
-0.014064996503293514,
-0.07942650467157364,
-0.04447948560118675,
0.054054006934165955,
-0.04388432577252388,
0.11700563132762909,
-0.011898963712155819,
0.04078146070241928,
-0.012069945223629475,
-0.04527544230222702,
-0.0005822055391035974,
-0.0019919807091355324,
0.22935160994529724,
-0.08409473299980164,
0.012112337164580822,
-0.010176151059567928,
-0.005791195202618837,
0.04603361338376999,
0.1407906413078308,
0.08916831016540527,
0.12807980179786682,
0.043907281011343,
0.09799014031887054,
-0.05600516125559807,
-0.03166724368929863,
-0.17407821118831635,
0.02806924283504486,
-0.0014981356216594577,
0.039326127618551254,
-0.02464940771460533,
0.11309143155813217,
0.14041469991207123,
-0.1201774924993515,
0.09000570327043533,
0.024869956076145172,
-0.10247232764959335,
-0.05066542327404022,
-0.05804825574159622,
-0.04783853143453598,
-0.08653201907873154,
0.02136881649494171,
-0.1086200624704361,
0.024343742057681084,
0.08678246289491653,
0.04079975560307503,
-0.021590502932667732,
0.15391306579113007,
-0.02472381666302681,
-0.07201419770717621,
0.009449619799852371,
0.02884901873767376,
0.041159242391586304,
0.10927315801382065,
0.01100321114063263,
0.06369937211275101,
-0.06130128726363182,
0.08732330054044724,
0.01785537227988243,
0.0012350105680525303,
0.020103106275200844,
-0.010526536032557487,
-0.0034014838747680187,
-0.04601716250181198,
-0.000445593090262264,
0.08143122494220734,
0.16264040768146515,
0.03840557858347893,
-0.05418761819601059,
-0.05469613894820213,
0.17690251767635345,
-0.055890560150146484,
-0.08116990327835083,
-0.12036475539207458,
0.17591407895088196,
0.05610084533691406,
0.025874432176351547,
0.011934252455830574,
-0.08713296800851822,
-0.04884737730026245,
0.21962909400463104,
0.019634241238236427,
-0.003722574096173048,
-0.040500227361917496,
-0.013970707543194294,
-0.008469329215586185,
-0.03885078802704811,
0.14632824063301086,
0.013676347211003304,
0.21253155171871185,
0.0009026295156218112,
0.0008707867236807942,
-0.03768246993422508,
-0.02803480252623558,
-0.047513071447610855,
0.18809543550014496,
-0.03288482874631882,
0.028408586978912354,
-0.09727191179990768,
-0.00019308672926854342,
0.04459976404905319,
-0.10589291155338287,
0.09776405245065689,
-0.09135112911462784,
-0.07677725702524185,
0.030849896371364594,
0.07463368773460388,
-0.027238789945840836,
0.04212542995810509,
-0.014924252405762672,
0.04731672629714012,
0.02233811281621456,
-0.02441198192536831,
-0.10123255103826523,
-0.13150177896022797,
0.058707769960165024,
-0.009708406403660774,
0.15833184123039246,
0.024750415235757828,
0.08204425871372223,
0.09579663723707199,
0.010130329057574272,
-0.06978002935647964,
0.10988981276750565,
0.03701076656579971,
0.023033400997519493,
0.07904373109340668,
0.12941695749759674,
-0.0410519577562809,
0.13637009263038635,
0.0013938590418547392,
-0.026057444512844086,
-0.032248690724372864,
-0.02177315019071102,
0.0049093617126345634,
-0.14736002683639526,
-0.005955118220299482,
-0.06476914137601852,
0.12740303575992584,
0.19672535359859467,
-0.05428023263812065,
-0.02220086008310318,
-0.040077708661556244,
0.06570661067962646,
-0.009967135265469551,
0.0854092612862587,
-0.006914926692843437,
-0.17098061740398407,
0.005894953850656748,
-0.044718414545059204,
0.01242811419069767,
-0.1785638928413391,
-0.04115552827715874,
-0.03461626544594765,
-0.03518420085310936,
-0.087615966796875,
0.14891506731510162,
0.07351820915937424,
0.018289588391780853,
-0.045381754636764526,
-0.20666873455047607,
-0.02894435077905655,
0.049852319061756134,
-0.14408299326896667,
-0.12297967076301575
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the python code snippets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_python_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
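Because Transformers pipelines forward standard generation keyword arguments, several tokenized snippets can also be summarized in one call; the beam count and length cap below are illustrative assumptions, not recommended values:
```python
# Hedged sketch: batch several tokenized snippets through the same pipeline.
# num_beams and max_length are illustrative assumptions.
snippets = [
    "def square ( x ) : return x * x",
    "for i in range ( 10 ) : print ( i )",
]
for code, out in zip(snippets, pipeline(snippets, num_beams=4, max_length=32)):
    print(code, "->", out["summary_text"])
```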
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/python/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
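Transformers ships an Adafactor implementation whose relative-step mode yields an inverse-square-root-style rate; a minimal fine-tuning setup might look like the sketch below, where relying on `relative_step`/`warmup_init` is an assumption about the setup rather than the documented recipe:
```python
# Hedged sketch: Adafactor with its built-in relative-step schedule via
# transformers; this pairing is an assumption, not the documented recipe.
from transformers.optimization import Adafactor, AdafactorSchedule

optimizer = Adafactor(
    model.parameters(),   # `model` as loaded in the snippet above
    scale_parameter=True,
    relative_step=True,   # derive an inverse-sqrt-style rate internally
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```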
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_python_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the python code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.07284283638000488,
0.0787285566329956,
-0.0012506694765761495,
0.11000391840934753,
0.06413900852203369,
0.020343435928225517,
0.02786881849169731,
0.10416662693023682,
-0.05462821573019028,
0.0629790797829628,
0.0673137903213501,
-0.04170045256614685,
0.06254603713750839,
0.17467331886291504,
0.02833465114235878,
-0.20356452465057373,
-0.022124802693724632,
0.02700166031718254,
-0.05074258893728256,
0.1059933677315712,
0.08813529461622238,
-0.07043541222810745,
0.06993401050567627,
-0.043445099145174026,
-0.09985087811946869,
0.058775290846824646,
-0.03686055913567543,
-0.04499560967087746,
0.09265241026878357,
0.05241146311163902,
0.12107149511575699,
-0.0461537167429924,
0.07123004645109177,
-0.2117733508348465,
-0.005127894226461649,
0.03351346403360367,
0.05647440627217293,
0.02327323891222477,
0.05790223926305771,
0.050931431353092194,
0.15254440903663635,
-0.03524662181735039,
0.058469586074352264,
0.05051521956920624,
-0.06816063821315765,
-0.09869077801704407,
-0.04572319611907005,
0.05191446468234062,
0.09797479957342148,
0.10039372742176056,
-0.01509214285761118,
0.0204028207808733,
-0.07643377035856247,
0.08841951191425323,
0.10971101373434067,
-0.2183958739042282,
-0.018051108345389366,
0.10283449292182922,
0.0918440893292427,
0.06267451494932175,
-0.07864245772361755,
-0.04071720317006111,
0.10210435092449188,
0.039123695343732834,
0.055303823202848434,
-0.08369650691747665,
-0.06551859527826309,
-0.0144934868440032,
-0.06373247504234314,
-0.052780382335186005,
0.13688664138317108,
0.02595278061926365,
-0.05550451576709747,
-0.09639307111501694,
-0.05612049624323845,
-0.19529810547828674,
0.040470171719789505,
0.03144686669111252,
0.01690910942852497,
0.0008532206993550062,
0.0016085499664768577,
-0.0006365034496411681,
-0.08636049181222916,
-0.09138907492160797,
-0.0032872462179511786,
0.061544567346572876,
0.07308170944452286,
0.03115534782409668,
-0.021899303421378136,
0.08369849622249603,
-0.03215861693024635,
-0.05052817612886429,
-0.024817541241645813,
0.01635175198316574,
-0.11383804678916931,
0.023535259068012238,
-0.011561887338757515,
-0.05924190208315849,
-0.01663581281900406,
0.1132819801568985,
-0.06422971934080124,
0.08204939216375351,
0.11576145887374878,
-0.00020504696294665337,
0.0027301423251628876,
0.2376897782087326,
0.04268183931708336,
-0.15917500853538513,
0.01116883009672165,
0.01411800179630518,
0.002714704954996705,
-0.003215587232261896,
-0.0626097097992897,
-0.0405455119907856,
0.011150275357067585,
0.06562585383653641,
-0.11763990670442581,
0.029587963595986366,
-0.03819245845079422,
-0.0003284089616499841,
0.0141298184171319,
-0.12765365839004517,
0.04119968041777611,
0.003374379361048341,
-0.06654703617095947,
-0.018087556585669518,
0.07299966365098953,
-0.1205974668264389,
-0.11748652160167694,
0.04796978458762169,
-0.050551556050777435,
-0.032761022448539734,
-0.12815919518470764,
-0.13691358268260956,
-0.018818242475390434,
-0.046817563474178314,
0.01655704155564308,
-0.10731467604637146,
-0.10373760759830475,
-0.007133445702493191,
0.036521121859550476,
0.0004140517849009484,
-0.0138118090108037,
-0.060036271810531616,
0.0024609952233731747,
0.005492824129760265,
-0.017271926626563072,
0.010642261244356632,
-0.04496968537569046,
0.09064788371324539,
0.09314385801553726,
0.051469672471284866,
0.0028386942576617002,
0.02841426618397236,
-0.08581762760877609,
0.07050656527280807,
-0.068105049431324,
0.06478085368871689,
-0.02014218457043171,
0.06638476252555847,
-0.09977348148822784,
-0.08951830118894577,
0.03699597716331482,
0.04896271601319313,
0.06258512288331985,
0.020314445719122887,
-0.11433922499418259,
0.01984507590532303,
0.14962288737297058,
-0.10347120463848114,
-0.1114104613661766,
0.11696787178516388,
-0.005856297444552183,
0.010372173972427845,
0.0832001343369484,
0.12308129668235779,
0.1530434489250183,
-0.09786443412303925,
-0.04530232399702072,
0.08843354135751724,
0.05683586373925209,
-0.04491650313138962,
0.08890193700790405,
0.011343028396368027,
0.023688089102506638,
0.026729699224233627,
0.04490596055984497,
0.06923726201057434,
-0.000006674749329249607,
-0.031134702265262604,
-0.030598128214478493,
-0.07682964205741882,
-0.019275188446044922,
-0.009878798387944698,
0.015282481908798218,
-0.06770933419466019,
-0.0830954760313034,
0.01991645060479641,
0.17456012964248657,
-0.10452231019735336,
0.02681228704750538,
-0.08617115765810013,
-0.038563042879104614,
-0.07739647477865219,
0.015359689481556416,
-0.10170615464448929,
0.005952855106443167,
0.04649398475885391,
-0.043421510607004166,
0.06738535314798355,
0.06169944629073143,
0.002672158181667328,
0.024523120373487473,
-0.043056879192590714,
-0.04482816159725189,
-0.04165305197238922,
-0.06306415051221848,
-0.11840606480836868,
-0.000042096409742953256,
-0.0810873806476593,
-0.022999880835413933,
-0.053736377507448196,
-0.16846615076065063,
0.010559646412730217,
-0.023849274963140488,
0.015083592385053635,
0.008462885394692421,
-0.02056610770523548,
0.017303330823779106,
0.05442340299487114,
-0.05583832785487175,
-0.09485277533531189,
0.0070036412216722965,
0.02083793841302395,
-0.10458876192569733,
-0.04770580679178238,
-0.12762488424777985,
-0.06262369453907013,
0.0795840248465538,
0.09987657517194748,
-0.09367605298757553,
0.02353476732969284,
-0.0161872748285532,
-0.04872406646609306,
-0.05408415198326111,
-0.0723596140742302,
0.19770175218582153,
0.004480772186070681,
0.17086292803287506,
-0.12901929020881653,
-0.04983929172158241,
-0.040129441767930984,
-0.016450801864266396,
0.025283554568886757,
0.14553691446781158,
0.0030010102782398462,
-0.08068550378084183,
0.04997562989592552,
-0.033913783729076385,
-0.075568288564682,
0.1527368128299713,
-0.0008738607866689563,
-0.08915664255619049,
0.013754313811659813,
0.09237276017665863,
-0.00363224558532238,
0.13960054516792297,
-0.06847646087408066,
-0.00353938527405262,
-0.006178890820592642,
0.02737222984433174,
0.04591349884867668,
-0.12682035565376282,
0.033443138003349304,
0.06181349232792854,
-0.055948562920093536,
-0.06913501024246216,
-0.044088978320360184,
-0.03901715949177742,
0.034503985196352005,
0.011274985037744045,
0.004768531769514084,
-0.022809401154518127,
-0.02222033590078354,
-0.09057509154081345,
0.21399131417274475,
-0.0932525247335434,
-0.21835173666477203,
-0.17700237035751343,
0.021251583471894264,
-0.027022631838917732,
-0.008352658711373806,
0.042598605155944824,
-0.10985521972179413,
-0.07702840119600296,
-0.0939498022198677,
0.15584373474121094,
-0.14966998994350433,
0.004338630009442568,
-0.03458639234304428,
0.04167596250772476,
0.03302430361509323,
-0.17847128212451935,
0.030731912702322006,
-0.0025604255497455597,
-0.013904494233429432,
0.0006887547788210213,
-0.06750930100679398,
0.0779884085059166,
0.13197331130504608,
-0.08485705405473709,
0.00708241481333971,
-0.014214667491614819,
0.156375452876091,
-0.05643392354249954,
0.019112603738904,
0.1787516176700592,
0.013241617009043694,
0.04395585134625435,
0.05282660946249962,
0.005874840077012777,
-0.0931820496916771,
0.06728491932153702,
0.04664986953139305,
-0.04178059101104736,
-0.23430216312408447,
-0.0149614242836833,
-0.07072550058364868,
0.06512603908777237,
0.1359270215034485,
0.044769536703825,
-0.14554664492607117,
0.03157418966293335,
-0.009908528998494148,
0.14686962962150574,
-0.024538753554224968,
0.06883861869573593,
0.01907014101743698,
0.013228921219706535,
0.01109372079372406,
-0.09342561662197113,
-0.0008046216680668294,
0.06952574104070663,
0.10421974956989288,
0.2107008695602417,
-0.07037033885717392,
0.18025806546211243,
0.00035821693018078804,
0.07875878363847733,
0.03828558325767517,
0.10536367446184158,
-0.12679658830165863,
0.00999661535024643,
0.009461808949708939,
-0.005175301805138588,
-0.06433096528053284,
0.048286352306604385,
-0.026746483519673347,
0.06557243317365646,
-0.06680291891098022,
0.026573196053504944,
0.01054302603006363,
0.152078777551651,
0.07230446487665176,
-0.19932344555854797,
-0.11080718785524368,
0.017283454537391663,
-0.11229796707630157,
-0.11372477561235428,
0.07038403302431107,
0.1931990385055542,
-0.056546811014413834,
0.02251780405640602,
-0.01611463539302349,
0.1366671919822693,
-0.08839818835258484,
-0.03435393422842026,
0.015226799063384533,
0.08011741191148758,
0.0026939078234136105,
0.12239128351211548,
-0.24596178531646729,
0.07115975022315979,
0.011224069632589817,
0.1010482981801033,
-0.009437121450901031,
0.07253055274486542,
-0.040891434997320175,
-0.004424455109983683,
0.06708846241235733,
0.0005012439796701074,
-0.07316538691520691,
-0.207969069480896,
-0.053882699459791183,
0.02399977296590805,
0.06386011838912964,
-0.0037379141431301832,
0.10219381749629974,
-0.030463693663477898,
0.061847984790802,
-0.021761056035757065,
-0.15133343636989594,
-0.036430198699235916,
-0.1451612412929535,
-0.03850992023944855,
-0.011685368604958057,
-0.012667639181017876,
-0.02812381274998188,
0.030747422948479652,
0.019105974584817886,
0.2153419703245163,
-0.15435992181301117,
-0.10358056426048279,
-0.09248357266187668,
0.0829787626862526,
0.13966244459152222,
-0.10152437537908554,
0.038966357707977295,
0.020007789134979248,
0.04934212937951088,
-0.039886485785245895,
-0.07097627967596054,
0.027695221826434135,
-0.0514075867831707,
-0.06307904422283173,
-0.021254485473036766,
0.09718836843967438,
-0.012797299772500992,
0.0486016646027565,
0.0038740751333534718,
-0.07011836767196655,
-0.05441392958164215,
-0.1253724843263626,
-0.10530663281679153,
0.0023130476474761963,
0.02553047426044941,
0.016438590362668037,
-0.08838768303394318,
0.07273732125759125,
-0.022104734554886818,
-0.07312192767858505,
0.09108705818653107,
0.17007039487361908,
-0.06711280345916748,
0.0025897978339344263,
0.08768578618764877,
-0.061712123453617096,
-0.16862212121486664,
-0.049377162009477615,
0.05241546779870987,
0.08140699565410614,
-0.03545545041561127,
-0.14644058048725128,
0.06993088126182556,
0.036910831928253174,
0.03815784677863121,
0.025249473750591278,
-0.3072279393672943,
-0.12359243631362915,
0.03976542875170708,
0.06550192087888718,
0.04886608570814133,
-0.10949306935071945,
-0.033198144286870956,
-0.06429766118526459,
-0.05581110343337059,
0.043174780905246735,
0.06624210625886917,
0.1274694949388504,
-0.03316449746489525,
0.05071939155459404,
0.038317419588565826,
-0.024411603808403015,
0.08903443813323975,
-0.009143241681158543,
0.0903664231300354,
-0.022104278206825256,
0.01950831525027752,
0.06231138855218887,
-0.06162375584244728,
0.17721068859100342,
-0.17545606195926666,
0.08496920764446259,
-0.21160519123077393,
-0.05239444971084595,
-0.007199458312243223,
-0.0015918214339762926,
-0.03203180432319641,
-0.06135004758834839,
-0.11769863218069077,
0.004430433269590139,
0.06048472225666046,
-0.028135912492871284,
0.060668472200632095,
-0.01986066810786724,
-0.051660433411598206,
0.03196697682142258,
0.07948443293571472,
-0.0076315272599458694,
-0.15078414976596832,
0.035375453531742096,
0.03192205727100372,
0.08617477118968964,
-0.2035660743713379,
0.01316914614289999,
0.12403720617294312,
0.021780068054795265,
0.10524539649486542,
0.018748540431261063,
-0.07499858736991882,
0.04277839884161949,
0.07034160196781158,
-0.028717288747429848,
-0.09726578742265701,
-0.004385167267173529,
-0.025062285363674164,
-0.09894048422574997,
0.022115210071206093,
0.08652187883853912,
-0.06305497884750366,
-0.01684682071208954,
-0.010424469597637653,
0.007833675481379032,
-0.06752333045005798,
0.1928016096353531,
0.020620042458176613,
0.0755176842212677,
-0.060243040323257446,
0.08049039542675018,
0.09782519936561584,
-0.11540412157773972,
0.012094497680664062,
0.17063383758068085,
-0.07737535983324051,
-0.016604656353592873,
0.07295578718185425,
0.08647855371236801,
-0.05604366213083267,
-0.05053623393177986,
-0.0802573561668396,
-0.06526653468608856,
0.009384503588080406,
0.029836757108569145,
0.06569105386734009,
0.07476262748241425,
-0.0469503290951252,
0.018563413992524147,
-0.11331668496131897,
0.09509878605604172,
0.08005042374134064,
0.052653416991233826,
-0.14138682186603546,
0.16338391602039337,
0.03835079073905945,
0.0635097548365593,
0.003942023031413555,
0.02712879702448845,
-0.09938941150903702,
0.04285154119133949,
-0.007363428827375174,
0.04566499963402748,
-0.007617075927555561,
0.05113515630364418,
-0.02385522425174713,
0.028415627777576447,
-0.028189590200781822,
0.04470750689506531,
-0.03855679929256439,
-0.03873004391789436,
-0.032008096575737,
0.020924534648656845,
-0.051364440470933914,
-0.016964206472039223,
0.01074624340981245,
-0.08060026913881302,
0.09075423330068588,
-0.05658198893070221,
0.0010495675960555673,
0.015167399309575558,
0.0043878937140107155,
0.05893861502408981,
0.031748950481414795,
0.05051972717046738,
-0.01099431049078703,
-0.0009504698100499809,
0.04166506603360176,
0.009696637280285358,
-0.0035408595576882362,
-0.0055005489848554134,
0.046360332518815994,
-0.1496194303035736,
-0.09334320574998856,
-0.09270842373371124,
-0.06530117988586426,
-0.06642501056194305,
0.07446839660406113,
0.07533250004053116,
0.07715263962745667,
0.10876087099313736,
-0.03740197420120239,
0.01582697406411171,
-0.1301887035369873,
-0.03846275433897972,
0.04578401520848274,
-0.015239027328789234,
-0.07438827306032181,
-0.03834855183959007,
0.055286433547735214,
-0.04149670526385307,
0.11238645762205124,
-0.0015661045908927917,
0.04869280755519867,
-0.014562852680683136,
-0.03987138345837593,
-0.00031647965079173446,
0.000347361114108935,
0.22224244475364685,
-0.0807608962059021,
0.009383101016283035,
-0.00772361783310771,
0.004791073966771364,
0.05278920382261276,
0.144008606672287,
0.09012608230113983,
0.12088444828987122,
0.047421619296073914,
0.09942714869976044,
-0.06144412234425545,
-0.026963964104652405,
-0.17066065967082977,
0.03509336709976196,
-0.009405075572431087,
0.03550281748175621,
-0.02717776969075203,
0.12469673901796341,
0.13534756004810333,
-0.12423983216285706,
0.0970243439078331,
0.027874141931533813,
-0.10280231386423111,
-0.052179671823978424,
-0.07512186467647552,
-0.047551047056913376,
-0.09987697750329971,
0.014796226285398006,
-0.11207252740859985,
0.022065479308366776,
0.08434078097343445,
0.037762727588415146,
-0.024042250588536263,
0.14776280522346497,
-0.026666300371289253,
-0.08110911399126053,
0.01774503104388714,
0.0352480448782444,
0.0419326014816761,
0.11113005876541138,
0.01532116811722517,
0.057289596647024155,
-0.06926444917917252,
0.08161415904760361,
0.02209206484258175,
0.003755000652745366,
0.022595589980483055,
-0.005694187246263027,
0.002735304180532694,
-0.04818302020430565,
0.009449400007724762,
0.07521971315145493,
0.16595672070980072,
0.04445637762546539,
-0.05731833726167679,
-0.052348289638757706,
0.19750498235225677,
-0.06025565415620804,
-0.07849051803350449,
-0.12330206483602524,
0.18401603400707245,
0.053605817258358,
0.02684961073100567,
0.007673498243093491,
-0.08647900074720383,
-0.04398440197110176,
0.22683998942375183,
0.029671339318156242,
-0.006465904880315065,
-0.0376395545899868,
-0.018443340435624123,
-0.010469463653862476,
-0.030338900163769722,
0.14803601801395416,
0.020779786631464958,
0.23174802958965302,
-0.0009205652168020606,
-0.010357246734201908,
-0.038225091993808746,
-0.027095766738057137,
-0.04012434184551239,
0.1904466301202774,
-0.039350688457489014,
0.02642730250954628,
-0.09717654436826706,
-0.0030050347559154034,
0.029961412772536278,
-0.10333951562643051,
0.09409322589635849,
-0.10498923063278198,
-0.0701335072517395,
0.023381683975458145,
0.07255809754133224,
-0.030796073377132416,
0.04525953158736229,
-0.010524739511311054,
0.05094172805547714,
0.025257224217057228,
-0.019865941256284714,
-0.11407642066478729,
-0.1423884630203247,
0.0457589216530323,
-0.009836643002927303,
0.14495381712913513,
0.020859986543655396,
0.06769980490207672,
0.09460512548685074,
0.0211125910282135,
-0.06796444952487946,
0.10402464121580124,
0.03162920102477074,
0.009222624823451042,
0.07299722731113434,
0.1210850402712822,
-0.044137630611658096,
0.14962734282016754,
0.002352055860683322,
-0.02412610873579979,
-0.02492465451359749,
-0.02056683413684368,
0.0016194129129871726,
-0.1465490311384201,
-0.009069330990314484,
-0.0647265687584877,
0.1291487067937851,
0.19886578619480133,
-0.04627503454685211,
-0.020293334499001503,
-0.04321688413619995,
0.05946377292275429,
-0.018226394429802895,
0.08554777503013611,
-0.0007300059660337865,
-0.16326573491096497,
-0.006231456529349089,
-0.03340297192335129,
0.006929845083504915,
-0.1758827567100525,
-0.038480136543512344,
-0.03864670544862747,
-0.03557140752673149,
-0.09205842763185501,
0.1479434072971344,
0.07049542665481567,
0.01909462921321392,
-0.04203573241829872,
-0.19347839057445526,
-0.018608560785651207,
0.05433715134859085,
-0.14216524362564087,
-0.12071877717971802
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization sql dataset.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
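The expected input is whitespace-separated sql tokens, as in the snippet below. For illustration only (this is not the authors' preprocessing, which may differ), a naive tokenizer that reproduces that format could look like:
```python
import re

def naive_sql_tokenize(sql: str) -> str:
    # Separate keywords/identifiers from punctuation with single spaces.
    return " ".join(re.findall(r"\w+|[^\w\s]", sql.lower()))

print(naive_sql_tokenize("SELECT TIME(col0) FROM tab0"))
# -> select time ( col0 ) from tab0
```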
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql", skip_special_tokens=True),
    device=0  # first CUDA device; set device=-1 to run on CPU
)

tokenized_code = "select time ( col0 ) from tab0"  # whitespace-tokenized sql query
pipeline([tokenized_code])  # returns a list with one generated summary per input
```
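Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. A minimal equivalent sketch using `AutoModelForSeq2SeqLM` (the current class for encoder-decoder checkpoints such as T5), assuming a recent `transformers` with PyTorch installed:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "SEBIS/code_trans_t5_base_source_code_summarization_sql"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

inputs = tokenizer("select time ( col0 ) from tab0", return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```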
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/source%20code%20summarization/sql/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
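The scores above are corpus-level BLEU over generated summaries. A minimal sketch of computing such a score with the `sacrebleu` package (toy data; the authors' exact evaluation setup is not specified in this card):
```python
import sacrebleu

# Toy example: one hypothesis per test item and one reference stream.
hypotheses = ["select the time of col0 from tab0"]
references = [["select the time value from tab0"]]

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {score.score:.2f}")
```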
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_sql
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used single-task training on the source code summarization sql dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
115
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11829722672700882,
0.03842651844024658,
-0.00021651657880283892,
0.046913787722587585,
0.16030675172805786,
0.015879150480031967,
0.08486726880073547,
0.03700187802314758,
0.00416232505813241,
-0.04853709414601326,
0.08114942163228989,
0.13262304663658142,
0.02861112914979458,
0.15516117215156555,
-0.03163391351699829,
-0.1800098568201065,
-0.0020905560813844204,
0.04391103982925415,
-0.18841134011745453,
0.12594591081142426,
0.11647581309080124,
-0.05482039600610733,
0.09630665183067322,
0.004475155845284462,
-0.20419366657733917,
0.07166168838739395,
0.010242474265396595,
-0.06805887073278427,
0.12498646229505539,
0.10805126279592514,
0.1092548593878746,
0.03596363961696625,
0.0300209429115057,
-0.22787117958068848,
0.0390714593231678,
-0.048660989850759506,
-0.010741882026195526,
0.043930601328611374,
0.032265111804008484,
-0.058232296258211136,
0.14255808293819427,
-0.026857133954763412,
0.007099861279129982,
0.06671138107776642,
-0.10708115249872208,
-0.05727280676364899,
-0.03303803130984306,
0.01070262212306261,
0.07357186824083328,
0.08106829226016998,
0.016692042350769043,
0.1195385530591011,
-0.14274609088897705,
0.11651401966810226,
0.09003540873527527,
-0.19755370914936066,
-0.014942756853997707,
0.1281273514032364,
0.062028802931308746,
-0.08303071558475494,
-0.04293094575405121,
0.022469354793429375,
0.05164765566587448,
0.00024002099235076457,
0.02062946744263172,
-0.12695597112178802,
-0.1391305774450302,
0.0868520587682724,
-0.07604383677244186,
-0.06964045763015747,
0.28158295154571533,
0.009662375785410404,
-0.045655667781829834,
-0.03306029736995697,
-0.03598611429333687,
0.0433080680668354,
-0.01402506697922945,
0.008759449236094952,
0.011450008489191532,
0.00011384839308448136,
-0.05178261920809746,
-0.02834155037999153,
-0.1116379052400589,
-0.12139821797609329,
-0.029398461803793907,
0.08647240698337555,
-0.011114743538200855,
0.03691980615258217,
-0.13870638608932495,
0.0928826555609703,
0.0631256103515625,
-0.05788680538535118,
0.01190720684826374,
-0.07038760185241699,
-0.0897190123796463,
-0.01578647643327713,
-0.08128087222576141,
-0.16198815405368805,
0.09967046976089478,
0.10545267164707184,
-0.0370703786611557,
0.05914614722132683,
0.06216214969754219,
0.053765516728162766,
0.05505279079079628,
0.21443305909633636,
-0.006383453030139208,
-0.07330576330423355,
0.06461044400930405,
-0.043276574462652206,
-0.051872726529836655,
0.01746547780930996,
-0.08823616057634354,
-0.038788482546806335,
-0.010805505327880383,
0.12158598750829697,
-0.09960625320672989,
0.09076502919197083,
-0.07458839565515518,
-0.03915642946958542,
-0.004911077208817005,
-0.13379327952861786,
-0.023959225043654442,
0.02549041621387005,
-0.05772070586681366,
-0.047919198870658875,
0.10989189147949219,
-0.05855097249150276,
-0.10052715986967087,
-0.04376000910997391,
-0.08082912117242813,
-0.015506729483604431,
-0.09676385670900345,
-0.08216874301433563,
0.008927157148718834,
0.05709436535835266,
0.07025090605020523,
-0.11678779125213623,
-0.19407954812049866,
-0.0033237291499972343,
0.07065817713737488,
0.014159642159938812,
0.02603180892765522,
-0.08748308569192886,
0.0015902919694781303,
-0.02430981956422329,
-0.023820383474230766,
0.01256253756582737,
-0.078548863530159,
0.06726106256246567,
0.06973282992839813,
0.02950628660619259,
-0.10436010360717773,
0.042290907353162766,
-0.12699422240257263,
0.06225541979074478,
-0.14145302772521973,
0.12192564457654953,
-0.06007522717118263,
0.12610898911952972,
-0.1119009405374527,
-0.037757303565740585,
0.03670615702867508,
0.06251297146081924,
0.061274006962776184,
0.15017090737819672,
-0.14064696431159973,
-0.06158652901649475,
0.14910577237606049,
-0.10843785852193832,
-0.20418643951416016,
0.07068506628274918,
-0.07403788715600967,
0.1715259999036789,
0.07870911806821823,
0.16834425926208496,
0.1331547051668167,
-0.0525188148021698,
0.0352608859539032,
0.10095491260290146,
-0.04958346486091614,
-0.07888798415660858,
0.07661689072847366,
0.05927696079015732,
-0.13738979399204254,
0.05908111855387688,
-0.004625466652214527,
0.11700677871704102,
-0.040296416729688644,
-0.05769098922610283,
-0.013221990317106247,
-0.06714525818824768,
0.023893633857369423,
-0.010061545297503471,
0.08888985961675644,
0.018256481736898422,
-0.0007831378607079387,
0.09293082356452942,
0.09762439131736755,
-0.10373040288686752,
-0.0040612053126096725,
-0.11413132399320602,
0.0373632088303566,
-0.08887128531932831,
0.03351861983537674,
-0.21028250455856323,
-0.05860935524106026,
-0.007217642851173878,
0.004960246849805117,
0.07235024124383926,
0.022304009646177292,
0.005745750851929188,
-0.008764287456870079,
0.00522166071459651,
0.005884324200451374,
0.02494386024773121,
-0.016860704869031906,
-0.044165197759866714,
-0.09825918078422546,
-0.05519796535372734,
-0.04478118196129799,
0.05028210207819939,
-0.1712959110736847,
0.004881078843027353,
0.04523194953799248,
0.06315930932760239,
0.001990579767152667,
0.028070155531167984,
0.042894259095191956,
0.05585221201181412,
-0.04208378493785858,
-0.005775331519544125,
0.059461791068315506,
0.025692719966173172,
-0.16555200517177582,
0.04908646270632744,
-0.03706963732838631,
0.059625230729579926,
0.11135679483413696,
-0.11885527521371841,
-0.05066312104463577,
-0.10078447312116623,
-0.029182353988289833,
-0.018400244414806366,
0.024671606719493866,
-0.029101543128490448,
0.18331430852413177,
0.007642396260052919,
0.16065789759159088,
-0.09571867436170578,
-0.03485431522130966,
-0.0443446971476078,
-0.010997399687767029,
0.028961794450879097,
0.14281955361366272,
0.08991831541061401,
-0.2558926045894623,
0.07096008211374283,
0.06767655909061432,
-0.005145180504769087,
0.17248159646987915,
-0.06186862662434578,
-0.04048556834459305,
-0.027384206652641296,
0.03910141438245773,
-0.032439108937978745,
0.15845637023448944,
-0.18130366504192352,
-0.029263406991958618,
0.01528235524892807,
-0.020150477066636086,
0.09936005622148514,
-0.11418841034173965,
-0.002603216329589486,
0.018912093713879585,
-0.04195120930671692,
-0.1668998897075653,
0.03700386732816696,
0.01358980406075716,
0.021651044487953186,
0.00541476346552372,
0.019831426441669464,
0.04492863640189171,
-0.04426112398505211,
-0.13602522015571594,
0.24678131937980652,
-0.08532845973968506,
-0.25677403807640076,
-0.18784405291080475,
0.04229454696178436,
-0.021627675741910934,
0.0006328414310701191,
0.053784556686878204,
-0.05367497354745865,
-0.05528825521469116,
-0.01798824593424797,
0.13712427020072937,
-0.025857627391815186,
-0.024568526074290276,
-0.03710633143782616,
0.06374885141849518,
0.009723182767629623,
-0.17664623260498047,
-0.0005093719810247421,
-0.007588368374854326,
0.03944087401032448,
-0.0012668616836890578,
-0.14840222895145416,
0.12292996048927307,
0.0813545510172844,
-0.03637685999274254,
0.037973519414663315,
-0.0498347133398056,
0.26361244916915894,
-0.056501179933547974,
-0.0906873270869255,
0.13350465893745422,
-0.10293067991733551,
0.008398809470236301,
0.048481207340955734,
0.016088571399450302,
-0.10712913423776627,
0.03460795432329178,
-0.03859930485486984,
-0.06389474868774414,
-0.26799315214157104,
-0.10260865092277527,
-0.09058967232704163,
0.06498217582702637,
0.013521556742489338,
0.02563381753861904,
-0.10557302832603455,
0.06585878133773804,
0.0661914199590683,
0.10014764219522476,
-0.017905419692397118,
0.05575432628393173,
0.11139120906591415,
0.016640931367874146,
0.008240618743002415,
-0.10981829464435577,
-0.03593868389725685,
0.04829929769039154,
0.08733956515789032,
0.1769423633813858,
0.019545182585716248,
0.1862316131591797,
0.06018378958106041,
0.04163108766078949,
0.040352191776037216,
0.15650039911270142,
-0.07721835374832153,
0.017279010266065598,
-0.007932126522064209,
-0.01734810508787632,
-0.1545836180448532,
0.043805789202451706,
-0.015977805480360985,
0.02976190112531185,
-0.13426180183887482,
-0.031639520078897476,
0.06895886361598969,
0.12831230461597443,
-0.007278642617166042,
-0.24467375874519348,
-0.1374540776014328,
0.03148767724633217,
-0.08499220758676529,
-0.04011142998933792,
0.04047727584838867,
0.09326048195362091,
-0.13410557806491852,
0.019033227115869522,
-0.03508099168539047,
0.16820840537548065,
-0.052276454865932465,
0.02989039570093155,
-0.049684297293424606,
-0.03280544653534889,
0.012405088171362877,
0.15128400921821594,
-0.20166286826133728,
0.2363375872373581,
0.017349576577544212,
0.007225213572382927,
-0.048102788627147675,
0.028808610513806343,
0.011469417251646519,
0.0752493366599083,
0.11167161911725998,
-0.021200228482484818,
-0.0971967875957489,
-0.16326940059661865,
0.008456860668957233,
0.0931185781955719,
0.05209751054644585,
-0.029474740847945213,
0.07572066783905029,
-0.03103615716099739,
0.02475815638899803,
-0.005703955888748169,
-0.03999249264597893,
-0.08173160254955292,
-0.12588509917259216,
-0.023661533370614052,
-0.05359738692641258,
0.06661761552095413,
-0.026757637038826942,
0.0203532837331295,
0.05938243865966797,
0.18204446136951447,
-0.08308285474777222,
-0.06319709867238998,
-0.11963896453380585,
0.08218274265527725,
0.10937228053808212,
-0.09393710643053055,
0.03491795435547829,
-0.00646540243178606,
0.02277163788676262,
0.017819253727793694,
-0.15734711289405823,
0.07207200676202774,
-0.06885375082492828,
0.013265975750982761,
-0.03111337684094906,
0.10335204750299454,
-0.011555634438991547,
0.010729532688856125,
0.06061539426445961,
-0.06042034924030304,
-0.05380954220890999,
-0.13806171715259552,
-0.09283149242401123,
-0.05604724958539009,
0.030908072367310524,
0.0952671617269516,
-0.14124688506126404,
-0.01450105756521225,
-0.0016093408921733499,
-0.0233540590852499,
0.241264209151268,
0.10942473262548447,
-0.04355229437351227,
0.0326835997402668,
0.1142830103635788,
-0.09814591705799103,
-0.2647928297519684,
-0.004669151268899441,
-0.0217368695884943,
0.08416146039962769,
0.027059555053710938,
-0.15845398604869843,
0.10816770792007446,
-0.012596908025443554,
0.03803027421236038,
-0.026298675686120987,
-0.243617981672287,
-0.10528511554002762,
0.11448340862989426,
0.10245974361896515,
0.09624851495027542,
-0.12103114277124405,
-0.06682049483060837,
-0.09636674076318741,
-0.1834440678358078,
0.15742160379886627,
-0.09683360159397125,
0.09762654453516006,
0.003862344194203615,
0.05575565621256828,
0.02166249230504036,
-0.047623321413993835,
0.10140487551689148,
0.029182350262999535,
0.08329612016677856,
-0.007293634582310915,
-0.08912104368209839,
0.12947094440460205,
-0.01913020946085453,
0.12466736882925034,
-0.0797765925526619,
0.0933506116271019,
-0.2038925439119339,
-0.04152747243642807,
-0.023129165172576904,
0.04053875803947449,
-0.005440902896225452,
-0.04599563404917717,
-0.060036879032850266,
0.01787281036376953,
0.040071386843919754,
0.011329312808811665,
0.10598480701446533,
-0.052917756140232086,
0.007695822510868311,
0.11758091300725937,
0.15713311731815338,
-0.010999813675880432,
-0.06649681180715561,
0.04458734765648842,
0.01359573844820261,
0.08949335664510727,
-0.21246963739395142,
0.09044252336025238,
0.12322520464658737,
0.032091379165649414,
0.10575101524591446,
0.07862760126590729,
-0.015333008021116257,
0.05188211798667908,
0.08504267036914825,
-0.11767743527889252,
-0.10047411918640137,
-0.05510722100734711,
-0.07915439456701279,
-0.023117223754525185,
0.080081045627594,
0.1422179639339447,
-0.025845186784863472,
-0.014389541000127792,
-0.0023590426426380873,
-0.03780951723456383,
-0.1215401366353035,
0.14300322532653809,
0.03153657168149948,
0.06563234329223633,
-0.09830976277589798,
0.07852987945079803,
0.04040587693452835,
-0.12193536758422852,
-0.0310130026191473,
0.10443645715713501,
-0.14719997346401215,
-0.07452931255102158,
-0.02997920848429203,
0.23143692314624786,
-0.14913950860500336,
-0.07667907327413559,
-0.14651228487491608,
-0.078758105635643,
0.010807694867253304,
0.236646831035614,
0.10718165338039398,
0.0889652669429779,
-0.03940873220562935,
-0.007296252995729446,
-0.05879809707403183,
0.05839601159095764,
0.10163634270429611,
-0.0041090878657996655,
-0.0910852923989296,
0.0577453076839447,
-0.014358656480908394,
0.1493002027273178,
-0.05744972825050354,
-0.03296114504337311,
-0.1912446916103363,
0.07773200422525406,
-0.16558730602264404,
0.06792309135198593,
-0.07092484086751938,
0.030273333191871643,
0.007754888851195574,
-0.011420176364481449,
-0.040883880108594894,
0.04616689309477806,
-0.0915130078792572,
0.0144486790522933,
0.0033022852148860693,
0.07649587839841843,
-0.07889878004789352,
-0.0006349478499032557,
0.08054223656654358,
-0.04668800160288811,
0.11037776619195938,
0.06070994585752487,
-0.049468375742435455,
0.11947134882211685,
-0.1879909336566925,
-0.03328140452504158,
0.03891817480325699,
0.03376692160964012,
0.048136256635189056,
-0.03508726507425308,
0.0414070263504982,
0.036247942596673965,
0.03242132440209389,
-0.005385610740631819,
0.1043878123164177,
-0.11118410527706146,
-0.10123083740472794,
-0.033323559910058975,
-0.08760129660367966,
-0.04193156212568283,
0.00880185142159462,
0.05048687011003494,
0.09090224653482437,
0.11201085150241852,
-0.03220589458942413,
0.024006307125091553,
-0.09183540940284729,
-0.01221628487110138,
0.034645963460206985,
-0.07442989945411682,
-0.11590355634689331,
-0.10588983446359634,
0.03827674686908722,
-0.04435478150844574,
0.22904284298419952,
-0.02053580991923809,
0.14640624821186066,
0.005178721621632576,
0.0019967106636613607,
0.06302136182785034,
0.05660305544734001,
0.24719369411468506,
-0.002132234163582325,
0.04887491837143898,
-0.026861349120736122,
0.07581521570682526,
0.006311976350843906,
0.05798853561282158,
0.10226031392812729,
0.10030987858772278,
-0.011891862377524376,
0.12008114159107208,
0.02273138426244259,
0.050589751452207565,
-0.04299650341272354,
-0.0749303475022316,
0.043160781264305115,
0.03984306752681732,
-0.045757878571748734,
0.09421813488006592,
0.10284147411584854,
-0.1030687615275383,
0.09776884317398071,
-0.002918679267168045,
-0.09482073783874512,
-0.06906962394714355,
-0.010190465487539768,
-0.05991531163454056,
-0.13839015364646912,
0.0065883370116353035,
-0.12078497558832169,
-0.06394947320222855,
0.07546941936016083,
0.021950291469693184,
-0.04450714960694313,
0.18200364708900452,
0.0015386578161269426,
-0.06982533633708954,
0.028146304190158844,
-0.015272257849574089,
0.007751172874122858,
-0.0003817662945948541,
0.06689900159835815,
0.010579348541796207,
-0.019061345607042313,
-0.003154247533529997,
0.0455450564622879,
-0.02648351341485977,
0.013747666031122208,
-0.0745384693145752,
-0.023078184574842453,
-0.05481547862291336,
0.07019662857055664,
-0.013170265592634678,
0.026285067200660706,
0.023473674431443214,
-0.030349858105182648,
-0.005687909200787544,
0.2150130569934845,
-0.04589614272117615,
-0.07440610229969025,
-0.1544029712677002,
0.21345238387584686,
0.02844688855111599,
0.060328368097543716,
0.011568128131330013,
-0.07491638511419296,
-0.011396370828151703,
0.2748085856437683,
0.2129025012254715,
-0.08098213374614716,
0.0005707279196940362,
0.003831156063824892,
0.01715988479554653,
0.01242441963404417,
0.13482776284217834,
0.027350395917892456,
0.21652741730213165,
-0.03578706830739975,
-0.0789879783987999,
-0.03710184246301651,
-0.061368755996227264,
0.036028169095516205,
0.13490617275238037,
0.01975533366203308,
-0.04109237715601921,
-0.046221669763326645,
0.10230779647827148,
-0.18157435953617096,
-0.12751029431819916,
0.020603686571121216,
-0.1403474509716034,
-0.07644367218017578,
-0.072187639772892,
0.01705292984843254,
0.008901124820113182,
0.043694064021110535,
-0.04632682353258133,
-0.008730996400117874,
0.04664861783385277,
0.02400340512394905,
-0.16809752583503723,
-0.09858894348144531,
0.0889122411608696,
-0.02541540004312992,
0.12597210705280304,
-0.027750760316848755,
0.11111492663621902,
0.09947217255830765,
0.0505172498524189,
-0.03235141932964325,
0.013664277270436287,
0.062261685729026794,
-0.009547875262796879,
0.06400828808546066,
0.028633037582039833,
-0.030017700046300888,
0.1701994240283966,
-0.04544125869870186,
-0.10791914165019989,
0.050646908581256866,
0.001437083468772471,
-0.02348765730857849,
-0.1089029386639595,
-0.019147545099258423,
-0.10271992534399033,
0.09451586753129959,
0.16427990794181824,
-0.04486991837620735,
0.032613612711429596,
-0.07086348533630371,
0.12433641403913498,
0.00684031518176198,
-0.01712827943265438,
-0.07416309416294098,
-0.14459937810897827,
-0.005631803534924984,
0.018140306696295738,
-0.03338248282670975,
-0.2065586894750595,
-0.017709020525217056,
-0.0548117496073246,
0.011965930461883545,
-0.023706629872322083,
0.1385994553565979,
0.13239595293998718,
0.037758421152830124,
-0.027836669236421585,
-0.16798394918441772,
-0.01164190098643303,
0.05924508720636368,
-0.11205017566680908,
-0.15086260437965393
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/sql/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
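The schedule constants are not given in this card; the inverse square root form used for T5 pre-training (with an assumed warm-up threshold of 10,000 steps) can be sketched as:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant LR during warm-up, then decay proportional to 1/sqrt(step).
    return 1.0 / (max(step, warmup_steps) ** 0.5)

# Learning rate at a few points of a 500,000-step run.
for step in (1, 10_000, 100_000, 500_000):
    print(step, inverse_sqrt_lr(step))
```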
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
146
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14027753472328186,
0.017429374158382416,
-0.0009911966044455767,
0.13499858975410461,
0.1154983714222908,
0.016824904829263687,
0.06534715741872787,
0.04692898318171501,
-0.029549721628427505,
0.02224586345255375,
0.046579185873270035,
0.012243643403053284,
0.04683276265859604,
0.2019985467195511,
0.015482104383409023,
-0.14956004917621613,
-0.017517710104584694,
0.01986844278872013,
-0.0813739001750946,
0.12285575270652771,
0.0967540442943573,
-0.07606319338083267,
0.05061240866780281,
-0.043952617794275284,
-0.20714177191257477,
0.06284740567207336,
0.0031081431079655886,
-0.0589151605963707,
0.10466274619102478,
0.06619575619697571,
0.12831714749336243,
-0.010245745070278645,
0.04326457530260086,
-0.13453933596611023,
0.008869955316185951,
0.019803708419203758,
0.03721171244978905,
0.02276112698018551,
0.06526400148868561,
0.03719713166356087,
0.13354042172431946,
-0.007663562893867493,
0.03360768407583237,
0.06460370123386383,
-0.06765631586313248,
-0.09603038430213928,
-0.028073493391275406,
0.030421823263168335,
0.053859345614910126,
0.09613381326198578,
-0.010702102445065975,
0.09619376063346863,
-0.15196539461612701,
0.12051790207624435,
0.08434391766786575,
-0.2518724203109741,
-0.011576535180211067,
0.09477530419826508,
0.0553920604288578,
0.06206725165247917,
-0.04505838453769684,
-0.05031673237681389,
0.08054354786872864,
0.053792208433151245,
0.040249645709991455,
-0.08338870108127594,
-0.08112791180610657,
0.033409491181373596,
-0.0948452427983284,
-0.0737370178103447,
0.2191774994134903,
0.011816432699561119,
-0.07702174037694931,
-0.052267640829086304,
-0.041416291147470474,
-0.11582725495100021,
0.03618989512324333,
0.03768886625766754,
0.004938732832670212,
-0.0184023454785347,
0.0014061796246096492,
0.02469477616250515,
-0.0955738052725792,
-0.14138129353523254,
0.007565842475742102,
0.07686246186494827,
0.04735148698091507,
0.03886235132813454,
-0.06765099614858627,
0.10978691279888153,
0.018612438812851906,
-0.03980781137943268,
-0.014544316567480564,
-0.028923524543642998,
-0.12945730984210968,
0.04085998609662056,
-0.06340949982404709,
-0.1826658993959427,
0.0003523191262502223,
0.013843064196407795,
-0.03434735909104347,
0.05715816840529442,
0.052886974066495895,
0.026830390095710754,
0.032884806394577026,
0.197710782289505,
0.0235142782330513,
-0.11463363468647003,
0.05529850721359253,
0.04063941910862923,
-0.04063798859715462,
-0.011858461424708366,
-0.06036888435482979,
-0.0783182829618454,
0.05806588754057884,
0.09392217546701431,
-0.13462749123573303,
0.046685781329870224,
-0.06949954479932785,
-0.04295305907726288,
-0.016431014984846115,
-0.16742557287216187,
-0.00206454424187541,
0.02493140660226345,
-0.06642469018697739,
-0.0544268861413002,
0.09191326051950455,
-0.15434369444847107,
-0.1506546437740326,
-0.031380683183670044,
-0.08061233907938004,
-0.04639047756791115,
-0.1583338975906372,
-0.14232878386974335,
-0.019164880737662315,
-0.03626493364572525,
0.004610945004969835,
-0.0696660727262497,
-0.1735762059688568,
-0.018438272178173065,
0.025698263198137283,
0.0045668017119169235,
-0.004360487684607506,
-0.07504231482744217,
0.0011246533831581473,
-0.01667976565659046,
-0.038226790726184845,
0.00681843888014555,
-0.046807777136564255,
0.11022160947322845,
0.09753921627998352,
0.030281420797109604,
-0.03470398485660553,
0.048652201890945435,
-0.07806512713432312,
0.06649250537157059,
-0.10544638335704803,
0.11841636896133423,
-0.07035898417234421,
0.07271783798933029,
-0.042713019996881485,
-0.09695352613925934,
0.042020123451948166,
0.0572851225733757,
0.07105078548192978,
0.06044177711009979,
-0.16848452389240265,
-0.0267464742064476,
0.1894923746585846,
-0.12387114763259888,
-0.10643554478883743,
0.10901419073343277,
-0.045114170759916306,
0.051244914531707764,
0.09074956178665161,
0.1350538581609726,
0.1415490061044693,
-0.020904306322336197,
0.00361806177534163,
0.06355097889900208,
0.03811107575893402,
-0.11275486648082733,
0.083425372838974,
0.04458000883460045,
-0.09323518723249435,
0.06445455551147461,
-0.004846927709877491,
0.09627104550600052,
-0.014093291945755482,
-0.037567198276519775,
-0.05004890263080597,
-0.07482758909463882,
-0.002211696468293667,
-0.0018511335365474224,
0.06413783133029938,
-0.06501338630914688,
-0.06396157294511795,
0.07207122445106506,
0.15509572625160217,
-0.11422494053840637,
-0.010109982453286648,
-0.08995401859283447,
0.03228883817791939,
-0.06145825609564781,
0.01863873563706875,
-0.17109975218772888,
0.016118768602609634,
0.06341245025396347,
-0.029136551544070244,
0.07156050205230713,
0.12109550088644028,
0.013832531869411469,
0.05328023433685303,
0.005202263593673706,
-0.005845972336828709,
-0.09282837063074112,
-0.05820484086871147,
-0.07108043879270554,
-0.059407465159893036,
-0.09614820033311844,
-0.045479338616132736,
-0.0029044244438409805,
-0.18519863486289978,
0.013628044165670872,
0.012020681984722614,
0.019076406955718994,
0.0033926137257367373,
-0.01641038991510868,
0.023074155673384666,
0.0666620209813118,
-0.05355878546833992,
-0.03378141671419144,
0.025448696687817574,
0.023654509335756302,
-0.07524831593036652,
-0.0531185008585453,
-0.08567875623703003,
0.004996585659682751,
0.10658279061317444,
0.053521621972322464,
-0.07293001562356949,
-0.0003988468088209629,
-0.019753824919462204,
-0.037193723022937775,
0.017189156264066696,
-0.06799949705600739,
0.1657085418701172,
0.0028920297045260668,
0.19463372230529785,
-0.1496220976114273,
-0.026121852919459343,
-0.027753960341215134,
0.024159682914614677,
0.05457158386707306,
0.13448278605937958,
0.014888161793351173,
-0.13144677877426147,
0.06457364559173584,
0.004300489090383053,
-0.06741825491189957,
0.21211379766464233,
-0.048533715307712555,
-0.08963393419981003,
0.013085120357573032,
0.09552151709794998,
-0.024926140904426575,
0.16179418563842773,
-0.16489045321941376,
-0.021044103428721428,
0.01179916225373745,
0.006889942102134228,
0.06633615493774414,
-0.13436570763587952,
0.010063224472105503,
0.01833052933216095,
-0.0701812133193016,
-0.12058901786804199,
-0.027894869446754456,
-0.003154858248308301,
0.03535700589418411,
-0.0009467290947213769,
0.002528758253902197,
0.01599741168320179,
-0.04527377709746361,
-0.1065734401345253,
0.22374065220355988,
-0.10897687077522278,
-0.23543556034564972,
-0.20384493470191956,
0.11966204643249512,
-0.053988244384527206,
0.0008909939788281918,
0.023624740540981293,
-0.08767388761043549,
-0.06466202437877655,
-0.057343628257513046,
0.17403307557106018,
-0.07471238076686859,
-0.009791904129087925,
-0.021117376163601875,
0.06335587054491043,
0.02411901392042637,
-0.20479269325733185,
0.03700589761137962,
-0.01653992384672165,
-0.017683174461126328,
-0.009477226994931698,
-0.08679400384426117,
0.08561117947101593,
0.144571915268898,
-0.06008828058838844,
0.019855495542287827,
-0.005726571194827557,
0.21310307085514069,
-0.027990998700261116,
-0.06084059923887253,
0.12790337204933167,
-0.0097154900431633,
0.01018684171140194,
0.028947504237294197,
0.0033054917585104704,
-0.08970674872398376,
0.051998578011989594,
0.005985787138342857,
-0.028865525498986244,
-0.27278587222099304,
-0.025130968540906906,
-0.0859353318810463,
0.04497579485177994,
0.03341922536492348,
0.044946733862161636,
-0.11019794642925262,
0.03151201084256172,
0.043170806020498276,
0.12642541527748108,
-0.01687837764620781,
0.03249679133296013,
0.07276084274053574,
-0.004690844099968672,
0.018184073269367218,
-0.09608447551727295,
0.0023327646777033806,
0.08523751050233841,
0.1000722125172615,
0.26796916127204895,
-0.0972214788198471,
0.20266281068325043,
0.03718603029847145,
0.06753196567296982,
0.0462280698120594,
0.15174232423305511,
-0.10641378164291382,
0.03242798149585724,
-0.0010943780653178692,
-0.008171215653419495,
-0.13568347692489624,
0.02853868342936039,
-0.031151248142123222,
0.0696190595626831,
-0.11014609038829803,
-0.02541802078485489,
0.010974517092108727,
0.18174292147159576,
0.03153357282280922,
-0.21753454208374023,
-0.12639105319976807,
0.02163640223443508,
-0.101164311170578,
-0.09430251270532608,
0.05814792215824127,
0.22366678714752197,
-0.0779435783624649,
-0.019799843430519104,
-0.013279606588184834,
0.13037371635437012,
-0.036922208964824677,
-0.021590853109955788,
-0.04306412860751152,
0.0722225159406662,
0.018053315579891205,
0.13567309081554413,
-0.24611353874206543,
0.14654631912708282,
0.0006457299459725618,
0.06148681789636612,
-0.0358407087624073,
0.06582416594028473,
-0.034570444375276566,
0.044487230479717255,
0.04916656017303467,
-0.007276305928826332,
-0.03964091092348099,
-0.1806032508611679,
-0.0054733771830797195,
0.03672336786985397,
0.043042007833719254,
0.04235944524407387,
0.07157205045223236,
-0.006199982482939959,
0.046040937304496765,
-0.008331721648573875,
-0.11900781840085983,
-0.07388594001531601,
-0.1078694611787796,
-0.015872200950980186,
-0.04502471163868904,
-0.02359408512711525,
-0.05572284013032913,
-0.007121379021555185,
0.07773500680923462,
0.18752357363700867,
-0.11763063818216324,
-0.08607716113328934,
-0.08863991498947144,
0.07315952330827713,
0.13242384791374207,
-0.07818058878183365,
0.05773294344544411,
-0.004723081365227699,
0.03438686579465866,
0.006860844325274229,
-0.09207116067409515,
0.05582146719098091,
-0.03833019360899925,
-0.05874134600162506,
-0.025759004056453705,
0.09621559828519821,
0.005139193031936884,
0.027262628078460693,
0.0019244914874434471,
-0.0794186070561409,
-0.04272369667887688,
-0.12571632862091064,
-0.10911297053098679,
-0.03837260603904724,
0.00006359419785439968,
0.06980156153440475,
-0.1382620632648468,
-0.05185770243406296,
-0.0021215754095464945,
-0.030509963631629944,
0.14281170070171356,
0.14481215178966522,
-0.06969135999679565,
0.03994398191571236,
0.12100610882043839,
-0.05814225599169731,
-0.1861613541841507,
0.010087980888783932,
0.0642123818397522,
0.11951274424791336,
-0.031171564012765884,
-0.18873168528079987,
0.05837687477469444,
0.024789873510599136,
0.03730500862002373,
0.03948201984167099,
-0.30950257182121277,
-0.12246858328580856,
0.06495819985866547,
0.1250249445438385,
0.07502727955579758,
-0.10854851454496384,
-0.04537429288029671,
-0.058040328323841095,
-0.10969310998916626,
0.0972384437918663,
-0.016050614416599274,
0.13526862859725952,
-0.03902551904320717,
0.04449646174907684,
0.03393327072262764,
-0.04723319411277771,
0.07000642269849777,
0.022765761241316795,
0.10226165503263474,
-0.030399911105632782,
0.024117084220051765,
0.12611234188079834,
-0.027290312573313713,
0.16715280711650848,
-0.13399693369865417,
0.10905696451663971,
-0.21840105950832367,
-0.0680173709988594,
-0.06689351052045822,
0.010779175907373428,
-0.032826025038957596,
-0.03737720474600792,
-0.06835135817527771,
0.02969532273709774,
0.001146903494372964,
-0.01039039809256792,
0.014220942743122578,
-0.03398016095161438,
-0.0240426417440176,
0.1072741150856018,
0.11153528094291687,
-0.005721853114664555,
-0.09207785874605179,
0.044777143746614456,
0.046674177050590515,
0.09045397490262985,
-0.18937991559505463,
0.030874105170369148,
0.11523736268281937,
0.02396801859140396,
0.11343257129192352,
0.04961463809013367,
-0.0966871827840805,
0.04542485997080803,
0.08863416314125061,
-0.08092983812093735,
-0.10265649855136871,
-0.020215176045894623,
-0.05707966536283493,
-0.07775041460990906,
0.05548551306128502,
0.0936979204416275,
-0.035477351397275925,
-0.020343756303191185,
-0.024792049080133438,
-0.027797739952802658,
-0.105147585272789,
0.20626160502433777,
0.05109662190079689,
0.08335106074810028,
-0.07086220383644104,
0.07421985268592834,
0.07828366756439209,
-0.08257836848497391,
0.008235933259129524,
0.18300758302211761,
-0.11136924475431442,
-0.04356585443019867,
0.026967616751790047,
0.16110946238040924,
-0.039671748876571655,
-0.05063653737306595,
-0.1291087567806244,
-0.08194556087255478,
0.037040166556835175,
0.15818971395492554,
0.08322243392467499,
0.10287006199359894,
-0.050503022968769073,
0.002216456225141883,
-0.07102488726377487,
0.08193635195493698,
0.08392151445150375,
0.02862728014588356,
-0.11629700660705566,
0.1373443752527237,
0.034339871257543564,
0.10991987586021423,
-0.0285947285592556,
-0.00915131438523531,
-0.11492106318473816,
0.05697320029139519,
-0.10032692551612854,
0.03377636894583702,
-0.013701964169740677,
0.0483020581305027,
-0.02152531035244465,
-0.006190963089466095,
-0.02930249087512493,
0.06535797566175461,
-0.08655869960784912,
-0.004683531820774078,
-0.005404843017458916,
0.04239679127931595,
-0.05114434286952019,
-0.009776908904314041,
0.036030251532793045,
-0.08351454883813858,
0.12390467524528503,
-0.008377142250537872,
-0.029174407944083214,
0.08278481662273407,
-0.04984142631292343,
0.03737974911928177,
0.00734344869852066,
0.05939748138189316,
0.009351570159196854,
0.035588402301073074,
0.07774128764867783,
0.03404703363776207,
0.052836522459983826,
0.008378122933208942,
0.10386388748884201,
-0.13339698314666748,
-0.10401052981615067,
-0.031160322949290276,
-0.09766919910907745,
-0.062254250049591064,
0.0887976661324501,
0.06356024742126465,
0.09885962307453156,
0.09934485703706741,
-0.03091418370604515,
0.011982288211584091,
-0.14236094057559967,
-0.05379464849829674,
0.0268399640917778,
-0.029704255983233452,
-0.09754586219787598,
-0.06372331082820892,
0.0485931932926178,
-0.02555590122938156,
0.1457364857196808,
0.0029028572607785463,
0.05096251517534256,
-0.021588826552033424,
-0.046943530440330505,
0.029288243502378464,
0.0298989899456501,
0.22906331717967987,
-0.06401348114013672,
0.0348597951233387,
0.003912750631570816,
0.010767122730612755,
-0.007193422876298428,
0.1172795370221138,
0.1208643689751625,
0.14420652389526367,
-0.013823180459439754,
0.0991114005446434,
0.021331356838345528,
-0.009727689437568188,
-0.09613995254039764,
0.01864108070731163,
0.002239033579826355,
0.06501252204179764,
-0.06762342900037766,
0.17048931121826172,
0.06500782072544098,
-0.10962620377540588,
0.11116643249988556,
0.018395090475678444,
-0.12774915993213654,
-0.046758946031332016,
-0.00034417875576764345,
-0.030627883970737457,
-0.13380590081214905,
0.023821892216801643,
-0.12530454993247986,
-0.013925260864198208,
0.07631316035985947,
0.04617244005203247,
-0.06610636413097382,
0.1686568409204483,
0.025783155113458633,
-0.05862165614962578,
0.04407497122883797,
0.005071717314422131,
0.030627712607383728,
0.036808080971241,
0.019056929275393486,
0.042267970740795135,
-0.023637957870960236,
0.03980756178498268,
0.019053928554058075,
-0.02293889783322811,
-0.004951550625264645,
-0.011277002282440662,
0.007529529742896557,
-0.031102370470762253,
0.025259964168071747,
0.04634302109479904,
0.153022900223732,
0.029369212687015533,
-0.07666720449924469,
-0.03415711969137192,
0.17571061849594116,
-0.04609210789203644,
-0.08044152706861496,
-0.13575716316699982,
0.175011545419693,
0.034770842641592026,
0.02278982847929001,
0.020236756652593613,
-0.09436408430337906,
-0.04054979607462883,
0.1998107135295868,
0.08507916331291199,
-0.03846168518066406,
-0.0285955760627985,
-0.004720898345112801,
-0.005176389589905739,
-0.044320568442344666,
0.1987575888633728,
0.02253282628953457,
0.25016507506370544,
0.005181180313229561,
-0.00536038912832737,
-0.05938039347529411,
-0.03564729169011116,
-0.007103485055267811,
0.14860467612743378,
-0.045718129724264145,
-0.019226696342229843,
-0.08305970579385757,
0.00773362722247839,
-0.003796384437009692,
-0.10081103444099426,
0.07179134339094162,
-0.12507736682891846,
-0.09819518029689789,
-0.03584562614560127,
0.03722788020968437,
-0.03376598283648491,
0.023228654637932777,
-0.035849519073963165,
0.0483153872191906,
0.06602292507886887,
-0.025777606293559074,
-0.11698172241449356,
-0.1561298966407776,
0.09941203892230988,
-0.04779454320669174,
0.1480865478515625,
-0.011243357323110104,
0.12447769194841385,
0.09395819902420044,
0.04186229035258293,
-0.05710955336689949,
0.09418869018554688,
0.03453131392598152,
0.03537123650312424,
0.047000035643577576,
0.1096937283873558,
-0.04554801806807518,
0.16894672811031342,
-0.055260904133319855,
-0.033893827348947525,
-0.008112863637506962,
-0.05776333808898926,
-0.01589343510568142,
-0.15718793869018555,
-0.0034207587596029043,
-0.1048792377114296,
0.10362710803747177,
0.1932407170534134,
-0.04287819564342499,
-0.019807305186986923,
-0.09637989848852158,
0.09942284226417542,
-0.022931039333343506,
0.0591498538851738,
-0.03488939255475998,
-0.1849362701177597,
0.005334095098078251,
0.008341463282704353,
0.009554889053106308,
-0.2543010711669922,
-0.017910797148942947,
-0.03767841309309006,
-0.018716901540756226,
-0.06465916335582733,
0.15941867232322693,
0.09316409379243851,
0.051842328161001205,
-0.03514527902007103,
-0.15085414052009583,
-0.03430284559726715,
0.05532820150256157,
-0.12616506218910217,
-0.13016903400421143
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the sql code snippets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/sql/base_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
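For reference, the pre-training and fine-tuning regimes described above differ only in hardware, step count, batch size, and data. Summarized as a plain config sketch (values taken from this card; field names chosen for illustration):
```python
# Hyperparameters as stated in this model card; schedule details are unspecified.
pretraining = {
    "hardware": "TPU Pod V3-8",
    "steps": 500_000,
    "sequence_length": 512,
    "batch_size": 4096,
    "optimizer": "AdaFactor, inverse square root LR schedule",
}
finetuning = {
    "hardware": "TPU Pod V2-8",
    "steps": 500,
    "sequence_length": 512,
    "batch_size": 256,
    "data": "sql-only source code summarization dataset",
}
```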
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_sql_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the sql code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09034965932369232,
0.08320652693510056,
-0.0014689178206026554,
0.1074773520231247,
0.051908448338508606,
0.025665240362286568,
0.055561549961566925,
0.09316596388816833,
-0.03759489580988884,
0.06589843332767487,
0.06930754333734512,
-0.04260404780507088,
0.06699728965759277,
0.18730458617210388,
0.0272356066852808,
-0.17193791270256042,
-0.022606149315834045,
0.024118078872561455,
-0.04465861991047859,
0.10434900224208832,
0.0964234322309494,
-0.08384914696216583,
0.06505250185728073,
-0.03584280610084534,
-0.11078990250825882,
0.06320445239543915,
-0.0467417947947979,
-0.03731608763337135,
0.08678136765956879,
0.0659438967704773,
0.11301594227552414,
-0.026149366050958633,
0.07996836304664612,
-0.2201630026102066,
0.0009636599570512772,
0.015271483920514584,
0.0507032573223114,
0.023201733827590942,
0.06924227625131607,
0.06372658908367157,
0.11202927678823471,
-0.04060458391904831,
0.03431298956274986,
0.05341816693544388,
-0.05960359796881676,
-0.06484667956829071,
-0.05143098533153534,
0.07430605590343475,
0.11275863647460938,
0.08976094424724579,
-0.013879014179110527,
0.00712022976949811,
-0.08529965579509735,
0.08475058525800705,
0.11911144852638245,
-0.23181945085525513,
-0.023604247719049454,
0.09671369194984436,
0.08137483894824982,
0.057089947164058685,
-0.08147954195737839,
-0.0327736958861351,
0.09906584024429321,
0.03404861316084862,
0.046214111149311066,
-0.08327921479940414,
-0.05300334095954895,
0.0070238858461380005,
-0.05643218383193016,
-0.05548154562711716,
0.14733915030956268,
0.05055638775229454,
-0.05058074742555618,
-0.09601672738790512,
-0.04285937547683716,
-0.18809443712234497,
0.04612259566783905,
0.019627604633569717,
0.020570287480950356,
-0.010560523718595505,
0.022032583132386208,
-0.008478849194943905,
-0.10291749984025955,
-0.10349931567907333,
-0.00840695295482874,
0.06213133782148361,
0.06864406168460846,
0.026783499866724014,
-0.013220392167568207,
0.07482779026031494,
-0.009969253093004227,
-0.05028502270579338,
-0.03421309217810631,
0.01741163432598114,
-0.13971750438213348,
0.02254245989024639,
-0.023666854947805405,
-0.0675717443227768,
-0.007591591216623783,
0.10437849909067154,
-0.06221774220466614,
0.08054491877555847,
0.1295023113489151,
0.0033498089760541916,
0.002282596193253994,
0.2357805371284485,
0.04316602647304535,
-0.15009813010692596,
0.002202873583883047,
0.026706485077738762,
-0.001695285551249981,
0.000723798933904618,
-0.06542913615703583,
-0.03664493188261986,
0.0010320359142497182,
0.06062190607190132,
-0.13116878271102905,
0.012957094237208366,
-0.04643721505999565,
-0.010919674299657345,
0.0382070317864418,
-0.13621775805950165,
0.035704899579286575,
0.01541859470307827,
-0.055601101368665695,
-0.06214065104722977,
0.05842284485697746,
-0.11254407465457916,
-0.11567503213882446,
0.02622874453663826,
-0.04735046997666359,
-0.02932082861661911,
-0.12253682315349579,
-0.12003788352012634,
-0.018029944971203804,
-0.046637870371341705,
0.014515397138893604,
-0.09877878427505493,
-0.10604038834571838,
-0.011270754039287567,
0.02979884296655655,
-0.007258799858391285,
-0.01974532939493656,
-0.0556475967168808,
0.02139260433614254,
-0.004637331236153841,
-0.028752971440553665,
0.017071925103664398,
-0.04185706377029419,
0.08461073040962219,
0.10365161299705505,
0.05043669044971466,
0.012508124113082886,
0.028140563517808914,
-0.0712912306189537,
0.06595204025506973,
-0.05073564127087593,
0.07455229759216309,
-0.02435063011944294,
0.05204756185412407,
-0.09552748501300812,
-0.0736822709441185,
0.0394604466855526,
0.054653942584991455,
0.0584648996591568,
0.025621624663472176,
-0.12107353657484055,
0.016406549140810966,
0.1426958590745926,
-0.09570109099149704,
-0.12644924223423004,
0.11385704576969147,
-0.0012386587914079428,
-0.0027506796177476645,
0.06493118405342102,
0.12571072578430176,
0.14226606488227844,
-0.0857108011841774,
-0.04520358517765999,
0.0930362269282341,
0.07011033594608307,
-0.0512545071542263,
0.10261406749486923,
0.013286587782204151,
0.03892393410205841,
0.02652215026319027,
0.04340135306119919,
0.059567272663116455,
-0.0059900362975895405,
-0.042746253311634064,
-0.0251630749553442,
-0.08436289429664612,
-0.03326200693845749,
-0.017598139122128487,
0.027495360001921654,
-0.06509564071893692,
-0.06424663215875626,
0.01619136519730091,
0.1637093722820282,
-0.08938361704349518,
0.024177901446819305,
-0.09918806701898575,
-0.06420513242483139,
-0.08399943262338638,
0.014743403531610966,
-0.1076589822769165,
0.007137317210435867,
0.04498845338821411,
-0.06531240791082382,
0.07211476564407349,
0.07978943735361099,
0.0029888534918427467,
0.0319196917116642,
-0.04073745384812355,
-0.04198478162288666,
-0.04827165603637695,
-0.055358268320560455,
-0.12982217967510223,
-0.014307127334177494,
-0.0973937138915062,
-0.0268827173858881,
-0.05392168089747429,
-0.16136521100997925,
0.01690860092639923,
-0.02484646812081337,
0.024468209594488144,
-0.001687763724476099,
-0.020419195294380188,
0.030725212767720222,
0.054537929594516754,
-0.052652452141046524,
-0.0832982063293457,
0.00679309107363224,
0.0223221518099308,
-0.13068394362926483,
-0.057477809488773346,
-0.10945344716310501,
-0.04307182878255844,
0.06842261552810669,
0.09732158482074738,
-0.07555993646383286,
0.0006198639166541398,
-0.023982154205441475,
-0.0533735491335392,
-0.04665630683302879,
-0.07569529861211777,
0.17008823156356812,
0.008841235190629959,
0.1635078638792038,
-0.12914173305034637,
-0.05005504563450813,
-0.03919973224401474,
-0.011081326752901077,
0.01350303553044796,
0.156375914812088,
0.010818544775247574,
-0.09938394278287888,
0.050729330629110336,
-0.014004196040332317,
-0.058186858892440796,
0.15064597129821777,
-0.00576967466622591,
-0.09241236746311188,
0.01053033210337162,
0.09306728839874268,
-0.017981745302677155,
0.14517270028591156,
-0.07509469240903854,
-0.005685240030288696,
-0.001055260538123548,
0.02363513596355915,
0.04138590395450592,
-0.13540630042552948,
0.02681891992688179,
0.0612407922744751,
-0.07181165367364883,
-0.06939921528100967,
-0.04512837901711464,
-0.04036141186952591,
0.031676214188337326,
0.0051914663054049015,
0.011974329128861427,
-0.011763111688196659,
-0.02844848483800888,
-0.09723065048456192,
0.20672696828842163,
-0.08795087784528732,
-0.2172708809375763,
-0.17345717549324036,
0.041817937046289444,
-0.050604596734046936,
-0.0008405172266066074,
0.055315542966127396,
-0.11901064962148666,
-0.06976187974214554,
-0.09139283001422882,
0.13251131772994995,
-0.11747168004512787,
0.009214909747242928,
-0.002173598622903228,
0.030953915789723396,
0.03645751252770424,
-0.1751973181962967,
0.03330284357070923,
-0.008217845112085342,
-0.003185007954016328,
-0.00004154598718741909,
-0.05722122639417648,
0.0963088870048523,
0.11641070246696472,
-0.07312371581792831,
0.018181879073381424,
-0.010195234790444374,
0.1756681650876999,
-0.04682227596640587,
0.018979979678988457,
0.1886584311723709,
0.018085744231939316,
0.034629758447408676,
0.05451960861682892,
0.015685169026255608,
-0.09240174293518066,
0.05953039973974228,
0.05576283112168312,
-0.032400764524936676,
-0.24984629452228546,
-0.0001852210407378152,
-0.07665404677391052,
0.0358898900449276,
0.11071331799030304,
0.05238674208521843,
-0.15680719912052155,
0.02943272329866886,
-0.009523424319922924,
0.13203251361846924,
-0.03998372703790665,
0.051166415214538574,
0.022263668477535248,
0.0181867852807045,
0.014314576052129269,
-0.0958247035741806,
0.008759912103414536,
0.07732938975095749,
0.11815378069877625,
0.20454871654510498,
-0.04894094914197922,
0.203030526638031,
0.014969154261052608,
0.07262534648180008,
0.028925301507115364,
0.10114719718694687,
-0.11186975985765457,
0.0011489056050777435,
0.006464660167694092,
-0.004444390069693327,
-0.07673206925392151,
0.059788547456264496,
-0.015248289331793785,
0.06943701952695847,
-0.06253892928361893,
0.04045072942972183,
0.01956724189221859,
0.17680935561656952,
0.05734071508049965,
-0.1803465336561203,
-0.11875719577074051,
0.02401982806622982,
-0.10696899145841599,
-0.10385008901357651,
0.061184220016002655,
0.19512921571731567,
-0.04711832478642464,
0.021723760291934013,
-0.004058981779962778,
0.13879533112049103,
-0.06505630165338516,
-0.01894080825150013,
0.019030125811696053,
0.0734090730547905,
0.009875133633613586,
0.12960563600063324,
-0.2688165009021759,
0.08721216768026352,
0.018930770456790924,
0.09339044243097305,
-0.014058531261980534,
0.07309561967849731,
-0.03807607293128967,
-0.00027116562705487013,
0.07324624061584473,
-0.0003580671618692577,
-0.10325402021408081,
-0.2151309698820114,
-0.04992244020104408,
0.0334516316652298,
0.061784736812114716,
-0.017764493823051453,
0.09675852209329605,
-0.014295877888798714,
0.06322598457336426,
-0.02710261568427086,
-0.11602695286273956,
-0.061412762850522995,
-0.14198675751686096,
-0.024562105536460876,
-0.0018493738025426865,
-0.01819271594285965,
-0.030689939856529236,
0.026241980493068695,
-0.006215353962033987,
0.21455027163028717,
-0.18122275173664093,
-0.1025962382555008,
-0.09522662311792374,
0.08660391718149185,
0.12426978349685669,
-0.10399335622787476,
0.018216773867607117,
0.018940536305308342,
0.05441503971815109,
-0.03307703137397766,
-0.06056788191199303,
0.02199552021920681,
-0.05796821415424347,
-0.0726664811372757,
-0.02684568427503109,
0.09185965359210968,
-0.01904606819152832,
0.04999622702598572,
0.005265992134809494,
-0.08593953400850296,
-0.051508400589227676,
-0.1335318386554718,
-0.07244826853275299,
-0.016225939616560936,
0.011218138970434666,
0.013373463414609432,
-0.09872236847877502,
0.06681077927350998,
-0.01150960847735405,
-0.09332456439733505,
0.0835123211145401,
0.1959407478570938,
-0.06429233402013779,
0.014884515665471554,
0.10202962905168533,
-0.05739432945847511,
-0.15849894285202026,
-0.0634971335530281,
0.0523788146674633,
0.09451485425233841,
-0.015248282812535763,
-0.15071053802967072,
0.0774761512875557,
0.035865504294633865,
0.029527373611927032,
-0.0013470185222104192,
-0.2816446125507355,
-0.12767961621284485,
0.045714735984802246,
0.06666011363267899,
0.04901089519262314,
-0.12136730551719666,
-0.0466296412050724,
-0.06161724403500557,
-0.08268165588378906,
0.03226649388670921,
0.06448963284492493,
0.13507285714149475,
-0.03403806686401367,
0.027644839137792587,
0.028156813234090805,
-0.023839537054300308,
0.10023783147335052,
0.00840306468307972,
0.09477534890174866,
-0.010923119261860847,
0.0339660607278347,
0.06037689000368118,
-0.06023937836289406,
0.16595830023288727,
-0.16979214549064636,
0.08183971792459488,
-0.2215377241373062,
-0.05473695322871208,
-0.0033468713518232107,
-0.010790939442813396,
-0.03731485828757286,
-0.052155524492263794,
-0.09674593806266785,
0.0019928684923797846,
0.0545857734978199,
-0.025772200897336006,
0.06614323705434799,
-0.030578799545764923,
-0.05283309891819954,
0.06370384991168976,
0.0905362069606781,
-0.03021901845932007,
-0.14425399899482727,
0.01745680347084999,
0.031018869951367378,
0.08041007816791534,
-0.1852087527513504,
0.02207980304956436,
0.12228406220674515,
0.014273259788751602,
0.11190541088581085,
0.015979191288352013,
-0.06081268936395645,
0.04976256936788559,
0.0705971047282219,
-0.027569783851504326,
-0.11425592005252838,
-0.008546401746571064,
-0.039860986173152924,
-0.10359659790992737,
0.03299986571073532,
0.08453258872032166,
-0.04182177782058716,
-0.020056182518601418,
-0.013974311761558056,
0.0022301306016743183,
-0.0661897137761116,
0.2018807828426361,
0.01849145069718361,
0.0893731638789177,
-0.05831743776798248,
0.07958249747753143,
0.09863654524087906,
-0.09704165905714035,
0.010911255143582821,
0.1634855568408966,
-0.07767748087644577,
-0.022206060588359833,
0.05555732548236847,
0.095949187874794,
-0.060313787311315536,
-0.058134451508522034,
-0.10740702599287033,
-0.07922467589378357,
0.018072133883833885,
0.01709456741809845,
0.06752190738916397,
0.0711832195520401,
-0.03444971516728401,
0.02286524884402752,
-0.08310204744338989,
0.1001177728176117,
0.07492320239543915,
0.048808518797159195,
-0.13723663985729218,
0.1458701342344284,
0.03043290600180626,
0.09019898623228073,
0.002583174267783761,
0.03691701218485832,
-0.09758143126964569,
0.04428567364811897,
-0.03630318492650986,
0.039698921144008636,
-0.010943074710667133,
0.050825633108615875,
-0.02117878384888172,
0.017792003229260445,
-0.03239751607179642,
0.05262777581810951,
-0.04135686531662941,
-0.03282986208796501,
-0.028409205377101898,
0.04483596235513687,
-0.05892856419086456,
-0.016837531700730324,
0.009625950828194618,
-0.07568757981061935,
0.097774438560009,
-0.06646795570850372,
-0.006916769314557314,
0.001910840510390699,
-0.017712444067001343,
0.06321166455745697,
0.01924312859773636,
0.05893731862306595,
-0.012487124651670456,
-0.003284111153334379,
0.04172724857926369,
0.02129059098660946,
-0.009001950733363628,
-0.007501369807869196,
0.04532384127378464,
-0.14091083407402039,
-0.0984504222869873,
-0.1003987044095993,
-0.061113592237234116,
-0.06699312478303909,
0.07873225957155228,
0.09038056433200836,
0.06920546293258667,
0.10225158929824829,
-0.030653996393084526,
-0.00337208085693419,
-0.1456526666879654,
-0.03380502387881279,
0.05067227780818939,
-0.008670526556670666,
-0.11515477299690247,
-0.059671495109796524,
0.057859934866428375,
-0.0441405326128006,
0.1250302642583847,
-0.012766297906637192,
0.04856238514184952,
-0.005329251289367676,
-0.046705178916454315,
0.0008793998858891428,
-0.0007468253024853766,
0.21652689576148987,
-0.08631238341331482,
0.01552459504455328,
0.011244338005781174,
-0.004062749445438385,
0.034702688455581665,
0.1443507969379425,
0.10577196627855301,
0.1327551305294037,
0.04758969321846962,
0.10765773057937622,
-0.051135364919900894,
-0.03650244325399399,
-0.1761338859796524,
0.05554661154747009,
-0.007077659945935011,
0.032884299755096436,
-0.02833281271159649,
0.09014423191547394,
0.13906089961528778,
-0.1313943862915039,
0.0951453372836113,
0.018183760344982147,
-0.09547293931245804,
-0.0515325665473938,
-0.053305983543395996,
-0.05500262603163719,
-0.09104607254266739,
0.01839534379541874,
-0.1124025285243988,
0.02757135033607483,
0.08613210916519165,
0.034480564296245575,
-0.01767333224415779,
0.1470097452402115,
-0.010264855809509754,
-0.05791119113564491,
0.0017211387166753411,
0.021225277334451675,
0.0470103956758976,
0.10056857764720917,
0.008235166780650616,
0.08035850524902344,
-0.05864062160253525,
0.07694525271654129,
0.01745951734483242,
0.011762043461203575,
0.012092169374227524,
-0.0088520348072052,
-0.002698431257158518,
-0.04873042553663254,
0.00008624240581411868,
0.08064892888069153,
0.15972311794757843,
0.041503164917230606,
-0.04976806789636612,
-0.04603460803627968,
0.17945675551891327,
-0.054944854229688644,
-0.07350309938192368,
-0.12185857445001602,
0.1615169793367386,
0.06164112314581871,
0.029322918504476547,
0.012048193253576756,
-0.08523470908403397,
-0.041374243795871735,
0.21685270965099335,
0.01658591441810131,
-0.03316327556967735,
-0.03867937624454498,
-0.014503352344036102,
-0.005958212073892355,
-0.030218712985515594,
0.141526997089386,
0.017450997605919838,
0.21222682297229767,
-0.0021725972183048725,
-0.0014379334170371294,
-0.03530726581811905,
-0.03765778988599777,
-0.037770263850688934,
0.19890305399894714,
-0.03164583072066307,
0.039410822093486786,
-0.09201798588037491,
-0.011950810439884663,
0.041494693607091904,
-0.10722605884075165,
0.10494977980852127,
-0.09417606890201569,
-0.07528677582740784,
0.029415728524327278,
0.0858122855424881,
-0.023045184090733528,
0.03089451789855957,
-0.01781158708035946,
0.05280231684446335,
0.03880247473716736,
-0.02868705987930298,
-0.09867344796657562,
-0.12718108296394348,
0.05758075416088104,
-0.02763882465660572,
0.15380044281482697,
0.025584526360034943,
0.07863344252109528,
0.0914803296327591,
0.019232232123613358,
-0.06880491226911545,
0.11451780796051025,
0.03178834915161133,
0.010973643511533737,
0.08156696707010269,
0.11617472022771835,
-0.03802439942955971,
0.15940023958683014,
0.0029362060595303774,
-0.031411319971084595,
-0.029817070811986923,
-0.012308737263083458,
-0.010748622938990593,
-0.14320239424705505,
0.004953654482960701,
-0.06346011906862259,
0.12890246510505676,
0.18615835905075073,
-0.046650394797325134,
-0.022940294817090034,
-0.03583838418126106,
0.07700324058532715,
-0.020484499633312225,
0.10619001090526581,
-0.005677114240825176,
-0.1614428460597992,
0.021907508373260498,
-0.022567419335246086,
0.012126188725233078,
-0.17271412909030914,
-0.05613481253385544,
-0.03850427642464638,
-0.030056416988372803,
-0.07681944221258163,
0.14174751937389374,
0.09182946383953094,
0.029362179338932037,
-0.047152917832136154,
-0.22069051861763,
-0.02174277789890766,
0.045204732567071915,
-0.14090800285339355,
-0.12189362198114395
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the sql code snippets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_base_source_code_summarization_sql_transfer_learning_finetune", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

tokenized_code = "select time ( col0 ) from tab0"  # whitespace-tokenized sql query
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/sql/base_model.ipynb).
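Generation options can also be forwarded through the pipeline call. Continuing the snippet above, with illustrative values rather than settings taken from this card:

```python
# Beam search with a tighter output budget; both values are illustrative.
pipeline([tokenized_code], max_length=32, num_beams=4)
```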
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
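If you want to replicate this optimizer setup in PyTorch, the Transformers library provides an Adafactor implementation together with a matching schedule. A hedged sketch, using the library's documented flags for T5-style pre-training rather than settings taken from this card:

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_base_source_code_summarization_sql_transfer_learning_finetune"
)
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,  # scale updates relative to parameter norms
    relative_step=True,    # derive the step size internally (inverse square root)
    warmup_init=True,      # start small and ramp up during warmup
    lr=None,               # no fixed learning rate; the schedule supplies it
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule, useful for logging the lr
```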
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_base_source_code_summarization_sql_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 base model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the sql code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08475325256586075,
0.08276255428791046,
-0.001344707328826189,
0.11187084019184113,
0.05950814485549927,
0.02255130372941494,
0.03676054626703262,
0.09917067736387253,
-0.04709424450993538,
0.06341373175382614,
0.059013254940509796,
-0.05332465097308159,
0.06462156772613525,
0.18110418319702148,
0.021998664364218712,
-0.18771356344223022,
-0.02691841684281826,
0.0252708587795496,
-0.05556179955601692,
0.1028529703617096,
0.09102953970432281,
-0.07569585740566254,
0.07016294449567795,
-0.041764035820961,
-0.09655830264091492,
0.06522738188505173,
-0.04034983739256859,
-0.034323688596487045,
0.08838052302598953,
0.07288621366024017,
0.11591731756925583,
-0.03347382694482803,
0.07198258489370346,
-0.22672249376773834,
0.001085532596334815,
0.02431555464863777,
0.04808913171291351,
0.025543125346302986,
0.06166832894086838,
0.055871766060590744,
0.13007619976997375,
-0.03545942157506943,
0.037500131875276566,
0.057689860463142395,
-0.06469284743070602,
-0.0692950040102005,
-0.04564584046602249,
0.05210249125957489,
0.10166868567466736,
0.10057784616947174,
-0.01268493290990591,
0.0025030062533915043,
-0.08596014231443405,
0.08385241031646729,
0.12047881633043289,
-0.23536738753318787,
-0.020278573036193848,
0.10251366347074509,
0.08372586965560913,
0.06337302923202515,
-0.08677002042531967,
-0.03231435641646385,
0.09845691174268723,
0.03526492416858673,
0.056416843086481094,
-0.08350271731615067,
-0.05709191411733627,
0.004061426967382431,
-0.058894719928503036,
-0.053291115909814835,
0.14407144486904144,
0.04037948325276375,
-0.045536503195762634,
-0.09726420789957047,
-0.054210931062698364,
-0.19204024970531464,
0.04168827086687088,
0.016376694664359093,
0.024668755009770393,
0.003553237533196807,
0.010799351148307323,
-0.015282787382602692,
-0.09872879087924957,
-0.10156910121440887,
-0.00970466434955597,
0.047198668122291565,
0.06403353065252304,
0.028784971684217453,
-0.019589966163039207,
0.07568816840648651,
-0.013522752560675144,
-0.04951242730021477,
-0.03313278779387474,
0.018449226394295692,
-0.1286628395318985,
0.027457354590296745,
-0.022226041182875633,
-0.06531502306461334,
-0.00986278336495161,
0.10473707318305969,
-0.07312406599521637,
0.08225331455469131,
0.12078171223402023,
0.005439350381493568,
-0.003132761223241687,
0.24208086729049683,
0.04588661715388298,
-0.1642286479473114,
0.011047066189348698,
0.017621075734496117,
0.00255617150105536,
0.0012040426954627037,
-0.07128596305847168,
-0.042725954204797745,
0.0011395254405215383,
0.06524652242660522,
-0.12879987061023712,
0.017814209684729576,
-0.04729382321238518,
-0.004678041208535433,
0.039175789803266525,
-0.13175101578235626,
0.03617379069328308,
0.011434309184551239,
-0.06061543896794319,
-0.04884522408246994,
0.07016049325466156,
-0.11735603213310242,
-0.11730507016181946,
0.02625305950641632,
-0.044720590114593506,
-0.0313260592520237,
-0.1287222057580948,
-0.12378858774900436,
-0.022022292017936707,
-0.042407579720020294,
0.009435979649424553,
-0.10220134258270264,
-0.10555791854858398,
-0.009360075928270817,
0.032012250274419785,
-0.006535004358738661,
-0.01612895354628563,
-0.06502150744199753,
0.012044472619891167,
0.0034996469039469957,
-0.02657979726791382,
0.01462655607610941,
-0.04524688795208931,
0.08426762372255325,
0.09406638145446777,
0.04898917302489281,
0.001780262216925621,
0.029696324840188026,
-0.08332344144582748,
0.06578219681978226,
-0.06380058825016022,
0.08197949826717377,
-0.019785020500421524,
0.04968498647212982,
-0.09910637885332108,
-0.076695516705513,
0.0264899842441082,
0.057264991104602814,
0.06303341686725616,
0.027499929070472717,
-0.12677831947803497,
0.01881164126098156,
0.1411142796278,
-0.10731539875268936,
-0.11512531340122223,
0.11544308811426163,
-0.0017654363764449954,
0.003191906725987792,
0.07137271016836166,
0.12940414249897003,
0.14853395521640778,
-0.08877193182706833,
-0.04145471751689911,
0.09467499703168869,
0.054735101759433746,
-0.059863343834877014,
0.0903545469045639,
0.013018624857068062,
0.03561382740736008,
0.030155615881085396,
0.04274142533540726,
0.06258931010961533,
-0.0016975787002593279,
-0.04071977734565735,
-0.03011288121342659,
-0.08356142044067383,
-0.034914467483758926,
-0.009020207449793816,
0.02360117994248867,
-0.0592024102807045,
-0.06747045367956161,
0.017406273633241653,
0.15785221755504608,
-0.0955033078789711,
0.029518630355596542,
-0.08963682502508163,
-0.06083028018474579,
-0.07196058332920074,
0.018192537128925323,
-0.10849810391664505,
-0.000785553886089474,
0.045136526226997375,
-0.04490271955728531,
0.06924402713775635,
0.08047430962324142,
0.006865437608212233,
0.02221664972603321,
-0.049655407667160034,
-0.044601406902074814,
-0.036267053335905075,
-0.06467647850513458,
-0.12383449077606201,
-0.009276986122131348,
-0.08756078779697418,
-0.02378350868821144,
-0.05978301167488098,
-0.15918000042438507,
0.011693138629198074,
-0.024212317541241646,
0.021364478394389153,
0.004114129580557346,
-0.017990920692682266,
0.024601735174655914,
0.0537579208612442,
-0.049901124089956284,
-0.08677350729703903,
0.01210696343332529,
0.02775302529335022,
-0.12173116207122803,
-0.0396508052945137,
-0.11010397970676422,
-0.04137048497796059,
0.07093244791030884,
0.10238315910100937,
-0.08107749372720718,
-0.010922420769929886,
-0.020549245178699493,
-0.0504467710852623,
-0.05088702589273453,
-0.07165641337633133,
0.19011004269123077,
0.0034699339885264635,
0.16558878123760223,
-0.128036230802536,
-0.048997897654771805,
-0.04457714036107063,
-0.006880569737404585,
0.01794060505926609,
0.1557922661304474,
0.006245039403438568,
-0.09478353708982468,
0.052747808396816254,
-0.02913963980972767,
-0.06300871819257736,
0.15623366832733154,
-0.0030984163749963045,
-0.08569586277008057,
0.009005887433886528,
0.09071426093578339,
-0.010910546407103539,
0.1468513011932373,
-0.061888132244348526,
-0.000935304444283247,
-0.005703480448573828,
0.0250913817435503,
0.04748392850160599,
-0.13119612634181976,
0.028051989153027534,
0.05715286359190941,
-0.07229460775852203,
-0.060189101845026016,
-0.04593369737267494,
-0.04369068145751953,
0.03288167342543602,
0.00918534491211176,
0.014202915132045746,
-0.01637904904782772,
-0.03088127262890339,
-0.09340891987085342,
0.20885628461837769,
-0.09404938668012619,
-0.22358469665050507,
-0.18070706725120544,
0.05317574366927147,
-0.03565910831093788,
-0.0020469760056585073,
0.05075371637940407,
-0.11646315455436707,
-0.07766792923212051,
-0.0968932956457138,
0.13599893450737,
-0.12692338228225708,
0.0035851984284818172,
-0.028333798050880432,
0.04057767242193222,
0.036740418523550034,
-0.1777912676334381,
0.03176190331578255,
-0.01296150777488947,
-0.007257506716996431,
-0.005130078177899122,
-0.06123799830675125,
0.09236029535531998,
0.11873441934585571,
-0.07195821404457092,
0.014810431748628616,
-0.016444087028503418,
0.16134797036647797,
-0.05130549147725105,
0.021539075300097466,
0.18331566452980042,
0.019245844334363937,
0.03773561120033264,
0.05858953669667244,
0.008504475466907024,
-0.09234701842069626,
0.06517798453569412,
0.054417699575424194,
-0.03796544671058655,
-0.24535201489925385,
-0.010257400572299957,
-0.07678905129432678,
0.04573504999279976,
0.11362853646278381,
0.05084724724292755,
-0.1560695320367813,
0.02117191068828106,
-0.010328805074095726,
0.13012878596782684,
-0.030459268018603325,
0.05256526172161102,
0.03037668578326702,
0.016512451693415642,
0.015612265095114708,
-0.09419282525777817,
0.009845017455518246,
0.07461719214916229,
0.1107456162571907,
0.2056286782026291,
-0.06650193780660629,
0.204049214720726,
0.001900241943076253,
0.08214055746793747,
0.039805781096220016,
0.09238932281732559,
-0.11718595027923584,
0.008035523816943169,
0.00722154974937439,
-0.004929465241730213,
-0.07550722360610962,
0.05754731222987175,
-0.023500865325331688,
0.06813175976276398,
-0.06417828053236008,
0.042477063834667206,
0.017751473933458328,
0.17776347696781158,
0.050810277462005615,
-0.17828309535980225,
-0.11459580808877945,
0.020266901701688766,
-0.10404515266418457,
-0.1000315248966217,
0.06979570537805557,
0.20681123435497284,
-0.046442657709121704,
0.014731925912201405,
-0.00500888004899025,
0.14027588069438934,
-0.080509714782238,
-0.025040779262781143,
0.01903514936566353,
0.08822375535964966,
0.00840091798454523,
0.12375444173812866,
-0.2723241150379181,
0.07963655889034271,
0.017896350473165512,
0.09510468691587448,
-0.00479011470451951,
0.07034347951412201,
-0.038857173174619675,
-0.00589263578876853,
0.07171542197465897,
0.0009123184136115015,
-0.10148578882217407,
-0.2035197913646698,
-0.05129871144890785,
0.0322103425860405,
0.06297823041677475,
-0.010491318069398403,
0.09197210520505905,
-0.023743808269500732,
0.05605601146817207,
-0.01823538728058338,
-0.13686348497867584,
-0.052940480411052704,
-0.14857307076454163,
-0.03907288983464241,
-0.0032232999801635742,
-0.00912315584719181,
-0.02970757707953453,
0.033093903213739395,
0.011610719375312328,
0.220854252576828,
-0.16977021098136902,
-0.09941668808460236,
-0.09553716331720352,
0.08707072585821152,
0.126463383436203,
-0.10502149909734726,
0.029308225959539413,
0.023400548845529556,
0.05368296802043915,
-0.034745339304208755,
-0.07137953490018845,
0.03272153064608574,
-0.05388464406132698,
-0.06375477463006973,
-0.025049103423953056,
0.09589232504367828,
-0.012834939174354076,
0.04887967184185982,
0.004362092819064856,
-0.0807604119181633,
-0.04839301481842995,
-0.13088448345661163,
-0.0835932046175003,
0.00044345177593640983,
0.01331502664834261,
0.0179899949580431,
-0.10152347385883331,
0.059914249926805496,
-0.010271617211401463,
-0.08572946488857269,
0.08743574470281601,
0.17180295288562775,
-0.07003632932901382,
0.014330062083899975,
0.09589014947414398,
-0.06379181891679764,
-0.16865037381649017,
-0.05339227244257927,
0.050609149038791656,
0.08911661803722382,
-0.023203637450933456,
-0.14908334612846375,
0.06942279636859894,
0.03395668417215347,
0.032822199165821075,
0.008226996287703514,
-0.2893832325935364,
-0.1254785656929016,
0.03192376717925072,
0.06718813627958298,
0.056159086525440216,
-0.11470632255077362,
-0.04574413225054741,
-0.061777736991643906,
-0.0712428092956543,
0.03813447803258896,
0.07191150635480881,
0.12909133732318878,
-0.03412090986967087,
0.02344963699579239,
0.033172450959682465,
-0.022055134177207947,
0.08495864272117615,
-0.002802412724122405,
0.09551940113306046,
-0.013288616202771664,
0.035657476633787155,
0.06339775770902634,
-0.05923999845981598,
0.16755251586437225,
-0.17724087834358215,
0.08387235552072525,
-0.20114682614803314,
-0.05383812636137009,
-0.003641369752585888,
-0.005750781856477261,
-0.03243935480713844,
-0.05321832746267319,
-0.10646401345729828,
0.00895744375884533,
0.05714450776576996,
-0.024566393345594406,
0.060672350227832794,
-0.03344748541712761,
-0.05684926360845566,
0.06092724949121475,
0.09134099632501602,
-0.022440848872065544,
-0.14363722503185272,
0.02607429400086403,
0.03207499906420708,
0.08650699257850647,
-0.18931236863136292,
0.020926540717482567,
0.12229059636592865,
0.013524799607694149,
0.11000330746173859,
0.017607443034648895,
-0.06070975214242935,
0.0486295223236084,
0.06691990792751312,
-0.024603744968771935,
-0.11614900827407837,
-0.01336541585624218,
-0.040846534073352814,
-0.09703028947114944,
0.024721812456846237,
0.08455121517181396,
-0.04989806190133095,
-0.014264999888837337,
-0.00938672386109829,
0.005952189676463604,
-0.06169181317090988,
0.19186106324195862,
0.01644226163625717,
0.07902927696704865,
-0.057675257325172424,
0.08508822321891785,
0.09477989375591278,
-0.10828401893377304,
0.014076781459152699,
0.16812217235565186,
-0.07922827452421188,
-0.01958363503217697,
0.06748133897781372,
0.10103843361139297,
-0.06366299092769623,
-0.054538942873477936,
-0.1022513285279274,
-0.07531063258647919,
0.01611504890024662,
0.028415001928806305,
0.06667998433113098,
0.07365606725215912,
-0.04129362106323242,
0.025034373626112938,
-0.08925378322601318,
0.09623735398054123,
0.07311980426311493,
0.04814998805522919,
-0.13249607384204865,
0.14400792121887207,
0.03451951593160629,
0.07065121829509735,
0.0024688548874109983,
0.030998677015304565,
-0.10401634871959686,
0.041572075337171555,
-0.023319857195019722,
0.047743335366249084,
-0.00716158002614975,
0.05241008847951889,
-0.028680900111794472,
0.01980229839682579,
-0.031292181462049484,
0.05375757813453674,
-0.03535711020231247,
-0.03617515787482262,
-0.03319164365530014,
0.03591432049870491,
-0.06155640259385109,
-0.01826639659702778,
0.005764760542660952,
-0.07316220551729202,
0.09361114352941513,
-0.06250835955142975,
-0.003526156535372138,
0.006808457896113396,
-0.014542554505169392,
0.06173033267259598,
0.026062101125717163,
0.05532607063651085,
-0.012916158884763718,
0.008582991547882557,
0.03929365426301956,
0.021208057180047035,
-0.012806831859052181,
-0.010362459346652031,
0.0511186420917511,
-0.1448020339012146,
-0.08946292847394943,
-0.09745561331510544,
-0.057016484439373016,
-0.06736782938241959,
0.07504923641681671,
0.0926196351647377,
0.07512880116701126,
0.10405655205249786,
-0.03764720261096954,
0.002205768832936883,
-0.14566729962825775,
-0.031838465481996536,
0.05338640883564949,
-0.009482589550316334,
-0.10671438276767731,
-0.05253944545984268,
0.05897703394293785,
-0.04115109518170357,
0.1200801432132721,
-0.0033710531424731016,
0.056450970470905304,
-0.0081561878323555,
-0.04086950048804283,
0.0005222997860983014,
0.0009077492286451161,
0.210431769490242,
-0.08262721449136734,
0.013528063893318176,
0.013159472495317459,
0.0061490824446082115,
0.04222908988595009,
0.15000712871551514,
0.1058289185166359,
0.1252143830060959,
0.05170534923672676,
0.10893958061933517,
-0.05682012811303139,
-0.03120042383670807,
-0.1730443835258484,
0.060700755566358566,
-0.013689487241208553,
0.02937048114836216,
-0.030590396374464035,
0.10164282470941544,
0.1348118782043457,
-0.13428346812725067,
0.10212442278862,
0.021639710292220116,
-0.09628326445817947,
-0.052536867558956146,
-0.07095367461442947,
-0.05395636335015297,
-0.10416088253259659,
0.012022443115711212,
-0.11555860936641693,
0.024349741637706757,
0.08409625291824341,
0.03257512301206589,
-0.020030617713928223,
0.1402442306280136,
-0.012851828709244728,
-0.06691446155309677,
0.00959879532456398,
0.02751399762928486,
0.04772302508354187,
0.10247838497161865,
0.012429813854396343,
0.07349086552858353,
-0.06671354174613953,
0.07154249399900436,
0.022132979705929756,
0.013543317094445229,
0.015198481269180775,
-0.003715459955856204,
0.0036864695139229298,
-0.05042446404695511,
0.01010183710604906,
0.07431630790233612,
0.16564691066741943,
0.047492824494838715,
-0.05263129621744156,
-0.044775765389204025,
0.19909173250198364,
-0.058829423040151596,
-0.0721118301153183,
-0.12479826807975769,
0.1710355430841446,
0.05777288228273392,
0.03048441745340824,
0.007515306118875742,
-0.0843968465924263,
-0.03717469424009323,
0.22525687515735626,
0.026774803176522255,
-0.03479573130607605,
-0.03585214912891388,
-0.01745767518877983,
-0.008128337562084198,
-0.022307127714157104,
0.14328798651695251,
0.024110453203320503,
0.23162208497524261,
-0.003760356456041336,
-0.013003706000745296,
-0.036534205079078674,
-0.03620123118162155,
-0.030434397980570793,
0.2008715271949768,
-0.038007840514183044,
0.0373188778758049,
-0.09265396744012833,
-0.014161454513669014,
0.025543225929141045,
-0.10396639257669449,
0.09966059029102325,
-0.10682062059640884,
-0.06913257390260696,
0.02162804827094078,
0.08354666829109192,
-0.027212100103497505,
0.03476819023489952,
-0.013848761096596718,
0.05613311007618904,
0.040134064853191376,
-0.0241684690117836,
-0.11097503453493118,
-0.13856694102287292,
0.04452817142009735,
-0.026818107813596725,
0.141433984041214,
0.021794531494379044,
0.0647989884018898,
0.09070412069559097,
0.030224036425352097,
-0.06713880598545074,
0.10860960930585861,
0.02693423628807068,
-0.003537399461492896,
0.07496698945760727,
0.10943789780139923,
-0.04146767035126686,
0.17071476578712463,
0.0040272995829582214,
-0.02842635102570057,
-0.02178890071809292,
-0.011233687400817871,
-0.012922833673655987,
-0.14222003519535065,
0.0008450009045191109,
-0.0632757842540741,
0.1303769201040268,
0.18918681144714355,
-0.03875505179166794,
-0.02100418508052826,
-0.039063725620508194,
0.07002352178096771,
-0.0274648517370224,
0.10492436587810516,
0.00017564173322170973,
-0.15460963547229767,
0.008886423893272877,
-0.012521361000835896,
0.005782620050013065,
-0.17014674842357635,
-0.05283452570438385,
-0.04204206541180611,
-0.03134004771709442,
-0.08119338005781174,
0.1421550065279007,
0.08825937658548355,
0.029350783675909042,
-0.04373116046190262,
-0.20659106969833374,
-0.012293349020183086,
0.05023052915930748,
-0.13941358029842377,
-0.12006913870573044
] |
null | null |
transformers
|
# CodeTrans transfer learning pre-trained model
Pretrained model on programming languages using the t5 base model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-base` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain.
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
It could be used to fine-tune other tasks in the software development domain.
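As a concrete starting point, the checkpoint can be loaded as a standard T5 encoder-decoder and trained with the usual seq2seq loss; a minimal sketch, with an illustrative input/target pair standing in for your downstream dataset:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

name = "SEBIS/code_trans_t5_base_transfer_learning_pretrain"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# One illustrative (source, target) pair; replace with your downstream task data.
inputs = tokenizer("select time ( col0 ) from tab0", return_tensors="pt")
labels = tokenizer("select the time for all rows", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
```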
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{}
|
feature-extraction
|
SEBIS/code_trans_t5_base_transfer_learning_pretrain
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us
|
# CodeTrans transfer learning pre-trained model
Pretrained model on programming languages using the t5 base model architecture. It was first released in
this repository.
## Model description
This CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain.
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
It could be used to fine-tune other tasks in the software development domain.
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 base model architecture. It was first released in\nthis repository.",
"## Model description\n\nThis CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n",
"# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 base model architecture. It was first released in\nthis repository.",
"## Model description\n\nThis CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
42,
38,
167
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 base model architecture. It was first released in\nthis repository.## Model description\n\nThis CodeTrans model is based on the 't5-base' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
-0.05917114019393921,
-0.06829982250928879,
-0.0004522585077211261,
0.09772726893424988,
0.15223434567451477,
0.03596556559205055,
0.13276135921478271,
0.004727792460471392,
-0.07934831827878952,
-0.02748219296336174,
0.05320117995142937,
0.050153542309999466,
0.0025852550752460957,
0.1827877163887024,
0.06651921570301056,
-0.2333351969718933,
0.020102446898818016,
0.017501868307590485,
-0.04350652918219566,
0.10115139186382294,
0.09428075700998306,
-0.06904111802577972,
0.08348218351602554,
-0.012240237556397915,
-0.1666375696659088,
0.03359850496053696,
-0.027705928310751915,
-0.06229233741760254,
0.10494396835565567,
0.06934452801942825,
0.1383880078792572,
0.012918595224618912,
0.05986137315630913,
-0.037234991788864136,
0.017334124073386192,
0.07737699896097183,
-0.023233920335769653,
0.009491721168160439,
0.026456527411937714,
0.08154627680778503,
0.2000758945941925,
0.09329245239496231,
0.06714730709791183,
0.0420311875641346,
-0.08259164541959763,
-0.00661997776478529,
0.031962983310222626,
0.10030611604452133,
0.07093444466590881,
0.12088925391435623,
0.01505995076149702,
0.15527375042438507,
-0.10254363715648651,
0.13135620951652527,
-0.015340409241616726,
-0.3067641854286194,
-0.05098746344447136,
0.15405996143817902,
0.07910829037427902,
0.12889689207077026,
0.02974625490605831,
-0.03340604528784752,
0.03162773698568344,
0.08015034347772598,
0.13349837064743042,
-0.06864795833826065,
-0.07523886114358902,
-0.07012611627578735,
-0.11778409034013748,
-0.07461576163768768,
0.24036866426467896,
0.004461618606001139,
-0.025220129638910294,
-0.09261558949947357,
-0.09910161793231964,
-0.08370061218738556,
0.05103260651230812,
-0.0468151830136776,
-0.0007203472778201103,
0.05624694004654884,
0.08026271313428879,
-0.06440827995538712,
-0.11606168746948242,
-0.09865620732307434,
-0.0073628779500722885,
0.13812477886676788,
0.049576450139284134,
0.05883293226361275,
-0.08902855962514877,
0.09292372316122055,
0.0576605498790741,
-0.05402567237615585,
0.014936734922230244,
-0.05670471861958504,
-0.097271628677845,
0.012029360979795456,
-0.038501888513565063,
-0.18383057415485382,
-0.021201107650995255,
0.04673312231898308,
0.04908718541264534,
0.011874841526150703,
0.1213071420788765,
0.03650980815291405,
0.04415035992860794,
0.11826787143945694,
-0.0537392795085907,
-0.04075128585100174,
0.05784517899155617,
-0.008638069964945316,
-0.03587672486901283,
-0.03670255094766617,
-0.14787104725837708,
-0.01959696225821972,
0.025982186198234558,
0.03423228859901428,
-0.04932457581162453,
0.10358966141939163,
-0.014355843886733055,
-0.08829619735479355,
0.15579964220523834,
-0.07863268256187439,
-0.08280295133590698,
-0.013771305792033672,
-0.006220899987965822,
-0.03613593801856041,
0.08403041213750839,
-0.05092497915029526,
-0.11703392118215561,
-0.05418046936392784,
-0.07156714797019958,
-0.10045654326677322,
-0.1318299025297165,
-0.15131701529026031,
-0.04124963656067848,
-0.11778207868337631,
0.03173447027802467,
-0.21095646917819977,
-0.14124895632266998,
-0.06269655376672745,
0.07785974442958832,
0.031606465578079224,
-0.0721830204129219,
-0.012503372505307198,
-0.03707228600978851,
-0.044122327119112015,
-0.053564801812171936,
-0.010007350705564022,
-0.03431286662817001,
0.07032326608896255,
-0.043692681938409805,
0.03134054318070412,
-0.08470381051301956,
0.043258681893348694,
-0.08156362175941467,
-0.015136243775486946,
-0.13330170512199402,
0.07536857575178146,
0.03737083077430725,
0.11105930805206299,
-0.0534682497382164,
-0.07833083719015121,
-0.04690541699528694,
0.056797727942466736,
0.0006183724035508931,
0.03498465195298195,
-0.07105676084756851,
0.004348265938460827,
0.09440568089485168,
-0.15328530967235565,
-0.13633406162261963,
0.0748286247253418,
-0.0028609191067516804,
0.1724945604801178,
0.07132163643836975,
0.15639859437942505,
0.16278076171875,
-0.04663648456335068,
0.08571896702051163,
0.05525923892855644,
-0.07970204949378967,
-0.16189904510974884,
0.038254547864198685,
0.09171702712774277,
-0.07226487994194031,
0.011861045844852924,
-0.07283223420381546,
0.10978034883737564,
-0.016090426594018936,
-0.03649073839187622,
-0.0014250649837777019,
-0.0995679721236229,
-0.06662159413099289,
0.02061510644853115,
0.11260925978422165,
0.03524323180317879,
-0.06215629726648331,
0.016406629234552383,
0.1057777926325798,
-0.12523998320102692,
0.014747864566743374,
-0.10961557924747467,
-0.049363598227500916,
-0.029042202979326248,
0.010965182445943356,
-0.20694084465503693,
-0.03179212287068367,
0.07085194438695908,
0.04050398990511894,
0.012394949793815613,
0.22133158147335052,
0.01614922657608986,
0.043542955070734024,
0.01132133137434721,
-0.04430605471134186,
-0.09877539426088333,
-0.022350385785102844,
-0.06290262937545776,
-0.03859919309616089,
-0.10056164115667343,
-0.06831106543540955,
-0.07044601440429688,
-0.09611611068248749,
0.029919490218162537,
-0.06820813566446304,
0.01176304928958416,
0.05153878405690193,
-0.003954477608203888,
-0.0002975654788315296,
0.0740240290760994,
-0.05127343535423279,
-0.06772201508283615,
0.07783034443855286,
0.09512791037559509,
-0.039521459490060806,
0.031831420958042145,
-0.15190981328487396,
0.10219255089759827,
0.08244136720895767,
-0.01453489251434803,
-0.09507153928279877,
0.047585103660821915,
-0.04946392402052879,
-0.02012372761964798,
0.0524403378367424,
-0.026020023971796036,
0.23556053638458252,
-0.025170600041747093,
0.1603144109249115,
-0.11131064593791962,
0.005250841379165649,
-0.0030114175751805305,
0.03982457518577576,
0.09067515283823013,
0.07422677427530289,
0.05421492084860802,
-0.08372510969638824,
0.02368365414440632,
-0.0020855767652392387,
-0.03401859849691391,
0.16293708980083466,
-0.028998250141739845,
-0.03818517550826073,
0.017771456390619278,
0.00805666297674179,
0.013061000034213066,
0.042226895689964294,
-0.13874143362045288,
-0.056019216775894165,
0.012908034957945347,
0.031197821721434593,
0.07047790288925171,
-0.08031580597162247,
0.020891396328806877,
0.024253496900200844,
0.01212868932634592,
0.027254536747932434,
0.00921125803142786,
-0.09497382491827011,
0.04302191361784935,
0.013314245268702507,
-0.06419616937637329,
0.021264927461743355,
-0.015774166211485863,
-0.10614396631717682,
0.1685793399810791,
-0.03321564942598343,
-0.23006610572338104,
-0.11949668824672699,
0.16916383802890778,
-0.011503884568810463,
0.04462972283363342,
0.05354113131761551,
-0.019940711557865143,
-0.0644659548997879,
-0.08960277587175369,
0.1036658063530922,
-0.06318188458681107,
0.007083086296916008,
-0.01959172822535038,
0.04079737514257431,
-0.028561122715473175,
-0.15058587491512299,
0.019666578620672226,
-0.017124243080615997,
-0.0856974795460701,
0.056012723594903946,
-0.12427160888910294,
0.09331077337265015,
0.1904131919145584,
-0.030099419876933098,
0.030337050557136536,
-0.019558202475309372,
0.16815631091594696,
-0.04954886436462402,
0.0555475614964962,
0.25494202971458435,
0.017432941123843193,
-0.017758116126060486,
0.06080566719174385,
-0.029489954933524132,
-0.10973625630140305,
0.10343392938375473,
-0.03931340202689171,
-0.08253617584705353,
-0.17197051644325256,
-0.0960901603102684,
-0.09442437440156937,
0.001236348762176931,
0.11037615686655045,
0.05956058204174042,
0.06872887909412384,
0.06841608881950378,
0.015193477272987366,
0.12106816470623016,
-0.019522886723279953,
0.04982675239443779,
0.04238693416118622,
0.005474826321005821,
0.05458642914891243,
-0.08466716855764389,
-0.01608208380639553,
0.04285392537713051,
0.04319729283452034,
0.20696774125099182,
-0.05957217514514923,
0.16673968732357025,
0.07544279843568802,
0.07019921392202377,
0.07509171217679977,
0.1476253867149353,
-0.06903515756130219,
0.01610160432755947,
-0.023245815187692642,
-0.018307385966181755,
-0.10988456010818481,
0.07259658724069595,
-0.05091214179992676,
-0.06766048818826675,
-0.05344470217823982,
0.011128494516015053,
-0.03732551634311676,
0.2583165466785431,
-0.025817586109042168,
-0.2299056202173233,
-0.09269613772630692,
-0.027796318754553795,
-0.04192836955189705,
-0.07611791789531708,
0.07078169286251068,
0.2237374484539032,
-0.04649136960506439,
-0.0936112031340599,
-0.00372966262511909,
0.12015951424837112,
-0.052366144955158234,
-0.00023779793991707265,
0.05068475380539894,
0.020519018173217773,
0.0608743280172348,
0.08158466219902039,
-0.21769866347312927,
0.10060960054397583,
-0.02163832262158394,
0.09644985944032669,
-0.0712500661611557,
0.015706481412053108,
-0.05035199597477913,
0.11899375170469284,
0.07542525231838226,
0.011567274108529091,
0.04491431638598442,
-0.10882480442523956,
-0.04839416965842247,
0.04166540503501892,
0.011056037619709969,
-0.07015732675790787,
0.04203363507986069,
-0.01606248877942562,
0.057586442679166794,
0.004491620697081089,
-0.015273863449692726,
-0.10284651070833206,
-0.07742039114236832,
-0.0016280795680359006,
0.015216178260743618,
0.06440401822328568,
-0.04541897028684616,
-0.020552173256874084,
0.07539141178131104,
0.13983923196792603,
-0.0747344046831131,
-0.07340531796216965,
-0.09753487259149551,
-0.059130359441041946,
0.07768474519252777,
-0.06372101604938507,
0.07586640864610672,
0.011151473969221115,
-0.01991504803299904,
-0.008757596835494041,
-0.11793269217014313,
0.07890598475933075,
-0.08547574281692505,
-0.05061357468366623,
-0.008553429506719112,
-0.026725217700004578,
0.03949150815606117,
0.01951337791979313,
0.021387513726949692,
-0.07669118791818619,
-0.03298623859882355,
-0.10644367337226868,
-0.11944986134767532,
0.0451713465154171,
0.06925076246261597,
-0.05958283320069313,
-0.09000466018915176,
0.06873276084661484,
0.0035131776239722967,
-0.004641400650143623,
0.09020635485649109,
-0.013106964528560638,
-0.05898493900895119,
0.03082037903368473,
0.18708740174770355,
-0.03678227216005325,
-0.2518935203552246,
-0.0737588107585907,
0.08939936757087708,
0.07734353840351105,
-0.04745074734091759,
-0.19165414571762085,
0.07091095298528671,
-0.021784570068120956,
0.023401135578751564,
-0.011727618053555489,
-0.3070952594280243,
-0.10640499740839005,
0.06590090692043304,
0.13622504472732544,
0.14758922159671783,
-0.1318078488111496,
0.022837650030851364,
0.01567094959318638,
-0.08551952987909317,
0.13640856742858887,
-0.04962189123034477,
0.11835715174674988,
-0.019373362883925438,
-0.06590935587882996,
0.010474305599927902,
-0.04518210142850876,
-0.0020341756753623486,
0.04710274934768677,
0.0877050831913948,
-0.056543245911598206,
0.07274533808231354,
0.15154868364334106,
-0.03510533645749092,
0.14539621770381927,
0.009803210385143757,
0.10597420483827591,
-0.15971678495407104,
-0.10633358359336853,
-0.0642232820391655,
0.0321391299366951,
0.013238581828773022,
-0.12354455888271332,
-0.02459677681326866,
0.037528492510318756,
0.03282523527741432,
-0.0007643929566256702,
0.06784999370574951,
-0.03424093872308731,
-0.03948955610394478,
0.10006288439035416,
0.09423711895942688,
-0.09742801636457443,
-0.07746527343988419,
0.003263621125370264,
0.00008971870556706563,
0.15923456847667694,
-0.18362949788570404,
0.0010943793458864093,
0.08955828845500946,
-0.039187949150800705,
0.07019650936126709,
0.039108067750930786,
-0.03137131780385971,
0.014520995318889618,
0.06272675842046738,
-0.10032012313604355,
-0.07094549387693405,
-0.03751464933156967,
-0.04802382364869118,
-0.0291585810482502,
0.04847346991300583,
0.11495120823383331,
-0.10618873685598373,
-0.006190156098455191,
-0.03648705035448074,
-0.06553700566291809,
-0.049659695476293564,
0.12158868461847305,
0.04062549024820328,
0.02320595271885395,
-0.05917138233780861,
0.07012159377336502,
0.08192697912454605,
-0.09873531758785248,
0.018836408853530884,
0.08843331784009933,
-0.1602116823196411,
-0.09378279000520706,
0.031747352331876755,
0.055917661637067795,
-0.029211899265646935,
-0.06624443829059601,
-0.08907195925712585,
-0.05838645622134209,
0.024454189464449883,
0.0780474916100502,
0.051122624427080154,
0.07869988679885864,
-0.10548050701618195,
-0.001812856295146048,
-0.08004391938447952,
-0.002953208750113845,
0.021048778668045998,
0.03562043607234955,
-0.17923754453659058,
0.18369518220424652,
0.08495888113975525,
0.0711759552359581,
-0.059369221329689026,
-0.029773933812975883,
-0.11281231790781021,
0.028966236859560013,
0.031192544847726822,
0.02100178971886635,
-0.030795980244874954,
0.017969993874430656,
-0.02172902040183544,
-0.03143433853983879,
-0.04528746381402016,
0.07680810987949371,
-0.051280636340379715,
0.012008555233478546,
-0.022896258160471916,
0.04821540042757988,
0.043623920530080795,
-0.03015482798218727,
-0.04121769219636917,
-0.08351688832044601,
0.11098389327526093,
-0.03678213059902191,
-0.08243671804666519,
0.023317625746130943,
-0.025183945894241333,
0.06591019779443741,
0.034907266497612,
0.0845230221748352,
-0.017770200967788696,
0.08919748663902283,
0.012717261910438538,
0.04251427575945854,
0.02523433044552803,
-0.01569780707359314,
0.038478150963783264,
-0.10361243784427643,
-0.0064114476554095745,
-0.049774736166000366,
-0.03275667503476143,
-0.06849100440740585,
-0.0021406535524874926,
0.0984727218747139,
0.11619449406862259,
0.12909626960754395,
-0.020765479654073715,
0.03266672044992447,
-0.1353543996810913,
-0.021169820800423622,
0.06743050366640091,
-0.001198120298795402,
-0.03964713588356972,
-0.1316351741552353,
0.029728101566433907,
0.011156998574733734,
0.07834330946207047,
0.03507509082555771,
0.0432010181248188,
-0.024350998923182487,
0.02063789591193199,
0.03710949048399925,
-0.010602019727230072,
0.18274925649166107,
-0.006370586808770895,
0.04728344827890396,
-0.019778519868850708,
0.052921805530786514,
0.017897382378578186,
0.1394420564174652,
0.10153917223215103,
0.10005215555429459,
-0.008539382368326187,
0.08724193274974823,
-0.011243862099945545,
0.011520887725055218,
-0.11603441834449768,
-0.010977692902088165,
-0.042331013828516006,
0.09535203874111176,
-0.0787590816617012,
0.011295663192868233,
0.13285624980926514,
-0.0175551138818264,
0.047603629529476166,
0.04828497767448425,
-0.08020167052745819,
-0.023902231827378273,
-0.11460330337285995,
-0.047053705900907516,
-0.10732001811265945,
-0.012128667905926704,
-0.08528009802103043,
-0.0669860765337944,
0.1392679214477539,
0.004374364856630564,
-0.029719378799200058,
0.1526937484741211,
0.00734288664534688,
-0.06246839091181755,
0.024287985637784004,
-0.03453748673200607,
0.05668069049715996,
-0.03812597319483757,
-0.011380532756447792,
0.010008172132074833,
0.009203329682350159,
0.037138376384973526,
0.007965337485074997,
-0.027936790138483047,
-0.0016826005885377526,
0.00030211228295229375,
-0.010581659153103828,
-0.044816676527261734,
0.03256837651133537,
-0.023840846493840218,
0.17650113999843597,
0.027210965752601624,
-0.11362962424755096,
0.014891128987073898,
0.16406315565109253,
-0.01583048328757286,
-0.14123880863189697,
-0.15883475542068481,
0.11872491240501404,
0.04461982473731041,
-0.004785547032952309,
0.01894834265112877,
-0.01812991127371788,
-0.0650344118475914,
0.2630404531955719,
0.11360127478837967,
-0.04664675518870354,
-0.008725578896701336,
0.035186827182769775,
-0.007829899899661541,
-0.06720773130655289,
0.24704842269420624,
0.028845053166151047,
0.25782310962677,
-0.005208236631006002,
0.011379942297935486,
-0.08705548942089081,
0.008986794389784336,
-0.031238669529557228,
0.0019392423564568162,
0.0028429278172552586,
-0.037609610706567764,
-0.037610024213790894,
0.031928956508636475,
-0.02728208526968956,
-0.027225403115153313,
0.12673303484916687,
-0.031694475561380386,
-0.04277769848704338,
-0.02470952831208706,
0.007395067717880011,
-0.019394101575016975,
0.1044626235961914,
-0.046716202050447464,
0.051476288586854935,
0.055907752364873886,
-0.031520776450634,
-0.10896320641040802,
-0.06318294256925583,
0.09139896929264069,
-0.06402228772640228,
0.1468709409236908,
-0.056028593331575394,
0.08141182363033295,
0.03313993662595749,
0.02682044357061386,
-0.08881954848766327,
0.10646626353263855,
-0.03784860298037529,
0.019129956141114235,
0.04945119470357895,
-0.031250856816768646,
-0.06816188991069794,
0.09483666718006134,
-0.036304324865341187,
-0.17281058430671692,
0.005038995295763016,
-0.00856319535523653,
0.03107582777738571,
-0.0952141061425209,
-0.013736303895711899,
-0.052310213446617126,
0.14720550179481506,
0.08413063734769821,
-0.015451249666512012,
-0.04765142872929573,
-0.0775282010436058,
0.05544235557317734,
0.009942580945789814,
0.029275281354784966,
-0.043144140392541885,
-0.1544739007949829,
-0.07302568107843399,
-0.09568443149328232,
0.0001214316944242455,
-0.15117748081684113,
-0.007214070297777653,
-0.0903681144118309,
-0.025617064908146858,
-0.1045328825712204,
0.07490073144435883,
0.060554079711437225,
0.026993535459041595,
-0.022880656644701958,
-0.0065162512473762035,
-0.020404009148478508,
0.055580563843250275,
-0.1474730670452118,
-0.17207644879817963
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# AutoModelWithLMHead is deprecated in transformers v4+; AutoModelForSeq2SeqLM
# also loads this T5 checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_api_generation_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_api_generation_multitask", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
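The pipeline returns a list with one dictionary per input. A minimal sketch of reading the generated recommendation, assuming the standard `summary_text` key that Transformers summarization pipelines emit:
```python
result = pipeline([tokenized_code])
print(result[0]["summary_text"])  # the generated API recommendation for the description above
```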
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/api%20generation/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
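As a concrete reference, here is a minimal sketch of that optimizer setup using the `Adafactor` implementation and `AdafactorSchedule` helper shipped with Transformers; the `warmup_init` choice below is an assumption, since the card only names the schedule family:
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_api_generation_multitask")

# relative_step=True activates Adafactor's built-in inverse square root decay
# (the schedule family described above); lr must then stay None.
optimizer = Adafactor(model.parameters(), lr=None, scale_parameter=True,
                      relative_step=True, warmup_init=True)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy that exposes the relative-step rate
```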
## Evaluation results
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_large_api_generation_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12540273368358612,
-0.022215383127331734,
-0.0004341439052950591,
0.13266415894031525,
0.10596083849668503,
0.024562422186136246,
0.05864175781607628,
0.06575542688369751,
-0.029307594522833824,
0.01806596666574478,
0.04466746002435684,
0.008424014784395695,
0.03236064687371254,
0.19513490796089172,
0.007412065286189318,
-0.11779703944921494,
-0.015279041603207588,
0.04487673565745354,
-0.03705554082989693,
0.1284952461719513,
0.09419479966163635,
-0.07413081079721451,
0.05414975434541702,
-0.06968334317207336,
-0.24698612093925476,
0.06061461567878723,
-0.005288693122565746,
-0.06391359865665436,
0.09927546232938766,
0.04684608057141304,
0.12539806962013245,
-0.00511412275955081,
0.021723616868257523,
-0.1402551382780075,
0.010669889859855175,
0.011186678893864155,
0.03323616832494736,
0.017635714262723923,
0.04504484310746193,
0.054368458688259125,
0.14046117663383484,
0.011922849342226982,
0.04172268137335777,
0.06125297397375107,
-0.07495088875293732,
-0.11944957822561264,
-0.006876903120428324,
0.023917902261018753,
0.04935591667890549,
0.10036467760801315,
-0.012464947067201138,
0.12219161540269852,
-0.1498400717973709,
0.12802183628082275,
0.10141385346651077,
-0.219596728682518,
-0.013253443874418736,
0.12395067512989044,
0.08992189168930054,
0.09827350080013275,
-0.05995164066553116,
-0.06663146615028381,
0.10273071378469467,
0.05324026942253113,
0.046189527958631516,
-0.10130035877227783,
-0.11267637461423874,
0.02350042574107647,
-0.07406409084796906,
-0.06398112326860428,
0.22183942794799805,
0.0019471973646432161,
-0.0769893079996109,
-0.054244399070739746,
-0.025867702439427376,
-0.1353013813495636,
0.03594592586159706,
0.028843723237514496,
0.009016958065330982,
-0.03319674730300903,
0.015491500496864319,
0.030669743195176125,
-0.07389641553163528,
-0.15604016184806824,
0.027708353474736214,
0.09218017756938934,
0.056207798421382904,
0.02564544416964054,
-0.09612933546304703,
0.10461229830980301,
0.03737185150384903,
-0.05994012579321861,
-0.025820165872573853,
-0.018331049010157585,
-0.10257084667682648,
0.03432910144329071,
-0.05219503492116928,
-0.18240772187709808,
0.017447402700781822,
0.009019856341183186,
-0.05134861171245575,
0.050705522298812866,
0.028840307146310806,
0.037034839391708374,
0.021738607436418533,
0.19747446477413177,
0.0566101036965847,
-0.12138901650905609,
0.05429012328386307,
0.04298153892159462,
-0.03649599477648735,
-0.004339849576354027,
-0.06904273480176926,
-0.0970669761300087,
0.09340883046388626,
0.10320858657360077,
-0.13941697776317596,
0.03573166951537132,
-0.06992761045694351,
-0.043362490832805634,
0.002062021056190133,
-0.15824772417545319,
0.002881790744140744,
0.02701963111758232,
-0.06724324077367783,
-0.05528978258371353,
0.09278982877731323,
-0.17079401016235352,
-0.14982858300209045,
-0.04582909867167473,
-0.08016983419656754,
-0.040516357868909836,
-0.16715990006923676,
-0.1566920429468155,
-0.009498518891632557,
-0.03786812350153923,
0.019420430064201355,
-0.08752530813217163,
-0.1574682891368866,
-0.026339281350374222,
0.019305268302559853,
0.004151441156864166,
-0.0018599097384139895,
-0.07786508649587631,
-0.009024999104440212,
-0.029357660561800003,
-0.03860882297158241,
0.014896669425070286,
-0.04769165441393852,
0.12167677283287048,
0.10266944766044617,
0.0536111444234848,
-0.023496149107813835,
0.060770247131586075,
-0.07997515052556992,
0.06476966291666031,
-0.1135995090007782,
0.09476936608552933,
-0.057246796786785126,
0.07962346076965332,
-0.033815376460552216,
-0.10593738406896591,
0.07545337826013565,
0.06265661865472794,
0.06541702151298523,
0.03460940346121788,
-0.13778743147850037,
-0.02449950762093067,
0.18672731518745422,
-0.12283627688884735,
-0.13662436604499817,
0.10153059661388397,
-0.03835577517747879,
0.08211051672697067,
0.08330046385526657,
0.1427721083164215,
0.14837795495986938,
-0.023646943271160126,
0.02429973892867565,
0.04951948672533035,
0.044223375618457794,
-0.13387992978096008,
0.07928222417831421,
0.066283218562603,
-0.0896964818239212,
0.06217040866613388,
-0.016556423157453537,
0.09981612861156464,
-0.010503187775611877,
-0.024510355666279793,
-0.05109924450516701,
-0.07995766401290894,
-0.0051079257391393185,
0.007985919713973999,
0.06639616936445236,
-0.08394365012645721,
-0.059944964945316315,
0.09118154644966125,
0.1739000678062439,
-0.1328577846288681,
-0.003100441535934806,
-0.08044473081827164,
0.03575969859957695,
-0.07718710601329803,
0.029561487957835197,
-0.16179978847503662,
0.035769395530223846,
0.07769378274679184,
-0.02710501290857792,
0.053150590509176254,
0.13163745403289795,
0.012880533002316952,
0.04452856630086899,
0.0009440947906114161,
-0.014975765720009804,
-0.12080421298742294,
-0.056486260145902634,
-0.06212065741419792,
-0.062407661229372025,
-0.08985140919685364,
-0.05958971008658409,
-0.03661254793405533,
-0.19624628126621246,
0.011107951402664185,
0.0021702442318201065,
0.0024957675486803055,
0.02759333699941635,
-0.012878019362688065,
0.029851119965314865,
0.0764768049120903,
-0.060091808438301086,
-0.03760640323162079,
0.03181592375040054,
0.025180064141750336,
-0.041224829852581024,
-0.05684052035212517,
-0.08188313990831375,
0.008808750659227371,
0.107871413230896,
0.04205280914902687,
-0.07948806881904602,
0.02320702001452446,
-0.021161943674087524,
-0.04876451939344406,
0.010106664150953293,
-0.06363364309072495,
0.14412237703800201,
-0.005866991821676493,
0.19975101947784424,
-0.1645556390285492,
-0.03830789029598236,
-0.024414801970124245,
0.0242521483451128,
0.06307784467935562,
0.1378830224275589,
-0.016208486631512642,
-0.08656714856624603,
0.06576360017061234,
0.01602877862751484,
-0.10201939940452576,
0.23287007212638855,
-0.04714355245232582,
-0.0936397835612297,
0.02202770859003067,
0.1009739562869072,
-0.01727364771068096,
0.16700470447540283,
-0.20175205171108246,
-0.02866252511739731,
0.016760677099227905,
0.0061673494055867195,
0.06601157784461975,
-0.12628816068172455,
0.0035895032342523336,
0.009181782603263855,
-0.07290507107973099,
-0.06775758415460587,
-0.00669435178861022,
-0.006280045956373215,
0.03729473054409027,
-0.007480325177311897,
-0.029580390080809593,
0.01742200180888176,
-0.039268966764211655,
-0.10644958168268204,
0.21930469572544098,
-0.09666873514652252,
-0.22144533693790436,
-0.2055220752954483,
0.11367989331483841,
-0.06151984632015228,
-0.012924421578645706,
0.03592987731099129,
-0.07913964986801147,
-0.05486651510000229,
-0.056756019592285156,
0.17134714126586914,
-0.06234952062368393,
-0.011343680322170258,
-0.015544477850198746,
0.07735458016395569,
0.01097035314887762,
-0.20932519435882568,
0.03547607734799385,
-0.00457471190020442,
-0.014564983546733856,
0.005250104703009129,
-0.10155917704105377,
0.09133149683475494,
0.15284296870231628,
-0.08223503828048706,
0.020993927493691444,
0.007785246707499027,
0.18863999843597412,
-0.038409218192100525,
-0.056497201323509216,
0.14137248694896698,
-0.01832461543381214,
-0.010709448717534542,
0.014294966123998165,
-0.012756719253957272,
-0.0979548841714859,
0.06371118128299713,
-0.010170597583055496,
-0.02525796927511692,
-0.2724972367286682,
-0.008385159075260162,
-0.07849147915840149,
0.05528568848967552,
0.03851773589849472,
0.04204170033335686,
-0.08820099383592606,
0.02834341861307621,
0.06103214621543884,
0.1522650122642517,
-0.0050050620920956135,
0.05347972735762596,
0.0580734945833683,
-0.0006667496054433286,
0.007342704106122255,
-0.10020630061626434,
0.011874754913151264,
0.07401255518198013,
0.11116921156644821,
0.27196529507637024,
-0.1003250777721405,
0.19892539083957672,
0.04909124597907066,
0.04813194274902344,
0.05038802698254585,
0.13354265689849854,
-0.131882905960083,
0.0328034982085228,
0.003273364622145891,
-0.008986861445009708,
-0.11049972474575043,
0.009124365635216236,
-0.06617020815610886,
0.09074060618877411,
-0.10557939857244492,
-0.059516437351703644,
0.010938418097794056,
0.1478281170129776,
0.040951360017061234,
-0.22664453089237213,
-0.12999069690704346,
0.0206947922706604,
-0.09567459672689438,
-0.10630892962217331,
0.06572479754686356,
0.24570029973983765,
-0.0769837349653244,
-0.04131823778152466,
-0.004571255762130022,
0.13398291170597076,
-0.03856571391224861,
-0.022040268406271935,
-0.03499656915664673,
0.0649256780743599,
0.01746792159974575,
0.13522720336914062,
-0.2924206554889679,
0.1300937384366989,
-0.008625982329249382,
0.06253848969936371,
-0.029207056388258934,
0.0483180470764637,
-0.03764227032661438,
0.07700616866350174,
0.03814908117055893,
-0.009708741679787636,
0.03554810583591461,
-0.15987233817577362,
0.014476767741143703,
0.04165710508823395,
0.01685858704149723,
0.056045375764369965,
0.06336773186922073,
-0.0032480352092534304,
0.057311367243528366,
-0.019222628325223923,
-0.12531529366970062,
-0.07070320844650269,
-0.06595469266176224,
-0.019129503518342972,
-0.029652511700987816,
-0.01412690244615078,
-0.045150741934776306,
-0.0099725853651762,
0.08012199401855469,
0.18372270464897156,
-0.09488344937562943,
-0.07810988277196884,
-0.07459236681461334,
0.05325052887201309,
0.10933573544025421,
-0.08124101161956787,
0.030123740434646606,
-0.002933887066319585,
0.045913081616163254,
-0.00777831394225359,
-0.07517483830451965,
0.05140019208192825,
-0.037647247314453125,
-0.0691487044095993,
-0.011188827455043793,
0.0626310482621193,
-0.00007605431892443448,
0.02686900645494461,
0.012229496613144875,
-0.09498521685600281,
-0.045102108269929886,
-0.12029077112674713,
-0.1282728761434555,
-0.041879408061504364,
0.017397068440914154,
0.04044605791568756,
-0.14639881253242493,
-0.05574636533856392,
0.004527729004621506,
-0.03888686001300812,
0.1305883526802063,
0.1546846628189087,
-0.056121837347745895,
0.02983704023063183,
0.1469491869211197,
-0.060603562742471695,
-0.1894109845161438,
0.033434510231018066,
0.04494094103574753,
0.11981477588415146,
-0.04356386139988899,
-0.1647861897945404,
0.04816245660185814,
0.020765574648976326,
0.03646833077073097,
0.052325766533613205,
-0.3101271688938141,
-0.1241973489522934,
0.08240305632352829,
0.16065198183059692,
0.11957995593547821,
-0.12259634584188461,
-0.03920579329133034,
-0.0657210573554039,
-0.15841151773929596,
0.09256353974342346,
-0.05175025016069412,
0.13365523517131805,
-0.07583749294281006,
0.027853243052959442,
0.03443406894803047,
-0.045433737337589264,
0.07236211746931076,
0.03143804520368576,
0.121199831366539,
-0.041860997676849365,
0.018753860145807266,
0.1267392933368683,
-0.03322696313261986,
0.18396766483783722,
-0.14773622155189514,
0.09791895747184753,
-0.23239769041538239,
-0.059852682054042816,
-0.07516288757324219,
0.0032966877333819866,
-0.034237731248140335,
-0.0457456149160862,
-0.07543342560529709,
0.03177928552031517,
-0.0032214922830462456,
-0.00714138662442565,
0.040616851300001144,
-0.030294762924313545,
-0.019677750766277313,
0.10471536964178085,
0.10318756848573685,
-0.01591910794377327,
-0.06946760416030884,
0.05532973259687424,
0.050704676657915115,
0.11458373069763184,
-0.19451943039894104,
0.029636383056640625,
0.10335025936365128,
0.015062532387673855,
0.12459808588027954,
0.04357406869530678,
-0.1036883294582367,
0.04257801175117493,
0.08789870142936707,
-0.07603859901428223,
-0.06337906420230865,
-0.02179696410894394,
-0.07959043234586716,
-0.06702035665512085,
0.051368530839681625,
0.09521617740392685,
-0.05096560716629028,
-0.01978333853185177,
-0.024489397183060646,
-0.020254118368029594,
-0.11319029331207275,
0.1851515918970108,
0.07621759176254272,
0.08453357964754105,
-0.06674502789974213,
0.06319493055343628,
0.08439476788043976,
-0.08187151700258255,
0.007716469466686249,
0.18763920664787292,
-0.10325998067855835,
-0.04808143898844719,
0.07215840369462967,
0.22054007649421692,
-0.02661564201116562,
-0.06018601730465889,
-0.13991473615169525,
-0.07710639387369156,
0.03128961846232414,
0.16333499550819397,
0.10173598676919937,
0.09532999992370605,
-0.028008390218019485,
-0.0013612671755254269,
-0.1073371171951294,
0.09211909770965576,
0.0643509104847908,
0.04953795298933983,
-0.10596448183059692,
0.13052251935005188,
0.03954650089144707,
0.11995895951986313,
-0.027077093720436096,
-0.011508775874972343,
-0.137970969080925,
0.06408735364675522,
-0.11263281852006912,
0.03462241590023041,
-0.009852002374827862,
0.05145880952477455,
-0.024485977366566658,
0.0031534465961158276,
-0.031984999775886536,
0.06838013231754303,
-0.08157607167959213,
0.0011102573480457067,
0.0040263645350933075,
0.055166441947221756,
-0.051547612994909286,
-0.019107356667518616,
0.0316338948905468,
-0.09228834509849548,
0.12342341989278793,
-0.036790881305933,
-0.02862262725830078,
0.08001452684402466,
-0.04583330452442169,
0.03991539403796196,
0.015629718080163002,
0.04919089749455452,
0.020337756723165512,
0.01539425365626812,
0.07804597169160843,
0.03714655339717865,
0.05377425625920296,
0.024649620056152344,
0.12235662341117859,
-0.13915666937828064,
-0.08603370189666748,
-0.055139943957328796,
-0.11336752772331238,
-0.05684376880526543,
0.10003044456243515,
0.04823467880487442,
0.10506762564182281,
0.09169973433017731,
-0.03197404369711876,
0.009557845070958138,
-0.12687118351459503,
-0.06632008403539658,
0.028293734416365623,
-0.031337834894657135,
-0.08536095172166824,
-0.055629756301641464,
0.037998296320438385,
-0.03216512128710747,
0.12444713711738586,
0.01882612705230713,
0.0373830683529377,
-0.019867924973368645,
-0.05938030406832695,
-0.014972220174968243,
0.02185095101594925,
0.21228693425655365,
-0.0850178524851799,
0.04151301458477974,
-0.00038309741648845375,
0.015852272510528564,
0.007715173996984959,
0.11560791730880737,
0.11763978004455566,
0.16664092242717743,
-0.03287908434867859,
0.10055568814277649,
0.01837589032948017,
0.00038718560244888067,
-0.07498201727867126,
0.01965821348130703,
0.021877923980355263,
0.06305984407663345,
-0.048541195690631866,
0.18806636333465576,
0.09210503101348877,
-0.12355983257293701,
0.10943177342414856,
0.025615112856030464,
-0.13277794420719147,
-0.03273976594209671,
0.021936383098363876,
-0.03701609745621681,
-0.14781484007835388,
0.023931356146931648,
-0.12872754037380219,
-0.017094828188419342,
0.04990433529019356,
0.05099307373166084,
-0.07901956140995026,
0.17163318395614624,
0.03675435110926628,
-0.059324368834495544,
0.05527300387620926,
-0.0012497943826019764,
0.026331929489970207,
0.02236071042716503,
0.036624543368816376,
0.036556243896484375,
-0.03748730197548866,
0.03706950321793556,
0.024427806958556175,
-0.024411214515566826,
-0.01835264451801777,
-0.020880339667201042,
-0.0025858846493065357,
-0.016129275783896446,
0.020145004615187645,
0.055738020688295364,
0.15872414410114288,
0.03672128543257713,
-0.07565228641033173,
-0.017492450773715973,
0.17090333998203278,
-0.02777804620563984,
-0.09512647986412048,
-0.1273985207080841,
0.12922883033752441,
0.0520399808883667,
0.009994251653552055,
0.02664165012538433,
-0.08157522231340408,
-0.05372351408004761,
0.20870345830917358,
0.055775079876184464,
-0.03157210722565651,
-0.02331574447453022,
0.008121120743453503,
-0.002393467118963599,
-0.04189571738243103,
0.2036024034023285,
0.022609511390328407,
0.22494617104530334,
0.023123562335968018,
-0.008887389674782753,
-0.06920038908720016,
-0.04048539698123932,
0.003648224053904414,
0.11996982246637344,
-0.03892454504966736,
-0.03933882340788841,
-0.08305259048938751,
-0.003272887784987688,
-0.001099956687539816,
-0.08035767078399658,
0.10051431506872177,
-0.13580705225467682,
-0.09833724051713943,
-0.04932410642504692,
0.04860912635922432,
-0.05925953388214111,
0.016285113990306854,
-0.02482168935239315,
0.04453984275460243,
0.07056812196969986,
-0.032765474170446396,
-0.10149015486240387,
-0.17022088170051575,
0.09617283195257187,
-0.05089936777949333,
0.1325681507587433,
-0.01569977030158043,
0.15341472625732422,
0.0848999172449112,
0.025955595076084137,
-0.06386760622262955,
0.11560388654470444,
0.031029831618070602,
0.05793505534529686,
0.048947516828775406,
0.1209183782339096,
-0.050774358212947845,
0.13580456376075745,
-0.050661057233810425,
-0.028489863499999046,
-0.028485335409641266,
-0.07592710107564926,
-0.018393587321043015,
-0.16373339295387268,
-0.020625341683626175,
-0.09538881480693817,
0.09319540858268738,
0.1950041949748993,
-0.043750613927841187,
-0.03062385506927967,
-0.09235469996929169,
0.1096717119216919,
-0.01071971096098423,
0.06306439638137817,
-0.031382638961076736,
-0.17446456849575043,
-0.000667575397528708,
0.013954485766589642,
0.01380910724401474,
-0.27639666199684143,
-0.0070144315250217915,
-0.03947165235877037,
-0.028338028118014336,
-0.08615116775035858,
0.16015233099460602,
0.08800707012414932,
0.049977097660303116,
-0.0405922494828701,
-0.15838408470153809,
-0.03639231622219086,
0.05715565383434296,
-0.13982859253883362,
-0.14614850282669067
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the api recommendation generation task for Java APIs.
## Intended uses & limitations
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# AutoModelWithLMHead is deprecated in transformers v4+; AutoModelForSeq2SeqLM
# also loads this T5 checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_api_generation_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_api_generation_multitask_finetune", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
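The pipeline can also be bypassed and the checkpoint called directly; a minimal sketch, where `max_length=64` is an illustrative assumption rather than the card's decoding setting:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

name = "SEBIS/code_trans_t5_large_api_generation_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)

tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
inputs = tokenizer(tokenized_code, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64)  # illustrative decode length
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```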
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/api%20generation/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 130,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
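For orientation only, a minimal sketch of what a comparable fine-tuning run could look like with the Transformers `Seq2SeqTrainer`. The two-example stand-in dataset, the per-device batch split, and every hyperparameter not stated above are assumptions, not the authors' TPU training script:
```python
from datasets import Dataset
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer, DataCollatorForSeq2Seq,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

base = "SEBIS/code_trans_t5_large_api_generation_multitask"  # multi-task checkpoint as the starting point
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

# Hypothetical stand-in for the api recommendation generation dataset.
raw = Dataset.from_dict({
    "text": ["parse the uses licence node of this package",
             "convert a json string into a map"],
    "api": ["DocumentBuilderFactory . newInstance", "ObjectMapper . readValue"],
})

def preprocess(example):
    enc = tokenizer(example["text"], truncation=True, max_length=512)
    enc["labels"] = tokenizer(example["api"], truncation=True, max_length=512)["input_ids"]
    return enc

train_ds = raw.map(preprocess, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-api-finetune",
    max_steps=130_000,               # total fine-tuning steps reported above
    per_device_train_batch_size=32,  # 256 global batch over 8 cores is an assumed split
)
trainer = Seq2SeqTrainer(model=model, args=args, train_dataset=train_ds,
                         data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
                         tokenizer=tokenizer)
trainer.train()
```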
## Evaluation results
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_large_api_generation_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the api recommendation generation task for Java APIs.
Intended uses & limitations
---------------------------
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 130,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 130,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 130,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 130,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08296599239110947,
0.0734611377120018,
-0.0014252864057198167,
0.11091944575309753,
0.050809573382139206,
0.01990205980837345,
0.05324557051062584,
0.09288199245929718,
-0.03836359828710556,
0.059193313121795654,
0.06303497403860092,
-0.040737394243478775,
0.06326552480459213,
0.18389774858951569,
0.022780973464250565,
-0.1874907910823822,
-0.008818307891488075,
0.028378356248140335,
-0.025543956086039543,
0.10719577968120575,
0.09967505931854248,
-0.08158387988805771,
0.06307175755500793,
-0.031849514693021774,
-0.11862560361623764,
0.05564872920513153,
-0.04656123369932175,
-0.048628341406583786,
0.08709993958473206,
0.06357279419898987,
0.11551220715045929,
-0.031529612839221954,
0.06286589056253433,
-0.2021338939666748,
0.0012164239306002855,
0.015988528728485107,
0.05465468391776085,
0.02354804240167141,
0.060925811529159546,
0.07938773185014725,
0.1445196270942688,
-0.019450431689620018,
0.04060909524559975,
0.04919937252998352,
-0.06277837604284286,
-0.08000953495502472,
-0.05141248181462288,
0.0776069164276123,
0.10935048013925552,
0.0883592739701271,
-0.012700248509645462,
0.032156698405742645,
-0.08674897998571396,
0.08345948904752731,
0.12120504677295685,
-0.22281938791275024,
-0.023618856444954872,
0.10602307319641113,
0.09264279901981354,
0.06425564736127853,
-0.08737951517105103,
-0.03722136840224266,
0.10877511650323868,
0.03660540655255318,
0.04766974598169327,
-0.08143198490142822,
-0.0746091678738594,
-0.0014947211602702737,
-0.05257265269756317,
-0.051635440438985825,
0.13982394337654114,
0.054610054939985275,
-0.05671103671193123,
-0.10209225118160248,
-0.03920173645019531,
-0.17867957055568695,
0.04878760129213333,
0.014140409417450428,
0.017237862572073936,
-0.02029317431151867,
0.052449002861976624,
0.0014613884268328547,
-0.09256747364997864,
-0.10830477625131607,
-0.0026424808893352747,
0.0779181718826294,
0.07501149922609329,
0.024386677891016006,
-0.013899251818656921,
0.07447251677513123,
-0.01494633685797453,
-0.05931852385401726,
-0.030560016632080078,
0.0069124712608754635,
-0.11968108266592026,
0.02365046925842762,
-0.012826463207602501,
-0.06705993413925171,
-0.010914751328527927,
0.0858476534485817,
-0.08442745357751846,
0.07215984165668488,
0.11334282904863358,
0.004431640263646841,
0.003550349036231637,
0.2160695344209671,
0.03991612046957016,
-0.14499960839748383,
0.0012086288770660758,
0.026588069275021553,
0.004538864828646183,
-0.00601070886477828,
-0.05759795382618904,
-0.04973238334059715,
0.005322261247783899,
0.0541549026966095,
-0.14165745675563812,
0.01674671284854412,
-0.04276615008711815,
-0.018419554457068443,
0.0794399306178093,
-0.1325995773077011,
0.030554957687854767,
0.01159404031932354,
-0.06866296380758286,
-0.0406075082719326,
0.06552474200725555,
-0.10761706531047821,
-0.11231725662946701,
0.02465112693607807,
-0.05150705575942993,
-0.0322769396007061,
-0.12353812903165817,
-0.12080942839384079,
-0.0087139792740345,
-0.040334057062864304,
0.019285298883914948,
-0.10168315470218658,
-0.1065012663602829,
-0.0168512761592865,
0.03246729448437691,
-0.005592459347099066,
-0.019702626392245293,
-0.05252712965011597,
0.02195471152663231,
-0.005404110997915268,
-0.03404068946838379,
0.025381367653608322,
-0.042940057814121246,
0.09402070939540863,
0.09177029877901077,
0.053088702261447906,
0.011662768200039864,
0.03271428868174553,
-0.07199439406394958,
0.06414040923118591,
-0.08494851738214493,
0.05061768740415573,
-0.027879640460014343,
0.06124834716320038,
-0.08875050395727158,
-0.08609975874423981,
0.052695490419864655,
0.053781695663928986,
0.04874720051884651,
0.01296304166316986,
-0.0965537279844284,
0.016762616112828255,
0.14022867381572723,
-0.10426431894302368,
-0.13782808184623718,
0.11442328244447708,
-0.006425460334867239,
0.0050506857223808765,
0.06757953017950058,
0.13614916801452637,
0.1508081555366516,
-0.08799982815980911,
-0.03839514032006264,
0.07368506491184235,
0.0691356360912323,
-0.06352619081735611,
0.10175154358148575,
0.02844761498272419,
0.025029785931110382,
0.028354384005069733,
0.028179308399558067,
0.0703791081905365,
-0.0019990098662674427,
-0.034582529217004776,
-0.01789000816643238,
-0.08495568484067917,
-0.03418324887752533,
-0.010792386718094349,
0.028477374464273453,
-0.07031569629907608,
-0.07097936421632767,
0.03732083737850189,
0.18152572214603424,
-0.10202273726463318,
0.02574017457664013,
-0.08506761491298676,
-0.04622539132833481,
-0.08486679196357727,
0.010707231238484383,
-0.10380175709724426,
0.01990475319325924,
0.05419893562793732,
-0.06484828889369965,
0.05751338601112366,
0.08114974945783615,
0.004398400895297527,
0.04012400284409523,
-0.04288952052593231,
-0.04030213505029678,
-0.05967075750231743,
-0.0547216571867466,
-0.11490844190120697,
-0.023313434794545174,
-0.09061478078365326,
-0.031913913786411285,
-0.07403368502855301,
-0.18757155537605286,
0.009083171375095844,
-0.03976054489612579,
0.029822662472724915,
0.009509352967143059,
-0.020190004259347916,
0.03889971598982811,
0.048598840832710266,
-0.055740777403116226,
-0.08101589977741241,
0.01179079432040453,
0.021008798852562904,
-0.10222449898719788,
-0.039468467235565186,
-0.1155647411942482,
-0.04238857701420784,
0.07267028093338013,
0.08511429280042648,
-0.07162840664386749,
-0.0014309175312519073,
-0.029929012060165405,
-0.06320220232009888,
-0.04513642564415932,
-0.0695798322558403,
0.15864084661006927,
0.009800366126000881,
0.1695498377084732,
-0.14404724538326263,
-0.0542636401951313,
-0.03252771496772766,
-0.012085829861462116,
0.016930270940065384,
0.15488675236701965,
-0.00009265154949389398,
-0.08891768008470535,
0.05451043322682381,
-0.013809806667268276,
-0.07166222482919693,
0.17318812012672424,
-0.0038188251201063395,
-0.09552878886461258,
0.024823259562253952,
0.10133862495422363,
-0.02047516219317913,
0.13893777132034302,
-0.09386471658945084,
-0.007441366091370583,
0.007063631899654865,
0.03187442570924759,
0.04070509970188141,
-0.12058572471141815,
0.018755165860056877,
0.055638257414102554,
-0.0719565823674202,
-0.04888366162776947,
-0.028782516717910767,
-0.043109454214572906,
0.04043160378932953,
0.0007408458623103797,
-0.005882874131202698,
-0.012405668385326862,
-0.024972347542643547,
-0.08925408124923706,
0.20381981134414673,
-0.08333688229322433,
-0.21077768504619598,
-0.17236889898777008,
0.02288687601685524,
-0.061706170439720154,
-0.007122666575014591,
0.04763311520218849,
-0.1176365464925766,
-0.07026471197605133,
-0.08948857337236404,
0.13999983668327332,
-0.10468034446239471,
0.006381221581250429,
-0.00588487321510911,
0.039461083710193634,
0.029656212776899338,
-0.18270012736320496,
0.0322292298078537,
-0.008872644044458866,
0.008963178843259811,
-0.0035244629252701998,
-0.06578072905540466,
0.09242220968008041,
0.12494415789842606,
-0.08647621423006058,
0.01785878650844097,
-0.00247950223274529,
0.1606224775314331,
-0.05069077014923096,
0.02235257811844349,
0.20825554430484772,
0.016265233978629112,
0.029466213658452034,
0.045384880155324936,
0.01167474128305912,
-0.09100442379713058,
0.06658149510622025,
0.054057974368333817,
-0.028457297012209892,
-0.25697386264801025,
-0.0020690045785158873,
-0.06753396987915039,
0.03972161188721657,
0.11043926328420639,
0.052832845598459244,
-0.132309228181839,
0.03251393511891365,
-0.0051133232191205025,
0.154570072889328,
-0.029900692403316498,
0.05471060425043106,
0.016759727150201797,
0.02091173268854618,
0.0058180042542517185,
-0.09844823181629181,
0.01124460157006979,
0.07713824510574341,
0.12035294622182846,
0.21452638506889343,
-0.06490164995193481,
0.18183013796806335,
0.028268076479434967,
0.057029034942388535,
0.029083481058478355,
0.10894010215997696,
-0.1283091902732849,
-0.0034132932778447866,
0.0005596624687314034,
-0.015970101580023766,
-0.06914503127336502,
0.050308506935834885,
-0.045861054211854935,
0.07981140911579132,
-0.061196133494377136,
0.02159259468317032,
0.01827961765229702,
0.14194560050964355,
0.05756896361708641,
-0.1897609829902649,
-0.1192047968506813,
0.02263658121228218,
-0.1138528361916542,
-0.11859024316072464,
0.0688680037856102,
0.20318521559238434,
-0.04436210170388222,
0.019427472725510597,
-0.0030374459456652403,
0.13733163475990295,
-0.07166218757629395,
-0.023662911728024483,
0.02310352772474289,
0.07383681833744049,
0.00765373045578599,
0.13484571874141693,
-0.2630796432495117,
0.09865874797105789,
0.006630305200815201,
0.0946439877152443,
-0.017615433782339096,
0.0668080598115921,
-0.03223666176199913,
0.009601091034710407,
0.07355134189128876,
-0.0010367990471422672,
-0.04937043786048889,
-0.19384242594242096,
-0.05070551857352257,
0.02766474150121212,
0.03690081462264061,
-0.00825695600360632,
0.08570175617933273,
-0.004784216172993183,
0.06611530482769012,
-0.028687627986073494,
-0.1214844211935997,
-0.07234262675046921,
-0.12368768453598022,
-0.027888093143701553,
0.0022327688056975603,
-0.0284267645329237,
-0.024596484377980232,
0.010669318027794361,
-0.024087730795145035,
0.21587038040161133,
-0.13945333659648895,
-0.10994654893875122,
-0.08700508624315262,
0.08034192025661469,
0.1247025802731514,
-0.10119990259408951,
0.013034804724156857,
0.01734374649822712,
0.05482395738363266,
-0.0417645163834095,
-0.05230138078331947,
0.021368544548749924,
-0.053171176463365555,
-0.07952730357646942,
-0.022671252489089966,
0.08108031004667282,
-0.008631270378828049,
0.04586973786354065,
0.003406072501093149,
-0.09552785009145737,
-0.04232863336801529,
-0.13647951185703278,
-0.09606090188026428,
-0.008681331761181355,
0.036981236189603806,
-0.009242305532097816,
-0.08397193253040314,
0.06710730493068695,
-0.010030839592218399,
-0.09416819363832474,
0.07061323523521423,
0.1870931088924408,
-0.05719902738928795,
0.01996733248233795,
0.12656201422214508,
-0.05634547397494316,
-0.14133457839488983,
-0.05705343559384346,
0.05527593567967415,
0.09191430360078812,
-0.024335721507668495,
-0.13720138370990753,
0.08704204112291336,
0.042511023581027985,
0.023481355980038643,
-0.00013640175166074187,
-0.27418139576911926,
-0.1230410635471344,
0.05303748697042465,
0.0936458483338356,
0.060885343700647354,
-0.11420463025569916,
-0.04178544506430626,
-0.060991089791059494,
-0.11150643974542618,
0.05278908088803291,
0.053178247064352036,
0.12428704649209976,
-0.04373519495129585,
0.03608834370970726,
0.02776971086859703,
-0.026738876476883888,
0.10759083926677704,
0.010286246426403522,
0.1026805117726326,
-0.023007534444332123,
0.030010713264346123,
0.05380018800497055,
-0.06175906956195831,
0.1816469132900238,
-0.17320191860198975,
0.07600472122430801,
-0.23103736340999603,
-0.05970709025859833,
-0.017722655087709427,
-0.00531386025249958,
-0.039306771010160446,
-0.061862628906965256,
-0.09884326159954071,
0.01664922945201397,
0.048101671040058136,
-0.01843438297510147,
0.08017434179782867,
-0.025023529306054115,
-0.048479270190000534,
0.04338512197136879,
0.09153246134519577,
-0.025469014421105385,
-0.12153374403715134,
0.016156602650880814,
0.03377821668982506,
0.09387132525444031,
-0.21438837051391602,
0.018231263384222984,
0.11937128752470016,
0.004621073603630066,
0.10645791888237,
0.010961080901324749,
-0.07760363817214966,
0.04423825070261955,
0.07679629325866699,
-0.03783731907606125,
-0.07706455141305923,
-0.007675647735595703,
-0.02032814733684063,
-0.08923584967851639,
0.03471392020583153,
0.0844762921333313,
-0.060457054525613785,
-0.019060974940657616,
-0.010158905759453773,
0.0031717850361019373,
-0.06924564391374588,
0.1919194459915161,
0.03247039392590523,
0.08838068693876266,
-0.06068064272403717,
0.08123263716697693,
0.10302109271287918,
-0.11557160317897797,
0.012518753297626972,
0.16625966131687164,
-0.07892907410860062,
-0.023546716198325157,
0.05030513554811478,
0.10627911239862442,
-0.04687686264514923,
-0.07400566339492798,
-0.09557054191827774,
-0.0716034397482872,
0.01932608149945736,
0.025630900636315346,
0.07365459948778152,
0.07779091596603394,
-0.033197030425071716,
0.014412732794880867,
-0.0956593006849289,
0.09911064058542252,
0.0745251253247261,
0.05437254533171654,
-0.15020084381103516,
0.12939977645874023,
0.041761867702007294,
0.08582466840744019,
0.00039862675475887954,
0.036838676780462265,
-0.09598800539970398,
0.04158169776201248,
-0.037626080214977264,
0.042927954345941544,
-0.01675911620259285,
0.05845581367611885,
-0.03280738741159439,
0.019665300846099854,
-0.03222956880927086,
0.053452134132385254,
-0.044670701026916504,
-0.021962810307741165,
-0.019805876538157463,
0.049834903329610825,
-0.054781317710876465,
-0.02240091562271118,
0.014324103482067585,
-0.0782209262251854,
0.10289463400840759,
-0.07258331030607224,
-0.01001528836786747,
0.007176689337939024,
-0.017269937321543694,
0.07164149731397629,
0.02277563139796257,
0.057170551270246506,
-0.005774246994405985,
-0.009639865718781948,
0.04867947846651077,
0.019301602616906166,
-0.002798996167257428,
-0.00504155782982707,
0.05134255066514015,
-0.14730216562747955,
-0.09266863018274307,
-0.09910554438829422,
-0.07533656060695648,
-0.06914202868938446,
0.08309329301118851,
0.08423729985952377,
0.07305295765399933,
0.08914253115653992,
-0.021352211013436317,
-0.00838333461433649,
-0.14272451400756836,
-0.03411608934402466,
0.049898285418748856,
-0.021083692088723183,
-0.10728376358747482,
-0.04304003715515137,
0.04791618883609772,
-0.040471937507390976,
0.12308767437934875,
-0.007318973541259766,
0.05015984922647476,
-0.012605912983417511,
-0.060367174446582794,
-0.009923771023750305,
-0.010821512900292873,
0.2136114090681076,
-0.09617462009191513,
0.023125534877181053,
0.012705471366643906,
-0.009655858390033245,
0.038458868861198425,
0.13101208209991455,
0.09534357488155365,
0.14888817071914673,
0.042586326599121094,
0.10672565549612045,
-0.05085055157542229,
-0.03444456309080124,
-0.17564155161380768,
0.04457848519086838,
0.006843783427029848,
0.03371258080005646,
-0.02327566407620907,
0.094608373939991,
0.15533266961574554,
-0.13416726887226105,
0.09236740320920944,
0.020174477249383926,
-0.10019584745168686,
-0.05105166137218475,
-0.07582713663578033,
-0.04654403403401375,
-0.10007669031620026,
0.023039018735289574,
-0.11561211943626404,
0.02189861424267292,
0.0828220471739769,
0.04628194123506546,
-0.023263247683644295,
0.14432425796985626,
-0.014617929235100746,
-0.05847790837287903,
0.010045453906059265,
0.02115875482559204,
0.047064412385225296,
0.09206728637218475,
0.012809338048100471,
0.07847068458795547,
-0.05588480830192566,
0.06814830005168915,
0.015011847950518131,
0.01685325615108013,
0.012095706537365913,
0.0019558428321033716,
-0.003839598037302494,
-0.044484034180641174,
-0.0024771117605268955,
0.08524732291698456,
0.17309707403182983,
0.043566517531871796,
-0.045954301953315735,
-0.04621986299753189,
0.17668558657169342,
-0.04514182731509209,
-0.07197928428649902,
-0.1226382926106453,
0.154143825173378,
0.0644172877073288,
0.021428024396300316,
0.012088708579540253,
-0.07316812127828598,
-0.05539300665259361,
0.2221570909023285,
0.005252658389508724,
-0.020311135798692703,
-0.04303978011012077,
-0.010019737295806408,
-0.007686174474656582,
-0.045604318380355835,
0.1488252580165863,
0.024145860224962234,
0.20897524058818817,
0.006629688199609518,
-0.018923699855804443,
-0.04549663886427879,
-0.026247315108776093,
-0.02320646122097969,
0.185756653547287,
-0.03024427592754364,
0.026447255164384842,
-0.08955055475234985,
-0.008538292720913887,
0.04244785010814667,
-0.10732699930667877,
0.08546829223632812,
-0.09261526167392731,
-0.08370433002710342,
0.02074412815272808,
0.08330487459897995,
-0.02032490447163582,
0.02640485391020775,
-0.010330087505280972,
0.047894757241010666,
0.03823532909154892,
-0.02387092448771,
-0.09638497233390808,
-0.12965859472751617,
0.05962314084172249,
-0.0043723806738853455,
0.16649271547794342,
0.02524794265627861,
0.09346330165863037,
0.08502879738807678,
0.011204466223716736,
-0.07789073139429092,
0.11829008162021637,
0.03443975746631622,
0.023288460448384285,
0.0725986659526825,
0.12410934269428253,
-0.03420686721801758,
0.13510102033615112,
-0.0028546948451548815,
-0.03173346444964409,
-0.036336302757263184,
-0.019849121570587158,
-0.010120592080056667,
-0.1439039558172226,
0.012111140415072441,
-0.06021690368652344,
0.13099806010723114,
0.1847773641347885,
-0.045573800802230835,
-0.03181694820523262,
-0.04247575253248215,
0.0856846272945404,
-0.0164178479462862,
0.08898290246725082,
-0.0018488744972273707,
-0.16319485008716583,
0.01282428577542305,
-0.029101569205522537,
0.016863754019141197,
-0.18002818524837494,
-0.039696335792541504,
-0.04079514369368553,
-0.03652491793036461,
-0.08187839388847351,
0.1338813304901123,
0.07527384907007217,
0.030644308775663376,
-0.04618711769580841,
-0.19979585707187653,
-0.026203932240605354,
0.050480104982852936,
-0.14424629509449005,
-0.12720537185668945
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the api recommendation generation task for java APIs.
## Intended uses & limitations
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate api usage recommendations using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Load the fine-tuned checkpoint and its SentencePiece tokenizer from the Hugging Face Hub.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_api_generation_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_api_generation_transfer_learning_finetune", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

# The input is a natural-language description of the desired functionality.
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/api%20generation/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
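As a rough illustration of that setup, the sketch below instantiates AdaFactor with a relative-step (inverse square root) schedule via the Hugging Face Transformers implementation; the exact hyperparameters used for CodeTrans are not published here, so treat every value as an assumption.
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("t5-large")

# relative_step=True makes Adafactor derive an inverse square root
# learning rate internally, as in the original T5 recipe (assumed here).
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_schedule = AdafactorSchedule(optimizer)  # exposes the internal lr, e.g. for logging
```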
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
## Evaluation results
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
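To score your own predictions against references in the same way, any standard corpus-level BLEU implementation works; the snippet below uses the sacreBLEU library as one possible choice (the original evaluation script is not specified here, and the strings are illustrative placeholders).
```python
import sacrebleu

# Model outputs and parallel reference api sequences (illustrative placeholders).
hypotheses = ["XMLUtils . getUsesLicense"]
references = [["XMLUtils . getUsesLicense"]]  # one reference stream

# corpus_bleu takes the hypotheses plus a list of reference streams.
score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {score.score:.2f}")
```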
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_large_api_generation_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the api recommendation generation task for java APIs.
Intended uses & limitations
---------------------------
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate api usage recommendations using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, the different models achieve the following results (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
110
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08926795423030853,
0.06268922239542007,
-0.001422892208211124,
0.11793458461761475,
0.0491224005818367,
0.02077338844537735,
0.01915792189538479,
0.10418134182691574,
-0.03654805198311806,
0.06313914060592651,
0.054410479962825775,
-0.06609315425157547,
0.05513741075992584,
0.18819227814674377,
0.01753731071949005,
-0.18957938253879547,
-0.027089271694421768,
0.030282091349363327,
-0.05603143200278282,
0.11104783415794373,
0.08968951553106308,
-0.0726541057229042,
0.07589308172464371,
-0.04299062862992287,
-0.11138742417097092,
0.05208411067724228,
-0.023755576461553574,
-0.03414805978536606,
0.098294198513031,
0.06518632173538208,
0.11386460065841675,
-0.03534825146198273,
0.05402078852057457,
-0.19186173379421234,
0.0022390014491975307,
0.03283596783876419,
0.06100812926888466,
0.03473123908042908,
0.048091236501932144,
0.06517263501882553,
0.15298761427402496,
-0.010817666538059711,
0.04939362406730652,
0.058417003601789474,
-0.07316531240940094,
-0.08434394747018814,
-0.051208171993494034,
0.04963380843400955,
0.07848210632801056,
0.10267521440982819,
-0.00989674311131239,
0.017160670831799507,
-0.09248407930135727,
0.08079934865236282,
0.13057081401348114,
-0.2176477313041687,
-0.02467072755098343,
0.10420237481594086,
0.08397696912288666,
0.06940058618783951,
-0.08608052879571915,
-0.03240250051021576,
0.10590232908725739,
0.04251875728368759,
0.07464641332626343,
-0.08415131270885468,
-0.042241934686899185,
-0.003099484369158745,
-0.051259927451610565,
-0.048403024673461914,
0.16046260297298431,
0.04327797144651413,
-0.04757291451096535,
-0.11300770193338394,
-0.05539545789361,
-0.1936885416507721,
0.04248028248548508,
0.012588704004883766,
0.013188407756388187,
-0.009633025154471397,
0.022453289479017258,
-0.011063473299145699,
-0.08631028234958649,
-0.11420188844203949,
0.014211636036634445,
0.025634758174419403,
0.06200492009520531,
0.03469505161046982,
-0.016768483445048332,
0.08111231029033661,
0.0035817427560687065,
-0.05619360879063606,
-0.022227326408028603,
0.004146009683609009,
-0.11442821472883224,
0.025389941409230232,
-0.00316275330260396,
-0.05404051020741463,
-0.005092842038720846,
0.08072185516357422,
-0.10845987498760223,
0.07548275589942932,
0.08967028558254242,
0.008065230213105679,
-0.005175673868507147,
0.21926750242710114,
0.04428018629550934,
-0.16147707402706146,
0.018341924995183945,
0.0250565093010664,
0.0026870411820709705,
0.0026536385994404554,
-0.056225184351205826,
-0.05697399377822876,
0.013589277863502502,
0.06823969632387161,
-0.13544641435146332,
0.022718511521816254,
-0.05153981223702431,
-0.01574956253170967,
0.08737815916538239,
-0.1244906634092331,
0.030246460810303688,
0.009397435002028942,
-0.07236672192811966,
-0.026931002736091614,
0.08189576119184494,
-0.13459207117557526,
-0.11544378101825714,
0.02949165739119053,
-0.04634591564536095,
-0.04014500230550766,
-0.12353043258190155,
-0.11529776453971863,
-0.009434645064175129,
-0.04424648731946945,
0.0036399411037564278,
-0.10546276718378067,
-0.10246936231851578,
-0.017449552193284035,
0.035357698798179626,
-0.0073781488463282585,
-0.02451343648135662,
-0.05140268802642822,
0.016899676993489265,
-0.00015311459719669074,
-0.030002107843756676,
0.027374323457479477,
-0.04857785254716873,
0.09639377146959305,
0.0623798742890358,
0.055253222584724426,
-0.0011683690827339888,
0.034282855689525604,
-0.0903819128870964,
0.07255189120769501,
-0.1133432611823082,
0.06044790521264076,
-0.020593885332345963,
0.06723922491073608,
-0.09840348362922668,
-0.08375309407711029,
0.014201061800122261,
0.054820671677589417,
0.0539562925696373,
0.018863927572965622,
-0.11954230070114136,
0.018099691718816757,
0.13670171797275543,
-0.11398037523031235,
-0.126398965716362,
0.12319520860910416,
-0.00659062247723341,
0.0169143695384264,
0.07811395823955536,
0.14249995350837708,
0.1572839319705963,
-0.09730061143636703,
-0.014838346280157566,
0.07339321821928024,
0.046886708587408066,
-0.06818448007106781,
0.0694674402475357,
0.017115110531449318,
0.01406875904649496,
0.0351107083261013,
0.04671454057097435,
0.06827130168676376,
0.005723005626350641,
-0.03324901685118675,
-0.037095606327056885,
-0.0911877229809761,
-0.0408736877143383,
-0.0025477951858192682,
0.017979204654693604,
-0.0666842982172966,
-0.07475206255912781,
0.014132420532405376,
0.17525547742843628,
-0.09992673993110657,
0.024571983143687248,
-0.06691477447748184,
-0.03748081624507904,
-0.050189413130283356,
0.024102669209241867,
-0.10896599292755127,
0.01624780148267746,
0.0564662329852581,
-0.030340561643242836,
0.05431974306702614,
0.09036194533109665,
0.00900080893188715,
0.018994957208633423,
-0.060289546847343445,
-0.0417015478014946,
-0.036643676459789276,
-0.0704076737165451,
-0.10257197171449661,
-0.021977614611387253,
-0.08192022889852524,
-0.02021598257124424,
-0.05780565366148949,
-0.18300631642341614,
-0.0027193655259907246,
-0.022005651146173477,
0.033516302704811096,
0.028071779757738113,
-0.02542451210319996,
0.03279270976781845,
0.04643385112285614,
-0.041880473494529724,
-0.08868080377578735,
0.019268814474344254,
0.02648332342505455,
-0.08046257495880127,
-0.02280624583363533,
-0.10003101825714111,
-0.03790445998311043,
0.07353682816028595,
0.09197340160608292,
-0.09862817823886871,
-0.009466619230806828,
-0.025072425603866577,
-0.05499856546521187,
-0.06780115514993668,
-0.06360472738742828,
0.15888294577598572,
0.00923087541013956,
0.1688031703233719,
-0.14296314120292664,
-0.05225150287151337,
-0.03575311228632927,
0.00019408528169151396,
0.03262102231383324,
0.16337795555591583,
-0.014717761427164078,
-0.10421491414308548,
0.045105427503585815,
-0.03065558895468712,
-0.06773233413696289,
0.18501700460910797,
-0.00980560015887022,
-0.08562906831502914,
0.014449797570705414,
0.103258416056633,
-0.008145367726683617,
0.16043490171432495,
-0.06931500881910324,
0.0009704781696200371,
-0.000638685014564544,
0.025299832224845886,
0.04063894972205162,
-0.11686863750219345,
0.019730592146515846,
0.04236144572496414,
-0.0752442479133606,
-0.01933445781469345,
-0.02451471984386444,
-0.038338061422109604,
0.04941543936729431,
0.012720868922770023,
0.006008659955114126,
-0.016141990199685097,
-0.02779296413064003,
-0.09236648678779602,
0.1942426860332489,
-0.08468716591596603,
-0.23422200977802277,
-0.17447613179683685,
0.0652582049369812,
-0.034684982150793076,
-0.01707969233393669,
0.03108193539083004,
-0.10641681402921677,
-0.06920071691274643,
-0.09645038098096848,
0.13783974945545197,
-0.11427135020494461,
-0.00025130173889920115,
-0.04856318607926369,
0.06409238278865814,
0.04627791419625282,
-0.17271462082862854,
0.028157928958535194,
-0.022235576063394547,
0.009142949245870113,
-0.01565961167216301,
-0.0701102539896965,
0.08331532031297684,
0.12362082302570343,
-0.07683584839105606,
0.01978999562561512,
-0.006276327650994062,
0.1450745165348053,
-0.057839587330818176,
0.03842464089393616,
0.19439242780208588,
0.025498703122138977,
0.029018046334385872,
0.052148494869470596,
0.009451320394873619,
-0.09240669012069702,
0.07494350522756577,
0.053940363228321075,
-0.04279652237892151,
-0.236079603433609,
-0.02319270744919777,
-0.07233908772468567,
0.06434227526187897,
0.13081592321395874,
0.056276462972164154,
-0.12775838375091553,
0.01666521653532982,
-0.006450172979384661,
0.1539429873228073,
-0.02036939188838005,
0.04894644394516945,
0.033771317452192307,
0.01494892593473196,
0.007305033039301634,
-0.09749384224414825,
0.0147904884070158,
0.07948973029851913,
0.10889890789985657,
0.2172166407108307,
-0.09568139910697937,
0.188462495803833,
0.010964859277009964,
0.09080566465854645,
0.04982014745473862,
0.08919960260391235,
-0.12621448934078217,
0.0077011664398014545,
0.00350585556589067,
-0.02046571858227253,
-0.06836416572332382,
0.049830250442028046,
-0.04305621236562729,
0.0787227675318718,
-0.05957615748047829,
0.01672663353383541,
0.013743129558861256,
0.1808619201183319,
0.0625481978058815,
-0.18407213687896729,
-0.1221543475985527,
0.013303959742188454,
-0.0998038500547409,
-0.1130148321390152,
0.08095055818557739,
0.21459482610225677,
-0.04767157882452011,
0.01478611584752798,
-0.007368816994130611,
0.13712337613105774,
-0.10709016770124435,
-0.02543874830007553,
0.027667980641126633,
0.09056581556797028,
-0.0013283661101013422,
0.11843302100896835,
-0.2767089307308197,
0.09122313559055328,
0.011240167543292046,
0.09046714007854462,
-0.012157835997641087,
0.053384993225336075,
-0.0351712666451931,
0.004328486509621143,
0.07411718368530273,
0.0038627779576927423,
-0.05492650344967842,
-0.18392117321491241,
-0.04831298813223839,
0.019784821197390556,
0.03298066928982735,
-0.0023224400356411934,
0.07254747301340103,
-0.019587431102991104,
0.04547703266143799,
-0.025416716933250427,
-0.15276116132736206,
-0.05974337086081505,
-0.12908735871315002,
-0.047681402415037155,
0.0052431137301027775,
-0.04008489102125168,
-0.028591016307473183,
0.03327663242816925,
0.033033762127161026,
0.22088100016117096,
-0.11262662708759308,
-0.09985443949699402,
-0.09403374791145325,
0.07134886085987091,
0.13400456309318542,
-0.09409646689891815,
0.029216405004262924,
0.018300462514162064,
0.050381872802972794,
-0.03987528011202812,
-0.07133274525403976,
0.03337433934211731,
-0.04773906245827675,
-0.0753830075263977,
-0.02528870850801468,
0.11304564774036407,
-0.0035061368253082037,
0.045241422951221466,
-0.00011522074782988057,
-0.08605802059173584,
-0.03744734078645706,
-0.12937043607234955,
-0.09174911677837372,
-0.000004747724233311601,
0.017797552049160004,
-0.002294564852491021,
-0.09954105317592621,
0.060919150710105896,
-0.0020896149799227715,
-0.08384457975625992,
0.06770379841327667,
0.15034811198711395,
-0.07077472656965256,
0.03277001157402992,
0.10471788793802261,
-0.04986077919602394,
-0.16349436342716217,
-0.03566944599151611,
0.04659277945756912,
0.08250921219587326,
-0.034522123634815216,
-0.14114372432231903,
0.07257944345474243,
0.036423876881599426,
0.021193046122789383,
0.028267666697502136,
-0.2745894491672516,
-0.12702055275440216,
0.02542799524962902,
0.08057666569948196,
0.05800590664148331,
-0.0967787578701973,
-0.04553568363189697,
-0.06372272223234177,
-0.07040302455425262,
0.0639662891626358,
0.059575099498033524,
0.11539522558450699,
-0.04444970563054085,
0.03034324012696743,
0.03674740716814995,
-0.02038075029850006,
0.07084129005670547,
-0.009621288627386093,
0.10097803175449371,
-0.021110540255904198,
0.036915626376867294,
0.04635542631149292,
-0.06345922499895096,
0.18869704008102417,
-0.17369598150253296,
0.08545985072851181,
-0.19334758818149567,
-0.05045131593942642,
-0.027623092755675316,
0.0066940924152731895,
-0.029835131019353867,
-0.05883248522877693,
-0.11598353832960129,
0.03459735959768295,
0.04147667437791824,
-0.021422158926725388,
0.049901556223630905,
-0.029104694724082947,
-0.04613036662340164,
0.0593879297375679,
0.08416292816400528,
-0.02508421801030636,
-0.1319732367992401,
0.03066585585474968,
0.026874925941228867,
0.10111629962921143,
-0.2219434678554535,
0.018437860533595085,
0.11053723841905594,
0.014603075571358204,
0.09758582711219788,
0.007116768043488264,
-0.08575272560119629,
0.03283633291721344,
0.07129659503698349,
-0.051483627408742905,
-0.08319582790136337,
-0.019253293052315712,
-0.043330494314432144,
-0.08253667503595352,
0.026156162843108177,
0.09183668345212936,
-0.06739436089992523,
-0.004970381502062082,
-0.0019966713152825832,
0.00758753577247262,
-0.06669700145721436,
0.18934467434883118,
0.03341937065124512,
0.08019544184207916,
-0.06447632610797882,
0.08698179572820663,
0.09755527228116989,
-0.10814135521650314,
0.01627320609986782,
0.1755920648574829,
-0.08884989470243454,
-0.022108003497123718,
0.05615612119436264,
0.0990145355463028,
-0.03304314613342285,
-0.06096656620502472,
-0.0880218967795372,
-0.07694981247186661,
0.025365492329001427,
0.03684242069721222,
0.07262348383665085,
0.08437928557395935,
-0.036157429218292236,
0.010252026841044426,
-0.10451837629079819,
0.09904096275568008,
0.07502555847167969,
0.05205903202295303,
-0.13758398592472076,
0.11636186391115189,
0.04707147553563118,
0.06987879425287247,
-0.002216119086369872,
0.026636479422450066,
-0.10709398239850998,
0.0428481288254261,
-0.026784684509038925,
0.04216398671269417,
-0.0025468007661402225,
0.05744267255067825,
-0.04616319015622139,
0.029337216168642044,
-0.029867667704820633,
0.051637135446071625,
-0.037710197269916534,
-0.028005290776491165,
-0.02507498301565647,
0.03558184579014778,
-0.06090163439512253,
-0.02230921760201454,
0.006898883264511824,
-0.08014838397502899,
0.09905955195426941,
-0.06844969093799591,
-0.009831669740378857,
0.010041584260761738,
-0.005234324838966131,
0.06182510405778885,
0.026807885617017746,
0.050090242177248,
-0.0026341683696955442,
0.003288226667791605,
0.04083273559808731,
0.015111887827515602,
-0.009845597669482231,
-0.003949205856770277,
0.04897982254624367,
-0.14936970174312592,
-0.07462592422962189,
-0.09361552447080612,
-0.06854196637868881,
-0.06688424199819565,
0.07872611284255981,
0.08753345161676407,
0.07878581434488297,
0.08515702933073044,
-0.031164584681391716,
-0.0010934325400739908,
-0.15870408713817596,
-0.03880183771252632,
0.04990847408771515,
-0.021641753613948822,
-0.10020345449447632,
-0.02972186729311943,
0.05767391622066498,
-0.037042662501335144,
0.11998151987791061,
0.004272348713129759,
0.06255416572093964,
-0.009773666970431805,
-0.03807198628783226,
-0.03018495999276638,
-0.000328471272950992,
0.18418477475643158,
-0.09596318751573563,
0.012810756452381611,
0.00780487759038806,
0.0003320563118904829,
0.039128515869379044,
0.16400046646595,
0.09352464973926544,
0.13776615262031555,
0.044859763234853745,
0.08689785748720169,
-0.04803408682346344,
-0.02369283139705658,
-0.15306079387664795,
0.07528773695230484,
-0.011060834862291813,
0.04080505296587944,
-0.03218620643019676,
0.12024881690740585,
0.1275663673877716,
-0.13619296252727509,
0.1002398282289505,
0.021040983498096466,
-0.09679391235113144,
-0.04757370054721832,
-0.10221040993928909,
-0.04655006527900696,
-0.11866758018732071,
0.0066779181361198425,
-0.10664422810077667,
0.015785040333867073,
0.08413395285606384,
0.04182115197181702,
-0.029450364410877228,
0.14259256422519684,
-0.007196999154984951,
-0.06236061453819275,
0.043976303189992905,
0.04052454233169556,
0.032848574221134186,
0.09935032576322556,
0.023469725623726845,
0.06682346761226654,
-0.068782277405262,
0.056399889290332794,
0.02096409536898136,
0.0072313775308430195,
0.010103762149810791,
0.0169456098228693,
-0.002297325525432825,
-0.045656755566596985,
0.0022568569984287024,
0.06626689434051514,
0.16531431674957275,
0.05474280193448067,
-0.051575466990470886,
-0.04703319072723389,
0.2074110507965088,
-0.0493948720395565,
-0.060500793159008026,
-0.12788724899291992,
0.15979984402656555,
0.042470309883356094,
0.015069695189595222,
0.012185869738459587,
-0.0669325590133667,
-0.032882336527109146,
0.23701710999011993,
0.03944922611117363,
-0.03247874975204468,
-0.03367854282259941,
-0.015675168484449387,
-0.012273775413632393,
-0.04365479201078415,
0.16049450635910034,
0.017470303922891617,
0.2307826280593872,
0.010848053731024265,
-0.018673332408070564,
-0.04553406313061714,
-0.042658962309360504,
-0.021788805723190308,
0.18718308210372925,
-0.028858929872512817,
0.029489923268556595,
-0.09486658126115799,
-0.010969644412398338,
0.013081087730824947,
-0.11917949467897415,
0.09761960804462433,
-0.12129361182451248,
-0.07879550755023956,
0.00159435102250427,
0.06432191282510757,
-0.033488474786281586,
0.03673435002565384,
-0.016202785074710846,
0.05607430636882782,
0.03996127098798752,
-0.03048589825630188,
-0.11263673007488251,
-0.1543281227350235,
0.04647161811590195,
0.007604599930346012,
0.13484041392803192,
0.011847411282360554,
0.06930641829967499,
0.081077441573143,
0.016722766682505608,
-0.07450230419635773,
0.10331650823354721,
0.028595808893442154,
-0.007917179726064205,
0.056951578706502914,
0.12058147042989731,
-0.038066599518060684,
0.15788006782531738,
0.008609645068645477,
-0.02601524069905281,
-0.029930882155895233,
-0.02130196988582611,
-0.005815183278173208,
-0.15195971727371216,
0.0014848588034510612,
-0.052429571747779846,
0.13297034800052643,
0.19611741602420807,
-0.038803547620773315,
-0.028093019500374794,
-0.055415283888578415,
0.07891742140054703,
-0.014938910491764545,
0.08151368051767349,
0.008466552942991257,
-0.15505197644233704,
0.000033934131351998076,
-0.0012248170096427202,
-0.004330406431108713,
-0.18113631010055542,
-0.04512559622526169,
-0.04192548617720604,
-0.03497164696455002,
-0.0849669873714447,
0.14477220177650452,
0.06442435830831528,
0.03831992670893669,
-0.036665178835392,
-0.15790531039237976,
-0.009861711412668228,
0.050748568028211594,
-0.13709260523319244,
-0.11742979288101196
] |
null | null |
transformers
|
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
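One possible way to produce that space-separated tokenized form from raw java source, sketched here with the third-party `javalang` lexer (an assumption; the exact preprocessing pipeline is not documented in this card), is:
```python
import javalang  # pip install javalang

def tokenize_java(code: str) -> str:
    # Lex the source and join the token values with single spaces,
    # approximating the format shown in the example below.
    tokens = javalang.tokenizer.tokenize(code)
    return " ".join(token.value for token in tokens)

print(tokenize_java("protected String renderUri(URI uri) { return uri.toASCIIString(); }"))
# -> protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }
```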
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Load the multi-task checkpoint and its SentencePiece tokenizer from the Hugging Face Hub.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask", skip_special_tokens=True),
    device=0,  # first GPU; use device=-1 to run on CPU
)

# The input is a java function with its tokens separated by single spaces.
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/code%20comment%20generation/large_model.ipynb).
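If you prefer to call the model directly rather than through the pipeline, a minimal equivalent sketch (generation settings here are assumptions, not the pipeline's exact defaults) is:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

checkpoint = "SEBIS/code_trans_t5_large_code_comment_generation_java_multitask"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelWithLMHead.from_pretrained(checkpoint)

inputs = tokenizer(
    "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }",
    return_tensors="pt",
)
# num_beams / max_length are illustrative choices.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```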
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
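For reference, the inverse square root schedule commonly used with T5-style pre-training keeps the learning rate flat for a warmup window and then decays it as 1/sqrt(step); a tiny sketch (the warmup length is an assumed value):
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant 1/sqrt(warmup_steps) during warmup, then 1/sqrt(step) decay,
    # following the schedule described in the T5 paper.
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1))       # 0.01 while still in warmup
print(inverse_sqrt_lr(40_000))  # 0.005 once decay has kicked in
```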
## Evaluation results
For the code comment generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_comment_generation_java_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code comment generation task, the different models achieve the following results (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12548638880252838,
-0.023206453770399094,
-0.00040023820474743843,
0.13166682422161102,
0.10537854582071304,
0.023492665961384773,
0.058611106127500534,
0.06601612269878387,
-0.026279471814632416,
0.01916593313217163,
0.04424285516142845,
0.009360011667013168,
0.03237054497003555,
0.19702981412410736,
0.00805832352489233,
-0.11808042973279953,
-0.014320529997348785,
0.0444006510078907,
-0.03665884956717491,
0.12811361253261566,
0.0942813977599144,
-0.0743633434176445,
0.05363839119672775,
-0.06978189200162888,
-0.2451879382133484,
0.0605289451777935,
-0.0060204192996025085,
-0.06380905210971832,
0.09934723377227783,
0.0460575632750988,
0.1253545582294464,
-0.004804073832929134,
0.021847113966941833,
-0.14253480732440948,
0.010993115603923798,
0.011309687979519367,
0.03348323330283165,
0.016816522926092148,
0.045628659427165985,
0.05441629886627197,
0.14078684151172638,
0.010172856040298939,
0.04176173359155655,
0.06084515154361725,
-0.07541649788618088,
-0.11707500368356705,
-0.006993278861045837,
0.024060232564806938,
0.050124697387218475,
0.10124034434556961,
-0.012621933594346046,
0.12281008809804916,
-0.1506541669368744,
0.1274256408214569,
0.10174129903316498,
-0.2177504152059555,
-0.012524150311946869,
0.12550398707389832,
0.09028070420026779,
0.09769792854785919,
-0.0606808140873909,
-0.06723293662071228,
0.1028570905327797,
0.052422769367694855,
0.045990195125341415,
-0.10111536085605621,
-0.11523126065731049,
0.02398029714822769,
-0.07452365010976791,
-0.06423142552375793,
0.22114476561546326,
0.0017140486743301153,
-0.07655224949121475,
-0.05428304150700569,
-0.025411225855350494,
-0.13284681737422943,
0.03658655658364296,
0.027273310348391533,
0.008660121820867062,
-0.033011697232723236,
0.016803039237856865,
0.03060845658183098,
-0.0736815482378006,
-0.15564016997814178,
0.027216678485274315,
0.09252265095710754,
0.05590161308646202,
0.02519376203417778,
-0.09652320295572281,
0.10433264821767807,
0.03697952255606651,
-0.060091063380241394,
-0.026257608085870743,
-0.01774417608976364,
-0.10328368842601776,
0.03365856409072876,
-0.05178293213248253,
-0.18062447011470795,
0.016570381820201874,
0.00869164988398552,
-0.05156320333480835,
0.05137861147522926,
0.028028329834342003,
0.03625360503792763,
0.02305658534169197,
0.1964961439371109,
0.05725817754864693,
-0.12040702253580093,
0.05317219719290733,
0.04351509362459183,
-0.036783672869205475,
-0.005077334586530924,
-0.06851599365472794,
-0.09645543247461319,
0.09304969012737274,
0.10361040383577347,
-0.1387166827917099,
0.03611118718981743,
-0.07046212255954742,
-0.04366922006011009,
0.0028883812483400106,
-0.15942886471748352,
0.0030550952069461346,
0.027072271332144737,
-0.06720653176307678,
-0.05633198469877243,
0.09224288910627365,
-0.1701134294271469,
-0.1500854641199112,
-0.044939491897821426,
-0.08082349598407745,
-0.0402892604470253,
-0.16706545650959015,
-0.1559169441461563,
-0.009464728645980358,
-0.03747759759426117,
0.01849602535367012,
-0.08566874265670776,
-0.15826745331287384,
-0.025850877165794373,
0.01841120794415474,
0.004061270039528608,
-0.0026109765749424696,
-0.07854607701301575,
-0.009494432248175144,
-0.029169192537665367,
-0.03834915533661842,
0.015285882167518139,
-0.04795487970113754,
0.1209389865398407,
0.10300295799970627,
0.05322597175836563,
-0.022783398628234863,
0.06048212945461273,
-0.08099368959665298,
0.06451636552810669,
-0.11474058777093887,
0.0946328341960907,
-0.059612952172756195,
0.07866121828556061,
-0.0341133177280426,
-0.10533427447080612,
0.07400712370872498,
0.062236238270998,
0.06532324850559235,
0.03454083204269409,
-0.1388130933046341,
-0.02460443414747715,
0.18752217292785645,
-0.12371153384447098,
-0.13674281537532806,
0.10039462894201279,
-0.03888971358537674,
0.08233746141195297,
0.08215697854757309,
0.14312149584293365,
0.14816880226135254,
-0.02413291297852993,
0.022483564913272858,
0.050465624779462814,
0.044962331652641296,
-0.13206714391708374,
0.07890917360782623,
0.06598944962024689,
-0.0878089964389801,
0.06190101057291031,
-0.017029128968715668,
0.0985957458615303,
-0.010410722345113754,
-0.024932630360126495,
-0.05128573998808861,
-0.07857146114110947,
-0.005856039933860302,
0.008472708985209465,
0.06518582254648209,
-0.08302098512649536,
-0.05986651033163071,
0.09026704728603363,
0.17366765439510345,
-0.13185682892799377,
-0.002335302298888564,
-0.08070673793554306,
0.035549670457839966,
-0.07613272219896317,
0.029021499678492546,
-0.16287179291248322,
0.03386888653039932,
0.07872220873832703,
-0.02723066322505474,
0.05333145335316658,
0.13030414283275604,
0.01228602696210146,
0.04359062388539314,
0.0011039481032639742,
-0.014153252355754375,
-0.12164393812417984,
-0.056533195078372955,
-0.06168186664581299,
-0.06353763490915298,
-0.0889904648065567,
-0.06069660559296608,
-0.03714396804571152,
-0.19482961297035217,
0.010324802249670029,
0.003463613335043192,
0.002912447787821293,
0.02738959714770317,
-0.012944204732775688,
0.029651232063770294,
0.07602662593126297,
-0.06061306223273277,
-0.03732529655098915,
0.03186825290322304,
0.02457430399954319,
-0.04090133309364319,
-0.05525984615087509,
-0.08309172838926315,
0.004978462588042021,
0.10746024549007416,
0.04257846623659134,
-0.0789351537823677,
0.022962965071201324,
-0.021383747458457947,
-0.04911746084690094,
0.008831126615405083,
-0.06299926340579987,
0.14334657788276672,
-0.005875798873603344,
0.1986861675977707,
-0.1645916998386383,
-0.038908783346414566,
-0.024462811648845673,
0.024794723838567734,
0.062032025307416916,
0.13758884370326996,
-0.014214838854968548,
-0.08756911009550095,
0.06606513261795044,
0.017089368775486946,
-0.10018716752529144,
0.2337431162595749,
-0.046762917190790176,
-0.09389284253120422,
0.02200363762676716,
0.10122436285018921,
-0.017658477649092674,
0.16759373247623444,
-0.20116323232650757,
-0.028308914974331856,
0.01685991883277893,
0.007690170779824257,
0.06656213849782944,
-0.12666431069374084,
0.0037372226361185312,
0.009143571369349957,
-0.07329627126455307,
-0.06863950192928314,
-0.007125280797481537,
-0.006441584322601557,
0.03739137575030327,
-0.007719877175986767,
-0.029858402907848358,
0.016667181625962257,
-0.039990417659282684,
-0.10689879953861237,
0.22015513479709625,
-0.09688085317611694,
-0.22027625143527985,
-0.20437470078468323,
0.11411119252443314,
-0.06106545031070709,
-0.012792224995791912,
0.03534943237900734,
-0.07922639697790146,
-0.055514492094516754,
-0.05763978883624077,
0.16998159885406494,
-0.0627465769648552,
-0.012903217226266861,
-0.015224677510559559,
0.07710034400224686,
0.010055255144834518,
-0.20970001816749573,
0.03486260771751404,
-0.004081244580447674,
-0.01361369714140892,
0.004815289285033941,
-0.10118090361356735,
0.09084023535251617,
0.15373441576957703,
-0.08144190162420273,
0.020727919414639473,
0.007250803057104349,
0.1883137822151184,
-0.038323286920785904,
-0.05665279179811478,
0.14061303436756134,
-0.018559331074357033,
-0.00992836058139801,
0.015010056085884571,
-0.013384472578763962,
-0.09814487397670746,
0.06389088928699493,
-0.00911492109298706,
-0.02512991800904274,
-0.2731406092643738,
-0.009510872885584831,
-0.07944662123918533,
0.05674716457724571,
0.03756942227482796,
0.04217253997921944,
-0.08793394267559052,
0.027865702286362648,
0.0604763962328434,
0.1503075510263443,
-0.006078604143112898,
0.05347444862127304,
0.05767318978905678,
-0.0006212303414940834,
0.007787466514855623,
-0.10019182413816452,
0.012248250655829906,
0.07340269535779953,
0.11104846745729446,
0.27158936858177185,
-0.09944004565477371,
0.19871264696121216,
0.048957888036966324,
0.048270002007484436,
0.04927702248096466,
0.1327913999557495,
-0.1314605325460434,
0.031871747225522995,
0.0029969781171530485,
-0.009297086857259274,
-0.10964042693376541,
0.009469457902014256,
-0.06663809716701508,
0.09030003100633621,
-0.10512405633926392,
-0.05792700871825218,
0.00999071728438139,
0.14734628796577454,
0.04115753993391991,
-0.2252698391675949,
-0.12941326200962067,
0.02065569907426834,
-0.09566820412874222,
-0.10572127997875214,
0.06521912664175034,
0.24527227878570557,
-0.07753432542085648,
-0.04076993837952614,
-0.00428276089951396,
0.13345623016357422,
-0.03696349635720253,
-0.02212725766003132,
-0.03629917651414871,
0.06423921883106232,
0.017216825857758522,
0.1365906447172165,
-0.29292234778404236,
0.1317068338394165,
-0.009369606152176857,
0.06204962357878685,
-0.03018822707235813,
0.04866114631295204,
-0.03788425028324127,
0.07511239498853683,
0.03891889750957489,
-0.00961056724190712,
0.03238922357559204,
-0.1586967259645462,
0.015057162381708622,
0.04170162230730057,
0.018533872440457344,
0.056004736572504044,
0.0633159875869751,
-0.0035809550900012255,
0.05785758048295975,
-0.01868586614727974,
-0.12285591661930084,
-0.07268207520246506,
-0.06704980880022049,
-0.017976492643356323,
-0.029949380084872246,
-0.015376786701381207,
-0.04477902129292488,
-0.010141361504793167,
0.07891125231981277,
0.18380455672740936,
-0.09503772854804993,
-0.0768137127161026,
-0.07473059743642807,
0.053733523935079575,
0.10879552364349365,
-0.08159428089857101,
0.029179561883211136,
-0.0028460666071623564,
0.04637260362505913,
-0.00843376200646162,
-0.07533251494169235,
0.05176239088177681,
-0.03904379904270172,
-0.06905204057693481,
-0.012110895477235317,
0.06260746717453003,
0.0003081902686972171,
0.026976531371474266,
0.012170192785561085,
-0.09456928074359894,
-0.04419291764497757,
-0.1203019767999649,
-0.12726399302482605,
-0.0416005477309227,
0.017504438757896423,
0.04195023328065872,
-0.14680759608745575,
-0.05596623569726944,
0.0037223820108920336,
-0.03951868414878845,
0.13081705570220947,
0.15612013638019562,
-0.05562886968255043,
0.0308695025742054,
0.14689798653125763,
-0.06159733980894089,
-0.19097794592380524,
0.03334030508995056,
0.04532918334007263,
0.11983179301023483,
-0.041953083127737045,
-0.16510683298110962,
0.049243126064538956,
0.020197793841362,
0.03608066588640213,
0.04962373152375221,
-0.31048017740249634,
-0.12409603595733643,
0.08193504065275192,
0.15969400107860565,
0.11786392331123352,
-0.12206515669822693,
-0.03913867101073265,
-0.0646962895989418,
-0.1588725447654724,
0.09461890906095505,
-0.050681766122579575,
0.1327894777059555,
-0.07512626051902771,
0.02729206718504429,
0.034447964280843735,
-0.04503226280212402,
0.07296122610569,
0.03202832117676735,
0.12062075734138489,
-0.04199871048331261,
0.01798434928059578,
0.12876936793327332,
-0.03325449302792549,
0.1827925741672516,
-0.14667706191539764,
0.0981270894408226,
-0.2320140302181244,
-0.05909643694758415,
-0.07570286095142365,
0.003676427761092782,
-0.034707166254520416,
-0.04634843021631241,
-0.07579302042722702,
0.0321839302778244,
-0.0026376431342214346,
-0.007087404374033213,
0.0418294258415699,
-0.030852127820253372,
-0.018284611403942108,
0.10606520622968674,
0.10552047193050385,
-0.015763619914650917,
-0.06798718124628067,
0.05458712577819824,
0.05092126876115799,
0.1140475794672966,
-0.19393129646778107,
0.029831325635313988,
0.10351583361625671,
0.01630527153611183,
0.12435026466846466,
0.04431544989347458,
-0.10381140559911728,
0.04214697703719139,
0.0879165381193161,
-0.07558809220790863,
-0.06534702330827713,
-0.021311957389116287,
-0.07754036039113998,
-0.0678507387638092,
0.050333354622125626,
0.09402567148208618,
-0.050410881638526917,
-0.020146479830145836,
-0.02445336803793907,
-0.020087402313947678,
-0.11228790879249573,
0.18611100316047668,
0.0750415250658989,
0.08506657183170319,
-0.06628236174583435,
0.06372818350791931,
0.08451725542545319,
-0.08487076312303543,
0.00852136965841055,
0.18761801719665527,
-0.10350804775953293,
-0.047741688787937164,
0.07265769690275192,
0.22023707628250122,
-0.02741912566125393,
-0.05952088534832001,
-0.13984405994415283,
-0.07664509117603302,
0.03196634724736214,
0.16560149192810059,
0.10122428089380264,
0.0951787456870079,
-0.0275924913585186,
-0.0008535431115888059,
-0.1068527102470398,
0.0926280990242958,
0.06417201459407806,
0.04958425834774971,
-0.10628190636634827,
0.1300741285085678,
0.03911514952778816,
0.12115852534770966,
-0.02705628052353859,
-0.011716925539076328,
-0.1384083777666092,
0.06367865204811096,
-0.11142557859420776,
0.03500971570611,
-0.009991473518311977,
0.050631023943424225,
-0.02434368059039116,
0.0027568326331675053,
-0.03167230635881424,
0.06853460520505905,
-0.08171895891427994,
0.0009077530703507364,
0.0031853849068284035,
0.05617361143231392,
-0.05194231867790222,
-0.01806519366800785,
0.033428650349378586,
-0.09166742116212845,
0.12301453202962875,
-0.03706768900156021,
-0.028687680140137672,
0.08016999065876007,
-0.048417653888463974,
0.04053641855716705,
0.014888577163219452,
0.04933501034975052,
0.02014060690999031,
0.01410489808768034,
0.07746749371290207,
0.037292130291461945,
0.05376189947128296,
0.024444036185741425,
0.12152566760778427,
-0.139346182346344,
-0.08540999889373779,
-0.05597332864999771,
-0.11422773450613022,
-0.0564475879073143,
0.10112614929676056,
0.04787144809961319,
0.10447303205728531,
0.09117636829614639,
-0.031887587159872055,
0.009519086219370365,
-0.12698447704315186,
-0.06626151502132416,
0.028433281928300858,
-0.029991768300533295,
-0.08537594228982925,
-0.05540292337536812,
0.03811514750123024,
-0.0321565605700016,
0.12282651662826538,
0.018209874629974365,
0.03758823126554489,
-0.01984015665948391,
-0.060554388910532,
-0.014178615063428879,
0.02139859087765217,
0.21412380039691925,
-0.08508855849504471,
0.04134315252304077,
0.0019842307083308697,
0.01542278565466404,
0.0068667009472846985,
0.11770077794790268,
0.11986161768436432,
0.16690166294574738,
-0.03327365964651108,
0.10057161748409271,
0.01891348697245121,
-0.001540541066788137,
-0.07483039796352386,
0.019299428910017014,
0.02210937812924385,
0.0629020556807518,
-0.04859583452343941,
0.18767814338207245,
0.09229663759469986,
-0.12358422577381134,
0.11078134179115295,
0.025089917704463005,
-0.13278256356716156,
-0.033616967499256134,
0.020733730867505074,
-0.03637649863958359,
-0.1477418839931488,
0.024540776386857033,
-0.12999223172664642,
-0.0159057155251503,
0.05207262560725212,
0.05130964145064354,
-0.07863007485866547,
0.1721831113100052,
0.03550204634666443,
-0.060441430658102036,
0.054101213812828064,
-0.0016120373038575053,
0.026370340958237648,
0.021376056596636772,
0.036595419049263,
0.03653668984770775,
-0.03775867074728012,
0.0369718037545681,
0.02409937232732773,
-0.023840852081775665,
-0.018406283110380173,
-0.021523145958781242,
-0.001813453738577664,
-0.015912864357233047,
0.01902829483151436,
0.05591977760195732,
0.1593467742204666,
0.03670182451605797,
-0.07537409663200378,
-0.017521729692816734,
0.1734268218278885,
-0.02832052670419216,
-0.09570615738630295,
-0.12664321064949036,
0.131278395652771,
0.052413858473300934,
0.010859078727662563,
0.026279497891664505,
-0.08203138411045074,
-0.05412648990750313,
0.20698027312755585,
0.05381216108798981,
-0.0313495472073555,
-0.022954575717449188,
0.008130757138133049,
-0.001885650446638465,
-0.04128474369645119,
0.20304350554943085,
0.022568093612790108,
0.22685691714286804,
0.02260131761431694,
-0.009256118908524513,
-0.06884130835533142,
-0.04062306508421898,
0.00291352323256433,
0.11914955824613571,
-0.038572512567043304,
-0.039733339101076126,
-0.08329420536756516,
-0.003751038573682308,
-0.0015000030398368835,
-0.07887066900730133,
0.09868459403514862,
-0.1369459182024002,
-0.09819568693637848,
-0.049443308264017105,
0.04896454140543938,
-0.058650724589824677,
0.01653250679373741,
-0.024533506482839584,
0.04459032788872719,
0.06955583393573761,
-0.03208504617214203,
-0.10116046667098999,
-0.1692589670419693,
0.09537648409605026,
-0.052629292011260986,
0.13274429738521576,
-0.0161448922008276,
0.15420901775360107,
0.08547938615083694,
0.02606341801583767,
-0.06345488131046295,
0.11484677344560623,
0.03206425532698631,
0.05964043736457825,
0.048431117087602615,
0.12216595560312271,
-0.050347890704870224,
0.13664619624614716,
-0.049840036779642105,
-0.029782485216856003,
-0.02762095257639885,
-0.07413607090711594,
-0.018030207604169846,
-0.16388940811157227,
-0.019610511139035225,
-0.09573516994714737,
0.09399913251399994,
0.1948009431362152,
-0.04359893128275871,
-0.030525928363204002,
-0.0925375446677208,
0.10898156464099884,
-0.011269100941717625,
0.0659298449754715,
-0.031775232404470444,
-0.17424800992012024,
-0.0003197411133442074,
0.011981070972979069,
0.013876440934836864,
-0.27664050459861755,
-0.00748731754720211,
-0.038738466799259186,
-0.028373152017593384,
-0.08582613617181778,
0.16009576618671417,
0.0881817415356636,
0.049531590193510056,
-0.04093445464968681,
-0.16064314544200897,
-0.036301229149103165,
0.058112721890211105,
-0.13928526639938354,
-0.14502987265586853
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code comment generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_comment_generation_java_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/code%20comment%20generation/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
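This optimizer setup can be approximated with the `Adafactor` implementation that ships with `transformers`. The sketch below is a minimal setup under assumed flags (the card does not give the exact hyperparameters), relying on Adafactor's built-in inverse square root decay:
```python
# Minimal sketch, not the authors' exact configuration: Adafactor with its
# internal inverse square root schedule (relative_step=True) and warmup.
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("t5-large")
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,  # scale updates by the parameter RMS
    relative_step=True,    # learning rate decays proportionally to 1/sqrt(step)
    warmup_init=True,      # ramp the rate up over the first steps
    lr=None,               # let the internal schedule supply the rate
)
```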
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 25,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
## Evaluation results
For the code documentation tasks, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
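To compare your own outputs against these numbers, a corpus-level BLEU scorer such as `sacrebleu` can serve as an approximation; note that the original evaluation may use a different BLEU variant or tokenization, and the prediction/reference pair below is purely illustrative:
```python
# Illustrative scoring only; the paper's exact BLEU setup may differ.
import sacrebleu  # pip install sacrebleu

predictions = ["renders the given uri as an ascii string ."]
references = [["render the uri as an ascii string ."]]  # one inner list per reference stream
print(sacrebleu.corpus_bleu(predictions, references).score)
```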
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_comment_generation_java_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code comment generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 25,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 25,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 25,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 25,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10010585933923721,
0.07614484429359436,
-0.0012888346100226045,
0.10220441967248917,
0.04514016956090927,
0.01701199822127819,
0.023962073028087616,
0.10443686693906784,
-0.01510937511920929,
0.07044023275375366,
0.060023434460163116,
-0.0587029829621315,
0.05398523807525635,
0.19680583477020264,
0.020984133705496788,
-0.16131439805030823,
-0.0332355760037899,
0.028536377474665642,
-0.05833815410733223,
0.10840102285146713,
0.07846006006002426,
-0.08438754826784134,
0.07162156701087952,
-0.054158713668584824,
-0.12501421570777893,
0.04540542513132095,
-0.030866803601384163,
-0.030095309019088745,
0.08848267793655396,
0.05208614096045494,
0.1018572449684143,
-0.02423371560871601,
0.05545879527926445,
-0.1977214217185974,
0.0022052577696740627,
0.02518739178776741,
0.06634023785591125,
0.03582247346639633,
0.05304532125592232,
0.09051348268985748,
0.12986284494400024,
-0.023415235802531242,
0.03537733107805252,
0.06395187228918076,
-0.062218088656663895,
-0.061947911977767944,
-0.05857812240719795,
0.0636664628982544,
0.08723507076501846,
0.1045975610613823,
-0.010793045163154602,
0.01424830686300993,
-0.08305501192808151,
0.08150756359100342,
0.1190609559416771,
-0.21369804441928864,
-0.02513045072555542,
0.12055741250514984,
0.09023971855640411,
0.059752266854047775,
-0.08957214653491974,
-0.03319671377539635,
0.10756386816501617,
0.04703632742166519,
0.07706838846206665,
-0.09919114410877228,
-0.04291006177663803,
-0.0005213223630562425,
-0.04921044781804085,
-0.05007103830575943,
0.15547284483909607,
0.049194470047950745,
-0.0491161122918129,
-0.11687922477722168,
-0.04891454428434372,
-0.18864694237709045,
0.043358106166124344,
0.01281219907104969,
0.01236786600202322,
-0.012750337831676006,
0.02287842333316803,
-0.01840459555387497,
-0.08718989789485931,
-0.11432302743196487,
0.045269887894392014,
0.005076234228909016,
0.06013307347893715,
0.030733196064829826,
-0.016119860112667084,
0.07704885303974152,
0.011407644487917423,
-0.053936734795570374,
-0.016689129173755646,
0.011283181607723236,
-0.1043296828866005,
0.02105359546840191,
0.003975117579102516,
-0.06034018099308014,
0.0009614158771000803,
0.0531112365424633,
-0.11152074486017227,
0.07415999472141266,
0.10374920815229416,
0.01274841744452715,
0.0051902057603001595,
0.21011915802955627,
0.03801574558019638,
-0.1482626050710678,
0.021153829991817474,
0.03304077312350273,
-0.00019568360585253686,
0.01422265637665987,
-0.04852727800607681,
-0.05089686065912247,
0.011589260771870613,
0.07494065910577774,
-0.13905653357505798,
0.009474694728851318,
-0.05040498077869415,
-0.015440678223967552,
0.09725136309862137,
-0.11494945734739304,
0.02703329548239708,
0.013801729306578636,
-0.06387206166982651,
-0.05175132676959038,
0.07099056243896484,
-0.13105109333992004,
-0.12003331631422043,
0.02887408807873726,
-0.049358393996953964,
-0.036229804158210754,
-0.11789070814847946,
-0.11111370474100113,
0.0007416774169541895,
-0.06041847541928291,
0.0005398469511419535,
-0.09328928589820862,
-0.09453136473894119,
-0.030107522383332253,
0.029920199885964394,
-0.010853773914277554,
-0.03529176115989685,
-0.05067635700106621,
0.0064542703330516815,
-0.00936890672892332,
-0.03221564739942551,
0.03490586206316948,
-0.03680926561355591,
0.09704241901636124,
0.07259968668222427,
0.05315184220671654,
0.015660258010029793,
0.031997282058000565,
-0.09094863384962082,
0.08176477998495102,
-0.11929736286401749,
0.052595265209674835,
-0.013796267099678516,
0.07450824230909348,
-0.10202284902334213,
-0.08502383530139923,
0.01432652398943901,
0.05027076229453087,
0.06163327023386955,
0.025691263377666473,
-0.11573592573404312,
0.026805108413100243,
0.14677311480045319,
-0.09679754078388214,
-0.1361386477947235,
0.10584639012813568,
0.0011630505323410034,
0.033397626131772995,
0.059167761355638504,
0.14325165748596191,
0.1417902261018753,
-0.08294808119535446,
-0.03193184733390808,
0.07786759734153748,
0.05883770436048508,
-0.08891469985246658,
0.06253558397293091,
0.014668439514935017,
0.015476705506443977,
0.030333975329995155,
0.05771905183792114,
0.062123194336891174,
0.0037053448613733053,
-0.03224950283765793,
-0.03926922380924225,
-0.09198395907878876,
-0.059621911495923996,
-0.0016500793863087893,
0.020244359970092773,
-0.06402947008609772,
-0.05763210728764534,
-0.0015496499836444855,
0.16878746449947357,
-0.09573102742433548,
0.017466727644205093,
-0.06079690158367157,
-0.046225666999816895,
-0.08235485106706619,
0.028016218915581703,
-0.11378302425146103,
0.02987753413617611,
0.06469014286994934,
-0.04064697399735451,
0.04363773390650749,
0.09723472595214844,
0.0037772459909319878,
0.02541045844554901,
-0.05212467536330223,
-0.04274694249033928,
-0.04352809116244316,
-0.07368253916501999,
-0.105931855738163,
-0.02200234681367874,
-0.09045803546905518,
-0.02430214360356331,
-0.06261274963617325,
-0.17720821499824524,
-0.0014297474408522248,
-0.013106358237564564,
0.025004293769598007,
0.025557082146406174,
-0.0321526937186718,
0.05763138458132744,
0.052272018045186996,
-0.03493024408817291,
-0.08486830443143845,
0.025992870330810547,
0.031380727887153625,
-0.0690387636423111,
-0.0207191314548254,
-0.093287892639637,
-0.05726519227027893,
0.06770579516887665,
0.0945655107498169,
-0.09841372072696686,
-0.0014147841138765216,
-0.02932901307940483,
-0.05530513823032379,
-0.059509165585041046,
-0.04915124177932739,
0.1480722278356552,
0.015824779868125916,
0.16826677322387695,
-0.14595869183540344,
-0.06458552926778793,
-0.018180079758167267,
0.01078985445201397,
0.031786806881427765,
0.1613295078277588,
0.013363842852413654,
-0.10366886854171753,
0.04105130955576897,
-0.029114607721567154,
-0.04958441108465195,
0.17353051900863647,
-0.015410171821713448,
-0.08083881437778473,
0.003748713992536068,
0.11084011942148209,
-0.0176736768335104,
0.16946937143802643,
-0.0771801546216011,
-0.001725094043649733,
-0.005196587648242712,
0.016923803836107254,
0.03700568899512291,
-0.11851571500301361,
0.017341256141662598,
0.03655324503779411,
-0.07762909680604935,
-0.02365289069712162,
-0.009455438703298569,
-0.03775407001376152,
0.03367367014288902,
0.024481065571308136,
0.021075570955872536,
-0.019239794462919235,
-0.029870254918932915,
-0.09816139936447144,
0.18009322881698608,
-0.06936383247375488,
-0.2235124111175537,
-0.16837315261363983,
0.10432875156402588,
-0.029383331537246704,
-0.022631552070379257,
0.03331596776843071,
-0.10496865212917328,
-0.06113606318831444,
-0.0952327772974968,
0.1191297173500061,
-0.10962474346160889,
-0.0075226956978440285,
-0.029630403965711594,
0.06773247569799423,
0.05587493255734444,
-0.16113340854644775,
0.030487503856420517,
-0.013613808900117874,
0.019971929490566254,
-0.02270420826971531,
-0.04767582565546036,
0.08759907633066177,
0.10643434524536133,
-0.06893909722566605,
0.023431234061717987,
0.0013525362592190504,
0.16865788400173187,
-0.04837885499000549,
0.039387188851833344,
0.1908004879951477,
0.0357937291264534,
0.020014531910419464,
0.0502389520406723,
0.01235999632626772,
-0.0936523899435997,
0.06722088158130646,
0.0615701787173748,
-0.029730109497904778,
-0.22929923236370087,
-0.01643584668636322,
-0.07147561758756638,
0.06998290121555328,
0.11771839112043381,
0.05736261606216431,
-0.13957329094409943,
0.019963620230555534,
-0.002312158700078726,
0.15638500452041626,
-0.043178871273994446,
0.05319284647703171,
0.02477158233523369,
0.01149663794785738,
-0.008858650922775269,
-0.10038801282644272,
0.01707923226058483,
0.07395746558904648,
0.10852780938148499,
0.20219533145427704,
-0.09137864410877228,
0.19492587447166443,
0.020499350503087044,
0.1035703644156456,
0.03441552817821503,
0.08144878596067429,
-0.1347004771232605,
0.005805621854960918,
0.009812863543629646,
-0.0211145281791687,
-0.06135522201657295,
0.045075733214616776,
-0.04059801623225212,
0.07933726906776428,
-0.0642351508140564,
0.004565378651022911,
0.014188513159751892,
0.20995348691940308,
0.0700080394744873,
-0.1619933843612671,
-0.1309041827917099,
0.01706613227725029,
-0.09864959865808487,
-0.11658685654401779,
0.07418451458215714,
0.24174164235591888,
-0.052080392837524414,
0.00740091223269701,
-0.015009366907179356,
0.1332256942987442,
-0.10326322913169861,
-0.014808692038059235,
0.03926386684179306,
0.07593332231044769,
0.013968810439109802,
0.12371961027383804,
-0.26083987951278687,
0.08689907193183899,
0.017994623631238937,
0.08869557082653046,
-0.012170692905783653,
0.0499209426343441,
-0.052315618842840195,
-0.004867885261774063,
0.08124306797981262,
0.0057825371623039246,
-0.04387740045785904,
-0.19953790307044983,
-0.044639796018600464,
0.023989001289010048,
0.041216958314180374,
-0.015284441411495209,
0.08044902980327606,
-0.014050355181097984,
0.033065084367990494,
-0.03724852576851845,
-0.1456250250339508,
-0.06650656461715698,
-0.11908724159002304,
-0.04075635224580765,
0.017299043014645576,
-0.05197109654545784,
-0.02187172882258892,
0.038817960768938065,
0.04701961949467659,
0.22429589927196503,
-0.14315126836299896,
-0.08475884795188904,
-0.08825253695249557,
0.06692443042993546,
0.13666993379592896,
-0.08692264556884766,
0.020185237750411034,
0.01632995717227459,
0.0490826852619648,
-0.036202333867549896,
-0.05502421408891678,
0.0319368802011013,
-0.05340156704187393,
-0.08225248008966446,
-0.028218241408467293,
0.10658404231071472,
-0.019920213147997856,
0.04186755791306496,
-0.004837024956941605,
-0.08201148360967636,
-0.03497005254030228,
-0.12932324409484863,
-0.07874844968318939,
-0.009235596284270287,
0.050937507301568985,
-0.026311665773391724,
-0.11287683993577957,
0.09933598339557648,
0.007149845827370882,
-0.09410252422094345,
0.06994202733039856,
0.16005252301692963,
-0.07410185039043427,
0.0290555227547884,
0.09953689575195312,
-0.04796195775270462,
-0.1655418872833252,
-0.028608068823814392,
0.034751977771520615,
0.08853520452976227,
-0.04287584498524666,
-0.14439016580581665,
0.057019367814064026,
0.0154745914041996,
0.0160959605127573,
0.01955537125468254,
-0.28152361512184143,
-0.12659044563770294,
-0.004965372383594513,
0.08091969788074493,
0.051726385951042175,
-0.09809106588363647,
-0.05120362341403961,
-0.05590124428272247,
-0.0550667941570282,
0.05334959179162979,
0.06138881668448448,
0.11645425111055374,
-0.049812525510787964,
0.022014252841472626,
0.03938418626785278,
-0.030622901394963264,
0.06354033201932907,
-0.017526542767882347,
0.0956811010837555,
-0.016497228294610977,
0.03145427256822586,
0.04104023054242134,
-0.0651993677020073,
0.20179051160812378,
-0.17505741119384766,
0.09543288499116898,
-0.18580715358257294,
-0.048829127103090286,
-0.035919830203056335,
-0.009019522927701473,
-0.03655733913183212,
-0.050235021859407425,
-0.10352163761854172,
0.029583849012851715,
0.04183760657906532,
-0.024116165935993195,
0.04564560577273369,
-0.025503914803266525,
-0.04554852843284607,
0.0783756673336029,
0.0771414265036583,
-0.016154935583472252,
-0.13946054875850677,
0.03457989543676376,
0.02603849023580551,
0.09670957922935486,
-0.2185061126947403,
0.010989371687173843,
0.10741784423589706,
0.01650845818221569,
0.09624219685792923,
-0.0054466440342366695,
-0.08353206515312195,
0.0280433502048254,
0.06483551114797592,
-0.07003513723611832,
-0.09419690072536469,
-0.02040190063416958,
-0.04895331710577011,
-0.09688732773065567,
0.023547060787677765,
0.08376219868659973,
-0.07008563727140427,
-0.012754280120134354,
-0.007921366952359676,
0.01176532357931137,
-0.0702485740184784,
0.1860254853963852,
0.02715381793677807,
0.08841590583324432,
-0.06237797811627388,
0.08655860275030136,
0.10340514034032822,
-0.1257822960615158,
0.02517080120742321,
0.18598049879074097,
-0.08830477297306061,
-0.0292365625500679,
0.07574982196092606,
0.1099613755941391,
-0.008451337926089764,
-0.05502980202436447,
-0.10264571011066437,
-0.07361530512571335,
0.029297547414898872,
0.04957902804017067,
0.0648065060377121,
0.08455315977334976,
-0.028459738940000534,
-0.0026479216758161783,
-0.12497629970312119,
0.10422375798225403,
0.07046180963516235,
0.04727683216333389,
-0.13952158391475677,
0.11513141542673111,
0.04203678295016289,
0.09065191447734833,
0.004429756663739681,
0.03191079944372177,
-0.09797371178865433,
0.03581620380282402,
-0.03364661708474159,
0.03655499219894409,
-0.0050482009537518024,
0.04507371783256531,
-0.04865339770913124,
0.04026828333735466,
-0.026992013677954674,
0.04923158138990402,
-0.03545616194605827,
-0.030786430463194847,
-0.03739519417285919,
0.04051569476723671,
-0.0584610253572464,
-0.02119772881269455,
0.0007723816088400781,
-0.07801051437854767,
0.09399593621492386,
-0.07260848581790924,
-0.01334686391055584,
-0.005424997769296169,
0.018932610750198364,
0.06417778879404068,
0.028737151995301247,
0.04115015268325806,
-0.0001378922170260921,
-0.002544622402638197,
0.03508594259619713,
0.01433489564806223,
-0.0033477297984063625,
-0.005627818405628204,
0.06836608797311783,
-0.15025678277015686,
-0.08887603878974915,
-0.08397045731544495,
-0.07102449983358383,
-0.06254949420690536,
0.07914988696575165,
0.09042368084192276,
0.06559296697378159,
0.08409039676189423,
-0.03628396987915039,
0.0044500879012048244,
-0.1613461673259735,
-0.042059656232595444,
0.045368093997240067,
-0.017647039145231247,
-0.12580552697181702,
-0.03353108838200569,
0.056365035474300385,
-0.040441833436489105,
0.10168382525444031,
-0.0016673149075359106,
0.060824599117040634,
-0.005286434665322304,
-0.06367263942956924,
-0.0339646115899086,
0.008396382443606853,
0.16689741611480713,
-0.10697351396083832,
0.0036749402061104774,
0.0020162113942205906,
0.015124822966754436,
0.03287183493375778,
0.16472524404525757,
0.0909084901213646,
0.14481227099895477,
0.05195171758532524,
0.08631211519241333,
-0.0444769486784935,
-0.034311406314373016,
-0.13920433819293976,
0.10054045915603638,
-0.025272797793149948,
0.05809498205780983,
-0.05032818391919136,
0.1322181075811386,
0.10913196206092834,
-0.14222215116024017,
0.09928591549396515,
0.013443776406347752,
-0.09557896107435226,
-0.04251478984951973,
-0.08709277957677841,
-0.04790190979838371,
-0.10547328740358353,
0.008128232322633266,
-0.10031547397375107,
0.016853824257850647,
0.06746722012758255,
0.03312963992357254,
-0.032779935747385025,
0.16067294776439667,
0.010455440729856491,
-0.05215321108698845,
0.050225239247083664,
0.0465967170894146,
0.030071699991822243,
0.0831005647778511,
0.03465459868311882,
0.06823401898145676,
-0.07742062211036682,
0.06092774495482445,
0.029022853821516037,
-0.000730290194042027,
0.006151067558676004,
0.016353970393538475,
-0.010752943344414234,
-0.0455266498029232,
0.0020444560796022415,
0.07926062494516373,
0.16378624737262726,
0.05191720277070999,
-0.052857957780361176,
-0.04773150011897087,
0.1990393102169037,
-0.05181664973497391,
-0.03949693962931633,
-0.11734036356210709,
0.14793407917022705,
0.05050453916192055,
0.010757246054708958,
0.020452212542295456,
-0.06614255160093307,
-0.035148825496435165,
0.23060795664787292,
0.03963084518909454,
-0.03402494639158249,
-0.035305172204971313,
-0.009761928580701351,
-0.011299257166683674,
-0.05319460108876228,
0.157515287399292,
0.014817267656326294,
0.22280508279800415,
0.009495827369391918,
-0.0078009869903326035,
-0.03474507853388786,
-0.04960409179329872,
-0.017307661473751068,
0.17979946732521057,
-0.03717701509594917,
0.030257510021328926,
-0.09318805485963821,
-0.016588492318987846,
0.024297388270497322,
-0.10371862351894379,
0.1201789602637291,
-0.12890732288360596,
-0.07689670473337173,
0.003648263867944479,
0.056408196687698364,
-0.03669294714927673,
0.0403536856174469,
-0.015740344300866127,
0.06412815302610397,
0.055190559476614,
-0.03604097291827202,
-0.09959952533245087,
-0.15578976273536682,
0.042007267475128174,
-0.012881476432085037,
0.13735900819301605,
0.010400429368019104,
0.07358286529779434,
0.08373978734016418,
0.0005566804320551455,
-0.08955889940261841,
0.09439117461442947,
0.03579285740852356,
0.0012457711854949594,
0.054042376577854156,
0.13294106721878052,
-0.03760918974876404,
0.17254823446273804,
0.0040708607994019985,
-0.028832780197262764,
-0.033731695264577866,
-0.041862793266773224,
-0.002399913966655731,
-0.16010455787181854,
0.006176628638058901,
-0.05115211382508278,
0.13472996652126312,
0.1960451900959015,
-0.04868926480412483,
-0.027014529332518578,
-0.05216730386018753,
0.08577682077884674,
-0.015852803364396095,
0.08985396474599838,
0.00901452824473381,
-0.15741729736328125,
0.012782811187207699,
-0.002873307093977928,
0.011644117534160614,
-0.21112200617790222,
-0.050504621118307114,
-0.03146331384778023,
-0.029148170724511147,
-0.09271962195634842,
0.1463906168937683,
0.054730117321014404,
0.04011118412017822,
-0.03956657648086548,
-0.16004613041877747,
-0.013511174358427525,
0.05239669978618622,
-0.1306469589471817,
-0.1270594596862793
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
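The card does not specify the tokenizer that produced the space-separated training inputs, so the helper below is only a rough, assumed approximation of that format for raw go source:
```python
# Assumed pre-tokenization, not the authors' actual tokenizer: split
# identifiers, common two-character operators, and single punctuation marks
# so raw go code matches the space-separated style of the example input.
import re

def rough_tokenize(source: str) -> str:
    tokens = re.findall(r"\w+|==|!=|<=|>=|&&|\|\||[^\w\s]", source)
    return " ".join(tokens)

raw_code = "func (pr *Progress) needSnapshotAbort() bool { return pr.State == ProgressStateSnapshot && pr.Match >= pr.PendingSnapshot }"
print(rough_tokenize(raw_code))
# func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }
```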
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/go/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
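Written out explicitly, an inverse square root schedule of this kind is a simple function of the step count; the sketch below expresses it as a plain PyTorch `LambdaLR`, with a warmup length that is an assumption rather than the value used for this model:
```python
# Sketch of inverse square root decay with linear warmup; warmup_steps is
# illustrative and not taken from the CodeTrans setup.
import torch

def inverse_sqrt(warmup_steps: int = 10_000):
    def factor(step: int) -> float:
        step = max(step, 1)
        # Linear warmup to 1.0, then decay proportional to 1/sqrt(step).
        return min(step / warmup_steps, (warmup_steps / step) ** 0.5)
    return factor

optimizer = torch.optim.SGD([torch.nn.Parameter(torch.zeros(1))], lr=1e-2)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, inverse_sqrt())
```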
## Evaluation results
For the code documentation tasks, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on the different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.15466320514678955,
-0.03576550632715225,
-0.0002074259682558477,
0.12487772107124329,
0.1188269630074501,
0.029773984104394913,
0.06117966026067734,
0.06201598048210144,
-0.03638748452067375,
0.02788642793893814,
0.051507722586393356,
0.006813278421759605,
0.02923520840704441,
0.1981082260608673,
0.009050172753632069,
-0.10532064735889435,
-0.025245048105716705,
0.04958450049161911,
-0.0616445355117321,
0.13225197792053223,
0.08232738822698593,
-0.06741895526647568,
0.0577203743159771,
-0.06504110246896744,
-0.2360084354877472,
0.0565427802503109,
0.013453242368996143,
-0.05960369110107422,
0.10060770809650421,
0.05878092348575592,
0.12523137032985687,
-0.010083415545523167,
0.016343651339411736,
-0.13122935593128204,
0.012256433255970478,
0.005794198252260685,
0.025182263925671577,
0.015067026019096375,
0.0323016531765461,
0.037054285407066345,
0.1658298224210739,
0.010085806250572205,
0.03939799219369888,
0.06616969406604767,
-0.07514920085668564,
-0.11493782699108124,
-0.0003157974570058286,
0.029994908720254898,
0.03565416485071182,
0.1175704076886177,
-0.01766682229936123,
0.12287358939647675,
-0.14541518688201904,
0.1347969025373459,
0.08917615562677383,
-0.22424019873142242,
-0.015488517470657825,
0.1175960972905159,
0.08508892357349396,
0.0849381610751152,
-0.05047231167554855,
-0.052705489099025726,
0.10709238797426224,
0.04516468942165375,
0.028475429862737656,
-0.0904269590973854,
-0.07202878594398499,
0.01167664397507906,
-0.08203022181987762,
-0.06922618299722672,
0.21427330374717712,
0.005557910539209843,
-0.08073530346155167,
-0.05541282147169113,
-0.030423054471611977,
-0.14620597660541534,
0.03706774115562439,
0.032951489090919495,
0.0004983568214811385,
-0.039002928882837296,
-0.00006556297739734873,
0.026669656857848167,
-0.07582846283912659,
-0.15288953483104706,
0.022587642073631287,
0.10805046558380127,
0.05150008946657181,
0.02650478482246399,
-0.0969977006316185,
0.09993480145931244,
0.03781912848353386,
-0.053036246448755264,
-0.026035107672214508,
-0.009717078879475594,
-0.10386306792497635,
0.030205579474568367,
-0.060531411319971085,
-0.17052142322063446,
0.01295087393373251,
0.03199101984500885,
-0.06065123900771141,
0.04939835146069527,
0.04229896888136864,
0.04120170697569847,
0.007467569317668676,
0.21903368830680847,
0.06199530139565468,
-0.12005352228879929,
0.05724092572927475,
0.0441366508603096,
-0.022594137117266655,
-0.007565524894744158,
-0.07684144377708435,
-0.09737098962068558,
0.10217253863811493,
0.10083912312984467,
-0.1302456557750702,
0.03919690474867821,
-0.07180268317461014,
-0.03746377304196358,
0.012426459230482578,
-0.15237921476364136,
0.0019169885199517012,
0.03494960814714432,
-0.07259110361337662,
-0.053788360208272934,
0.10709647834300995,
-0.1762627512216568,
-0.14527474343776703,
-0.03242724761366844,
-0.06939177215099335,
-0.03987354412674904,
-0.1625671535730362,
-0.16017533838748932,
-0.010678950697183609,
-0.03618011623620987,
0.025836952030658722,
-0.10193376988172531,
-0.13606053590774536,
-0.03182296082377434,
0.021472331136465073,
0.014128904789686203,
-0.005559481680393219,
-0.0816899985074997,
-0.012197834439575672,
-0.021997205913066864,
-0.036891110241413116,
0.0034150031860917807,
-0.04663839936256409,
0.13061223924160004,
0.10777575522661209,
0.05108381807804108,
-0.039265040308237076,
0.05860349163413048,
-0.07862593978643417,
0.05684524402022362,
-0.11602962017059326,
0.09367471933364868,
-0.05069645866751671,
0.07807385921478271,
-0.028841450810432434,
-0.11010521650314331,
0.06779389828443527,
0.0691809132695198,
0.07887369394302368,
0.04573749750852585,
-0.11512867361307144,
-0.042477820068597794,
0.18217508494853973,
-0.12651187181472778,
-0.1483783721923828,
0.10585139691829681,
-0.02634289301931858,
0.08363806456327438,
0.09204258769750595,
0.13529865443706512,
0.15440142154693604,
-0.04698438197374344,
0.008031352423131466,
0.04453802853822708,
0.034805942326784134,
-0.14443133771419525,
0.07883648574352264,
0.06106142699718475,
-0.0895567536354065,
0.05442986637353897,
-0.007993034087121487,
0.11427977681159973,
-0.014129799790680408,
-0.03143791854381561,
-0.048514798283576965,
-0.09030430018901825,
0.005716080777347088,
0.019904442131519318,
0.07472077757120132,
-0.08386240899562836,
-0.07899878919124603,
0.08758684247732162,
0.16562242805957794,
-0.12940682470798492,
0.0020000755321234465,
-0.09009622782468796,
0.0667736753821373,
-0.07729736715555191,
0.028360314667224884,
-0.16733284294605255,
0.022191161289811134,
0.0650496780872345,
-0.00532521540299058,
0.07377537339925766,
0.12438538670539856,
0.022067714482545853,
0.04058944433927536,
-0.011449997313320637,
-0.02091505564749241,
-0.11679745465517044,
-0.06187755987048149,
-0.07294779270887375,
-0.06994409114122391,
-0.08037328720092773,
-0.055000629276037216,
0.0015624447260051966,
-0.2053994983434677,
0.013433496467769146,
0.008592854253947735,
-0.009053496643900871,
0.019296614453196526,
-0.012996118515729904,
0.01956956274807453,
0.07802366465330124,
-0.06115507334470749,
-0.032853856682777405,
0.04189296066761017,
0.02422548271715641,
-0.0643637403845787,
-0.06782420724630356,
-0.09303178638219833,
0.009979340247809887,
0.12378086894750595,
0.03244591876864433,
-0.09787922352552414,
0.0188418161123991,
-0.018385302275419235,
-0.043657638132572174,
0.019737914204597473,
-0.07094082236289978,
0.16024789214134216,
-0.012423710897564888,
0.1956804245710373,
-0.15425267815589905,
-0.03690369799733162,
-0.026832565665245056,
0.028514305129647255,
0.06252170354127884,
0.14802664518356323,
-0.007765916641801596,
-0.08560866117477417,
0.06474576145410538,
0.019970141351222992,
-0.11027950048446655,
0.23739901185035706,
-0.047369334846735,
-0.09486716985702515,
0.03452054038643837,
0.10537604242563248,
-0.0037326780147850513,
0.17927993834018707,
-0.20895794034004211,
-0.028972551226615906,
0.006000875495374203,
-0.0021759166847914457,
0.07093299925327301,
-0.1306435763835907,
0.006930292584002018,
0.01425198744982481,
-0.07174395024776459,
-0.08460263162851334,
-0.0034551438875496387,
-0.01628810539841652,
0.046341270208358765,
-0.00898795947432518,
-0.03513479232788086,
0.017031673341989517,
-0.03251494839787483,
-0.12190423160791397,
0.22367320954799652,
-0.08430071920156479,
-0.20685714483261108,
-0.19738264381885529,
0.11557692289352417,
-0.06618548929691315,
-0.013031484559178352,
0.03742773085832596,
-0.08907713741064072,
-0.041777126491069794,
-0.05274127423763275,
0.17674726247787476,
-0.07187127321958542,
-0.005669788923114538,
-0.029194707050919533,
0.07388046383857727,
0.017352622002363205,
-0.201584592461586,
0.03519096598029137,
-0.022474028170108795,
-0.014774000272154808,
0.015010645613074303,
-0.10861562192440033,
0.0989282876253128,
0.17083081603050232,
-0.07904659956693649,
0.019841356202960014,
-0.006914022844284773,
0.19981756806373596,
-0.04830445349216461,
-0.05907538905739784,
0.14159351587295532,
-0.01789260283112526,
-0.011205348186194897,
0.014077896252274513,
-0.01455466728657484,
-0.10226771235466003,
0.06636232882738113,
-0.01603364199399948,
-0.03275476023554802,
-0.27416542172431946,
-0.02132842130959034,
-0.07606849074363708,
0.04843667522072792,
0.04219479858875275,
0.04032419994473457,
-0.09001310169696808,
0.0313456729054451,
0.05182238295674324,
0.13496002554893494,
-0.01162771787494421,
0.04525826871395111,
0.059854697436094284,
0.0014280218165367842,
0.015706414356827736,
-0.1006411612033844,
0.011510268785059452,
0.07454625517129898,
0.09343717247247696,
0.26819729804992676,
-0.09966231882572174,
0.18490228056907654,
0.03846235200762749,
0.04317439720034599,
0.04552757367491722,
0.13942337036132812,
-0.12237168103456497,
0.030699344351887703,
0.013679265044629574,
-0.006892370525747538,
-0.1153249442577362,
0.021177582442760468,
-0.03595908358693123,
0.08527331054210663,
-0.12149462848901749,
-0.04673311859369278,
0.005983682814985514,
0.1381099373102188,
0.05216921120882034,
-0.2301148623228073,
-0.14250709116458893,
0.010344298556447029,
-0.07674771547317505,
-0.09550762176513672,
0.06355851143598557,
0.22403819859027863,
-0.06858347356319427,
-0.025974642485380173,
-0.0057188901118934155,
0.13470333814620972,
-0.02808081917464733,
-0.02969542145729065,
-0.03949879854917526,
0.05548974126577377,
0.013369905762374401,
0.13131675124168396,
-0.2983572781085968,
0.14110665023326874,
-0.010470472276210785,
0.06693540513515472,
-0.03527132421731949,
0.041269220411777496,
-0.030144965276122093,
0.07505534589290619,
0.041308194398880005,
-0.0083125876262784,
0.03396259620785713,
-0.173338383436203,
0.005491908174008131,
0.038515008985996246,
0.02419016696512699,
0.06224675104022026,
0.06750617921352386,
-0.0001616606314200908,
0.05681991204619408,
-0.012858816422522068,
-0.1353842169046402,
-0.07004715502262115,
-0.06331074237823486,
-0.026636909693479538,
-0.029355254024267197,
-0.021901804953813553,
-0.03956422209739685,
-0.019090307876467705,
0.06730543822050095,
0.19204069674015045,
-0.09287558495998383,
-0.08020513504743576,
-0.07659731805324554,
0.06124686077237129,
0.09077138453722,
-0.0937921330332756,
0.040308550000190735,
-0.0033282421063631773,
0.02544107474386692,
-0.010312550701200962,
-0.0786861777305603,
0.06774753332138062,
-0.04198906198143959,
-0.06619691848754883,
-0.008473427966237068,
0.07094080001115799,
0.005436290521174669,
0.040901970118284225,
0.008115902543067932,
-0.09598380327224731,
-0.0415930412709713,
-0.11909328401088715,
-0.1147843673825264,
-0.05054774135351181,
-0.0011266041547060013,
0.05466940626502037,
-0.14499984681606293,
-0.06021186709403992,
-0.00808598380535841,
-0.03813109174370766,
0.14203257858753204,
0.16439029574394226,
-0.06127706170082092,
0.014602466486394405,
0.1279076337814331,
-0.055744316428899765,
-0.20701083540916443,
0.0384712889790535,
0.05257558450102806,
0.12562139332294464,
-0.04929480329155922,
-0.15954844653606415,
0.04870110750198364,
0.0029403474181890488,
0.0363214947283268,
0.07323405146598816,
-0.2925385534763336,
-0.13453660905361176,
0.09456218779087067,
0.1616881787776947,
0.14026229083538055,
-0.1320093721151352,
-0.036295026540756226,
-0.06253679096698761,
-0.11482506990432739,
0.07768576592206955,
-0.0544152595102787,
0.13446159660816193,
-0.06712309271097183,
0.02087187021970749,
0.03284471109509468,
-0.04260040819644928,
0.07246328145265579,
0.023158246651291847,
0.10221701115369797,
-0.03789167478680611,
0.014004501514136791,
0.13724397122859955,
-0.033392153680324554,
0.17128470540046692,
-0.14637431502342224,
0.09399871528148651,
-0.23194566369056702,
-0.05944416671991348,
-0.07376158982515335,
0.01522063184529543,
-0.03428514301776886,
-0.03905989229679108,
-0.08330690115690231,
0.028824683278799057,
-0.008289138786494732,
-0.007824711501598358,
0.019557876512408257,
-0.044030364602804184,
-0.01820209063589573,
0.09282564371824265,
0.12069125473499298,
-0.009878743439912796,
-0.0721650943160057,
0.062259748578071594,
0.04504954069852829,
0.11785227060317993,
-0.18816125392913818,
0.02362331748008728,
0.11751986294984818,
0.02097110077738762,
0.11229904741048813,
0.0449366495013237,
-0.1062808707356453,
0.04848143085837364,
0.09241082519292831,
-0.062214601784944534,
-0.06878859549760818,
-0.02857157215476036,
-0.10697702318429947,
-0.07315566390752792,
0.05429213494062424,
0.09961028397083282,
-0.04621383547782898,
-0.008595028892159462,
-0.02624421752989292,
-0.028209498152136803,
-0.12008225172758102,
0.19563597440719604,
0.0782453864812851,
0.08127017319202423,
-0.06154835969209671,
0.05276269093155861,
0.07191845029592514,
-0.08587009459733963,
0.01657087542116642,
0.16601163148880005,
-0.09799395501613617,
-0.04492895305156708,
0.06542003154754639,
0.2134128361940384,
-0.04869595915079117,
-0.06488465517759323,
-0.13909056782722473,
-0.07955724000930786,
0.028362808749079704,
0.1726854294538498,
0.11040182411670685,
0.07883057743310928,
-0.02551480382680893,
0.006237505469471216,
-0.10621768236160278,
0.08682266622781754,
0.0710146501660347,
0.04099804162979126,
-0.11002130806446075,
0.1298045814037323,
0.04885004460811615,
0.11892430484294891,
-0.030050434172153473,
-0.009382280521094799,
-0.1510925143957138,
0.075237937271595,
-0.09624363481998444,
0.03027869202196598,
-0.002932974137365818,
0.051033735275268555,
-0.02713589183986187,
-0.004886171314865351,
-0.0353112630546093,
0.06278280168771744,
-0.08517270535230637,
0.005226492881774902,
0.010331319645047188,
0.04241175204515457,
-0.05628174915909767,
-0.017437389120459557,
0.024096103385090828,
-0.09286291897296906,
0.12564541399478912,
-0.025443125516176224,
-0.03151839226484299,
0.0859275832772255,
-0.05755019560456276,
0.038388606160879135,
0.020995037630200386,
0.053345054388046265,
0.0131071200594306,
0.015313779935240746,
0.0787883922457695,
0.039591800421476364,
0.05883276090025902,
0.03534352406859398,
0.12402983009815216,
-0.1265508383512497,
-0.07691704481840134,
-0.05655001848936081,
-0.10861942172050476,
-0.05600743740797043,
0.10154034197330475,
0.029601892456412315,
0.1030721440911293,
0.09961032122373581,
-0.0373440682888031,
0.010506639257073402,
-0.13231679797172546,
-0.059766001999378204,
0.028055982664227486,
-0.02085982821881771,
-0.09005780518054962,
-0.05616088584065437,
0.05039646103978157,
-0.027064992114901543,
0.12243682146072388,
0.00885152630507946,
0.04542149603366852,
-0.01973392255604267,
-0.04705938324332237,
-0.00177170115057379,
0.011634299531579018,
0.2207467257976532,
-0.07702385634183884,
0.05135243386030197,
0.004485736601054668,
0.018261490389704704,
0.01602715440094471,
0.12232381105422974,
0.14283014833927155,
0.15440204739570618,
-0.04290693253278732,
0.11212138831615448,
0.00876358337700367,
0.002441404154524207,
-0.0819159746170044,
-0.001966541400179267,
0.009844468906521797,
0.05423976480960846,
-0.03995890915393829,
0.19005638360977173,
0.09777828305959702,
-0.113339364528656,
0.1016562208533287,
0.021100541576743126,
-0.133134126663208,
-0.04078555479645729,
0.027927648276090622,
-0.036982402205467224,
-0.15198840200901031,
0.029230250045657158,
-0.11891785264015198,
-0.040925733745098114,
0.040510281920433044,
0.051568660885095596,
-0.0792178139090538,
0.1910257488489151,
0.01503763347864151,
-0.05336093530058861,
0.055162034928798676,
-0.007277767639607191,
0.022144654765725136,
0.026333842426538467,
0.03227757290005684,
0.029799511656165123,
-0.0394941121339798,
0.042640816420316696,
0.025759205222129822,
-0.048040710389614105,
0.0014732227427884936,
-0.005831568036228418,
0.0005547921173274517,
-0.022240759804844856,
0.030475236475467682,
0.06272730976343155,
0.16876351833343506,
0.030998999252915382,
-0.06844120472669601,
-0.024051863700151443,
0.15048526227474213,
-0.033282067626714706,
-0.10328616946935654,
-0.12229670584201813,
0.15834777057170868,
0.03992731124162674,
0.0076249754056334496,
0.015184203162789345,
-0.09169594943523407,
-0.04348796606063843,
0.22867894172668457,
0.07420370727777481,
-0.038835301995277405,
-0.01817219890654087,
0.006539700552821159,
0.0010470583802089095,
-0.034807078540325165,
0.20626798272132874,
0.026791144162416458,
0.2389928251504898,
0.018020516261458397,
-0.031071079894900322,
-0.0768844410777092,
-0.03747839108109474,
0.013164001516997814,
0.11667278409004211,
-0.023261096328496933,
-0.0398191474378109,
-0.08676912635564804,
0.00888631772249937,
-0.00801059976220131,
-0.07730384916067123,
0.10034219175577164,
-0.14243866503238678,
-0.09383464604616165,
-0.04708464443683624,
0.04097128286957741,
-0.05120231211185455,
0.02257440984249115,
-0.02963011898100376,
0.03625338152050972,
0.05654774606227875,
-0.03582078218460083,
-0.11845553666353226,
-0.16074365377426147,
0.08399122208356857,
-0.05183849483728409,
0.13354915380477905,
-0.020566845312714577,
0.16232796013355255,
0.09482652693986893,
0.03920264169573784,
-0.047906745225191116,
0.11584359407424927,
0.032966963946819305,
0.037170469760894775,
0.05837535485625267,
0.11193965375423431,
-0.051815763115882874,
0.13504722714424133,
-0.04849546402692795,
-0.02281566523015499,
-0.012795542366802692,
-0.06594829261302948,
-0.019976655021309853,
-0.1646079421043396,
-0.012896524742245674,
-0.10622086375951767,
0.09890025109052658,
0.1959817260503769,
-0.04126667603850365,
-0.032621100544929504,
-0.09207368642091751,
0.10172664374113083,
-0.005403540562838316,
0.061022963374853134,
-0.03350784629583359,
-0.18574362993240356,
0.001451182528398931,
0.0033083746675401926,
0.0035727908834815025,
-0.285354346036911,
-0.007365329656749964,
-0.046383798122406006,
-0.024187501519918442,
-0.09183843433856964,
0.1607247143983841,
0.07638286054134369,
0.04082804173231125,
-0.040870968252420425,
-0.1310499608516693,
-0.04081134498119354,
0.06745606660842896,
-0.15661196410655975,
-0.14703260362148285
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/go/large_model.ipynb).
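Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. Below is a minimal sketch of the same pipeline using `AutoModelForSeq2SeqLM` instead (same checkpoint, otherwise unchanged; verify against your installed version):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask_finetune", skip_special_tokens=True),
    device=0,  # set to -1 to run on CPU
)
```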
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
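For reference, the inverse square root schedule named above can be written in a few lines. This is a generic T5-style sketch; the 10,000-step warmup is an assumption, as the card does not state the warmup length:
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1/sqrt(warmup_steps) during warmup, then decays as 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))
```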
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
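A hedged sketch of assembling such a go-only set with the Hugging Face `datasets` library and the public CodeSearchNet corpus; the card does not name the exact loader or column, so `func_code_string` is an assumption:
```python
from datasets import load_dataset

# Go portion of CodeSearchNet; the config name and column are assumptions.
go_data = load_dataset("code_search_net", "go", split="train")
print(go_data[0]["func_code_string"])
```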
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_go_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
88,
107
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
…(768-dimensional embedding vector omitted)…
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the go function/method.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_go_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/go/large_model.ipynb).
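The pipeline also accepts a list of functions for batch documentation. A usage sketch continuing from the snippet above; the second function and `max_length=64` are illustrative assumptions:
```python
functions = [
    tokenized_code,
    "func ( s * Stack ) Pop ( ) int { v := s . items [ len ( s . items ) - 1 ] ; s . items = s . items [ : len ( s . items ) - 1 ] ; return v }",
]
for result in pipeline(functions, max_length=64):
    print(result["summary_text"])  # each result is a dict with a summary_text key
```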
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_go_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the go function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
87,
107
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
…(768-dimensional embedding vector omitted)…
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
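Since tokenized input performs better, here is a rough approximation of the space-separated format used in the example below; the original corpus used a proper code tokenizer, so treat this regex split as an illustrative stand-in only:
```python
import re

def rough_tokenize(java_code: str) -> str:
    # Split identifiers/literals and punctuation into space-separated tokens.
    return " ".join(re.findall(r"\w+|\S", java_code))

print(rough_tokenize("public int add(int a, int b) { return a + b; }"))
# public int add ( int a , int b ) { return a + b ; }
```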
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/java/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
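A minimal sketch of the AdaFactor setup described above, using the `transformers` implementation; the checkpoint choice and flag values are assumptions rather than the authors' exact configuration:
```python
from transformers import Adafactor, T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask"
)
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,  # enables the built-in inverse square root schedule
    warmup_init=True,
    lr=None,             # learning rate is derived from the relative step
)
```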
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 180,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12566953897476196,
-0.026267854496836662,
-0.0004683413717430085,
0.1320476084947586,
0.10509327799081802,
0.022798528894782066,
0.058217037469148636,
0.06575638055801392,
-0.026601945981383324,
0.018604641780257225,
0.04472021758556366,
0.010188905522227287,
0.031886253505945206,
0.19626133143901825,
0.006728736218065023,
-0.11774666607379913,
-0.013355366885662079,
0.044089484959840775,
-0.03511533513665199,
0.12797261774539948,
0.09565053880214691,
-0.07385560125112534,
0.05375342816114426,
-0.0692862942814827,
-0.24387364089488983,
0.05968599021434784,
-0.004217242356389761,
-0.06393561512231827,
0.10014932602643967,
0.04671314358711243,
0.12548594176769257,
-0.004402965307235718,
0.020072365179657936,
-0.1419777125120163,
0.010871033184230328,
0.011704258620738983,
0.03401191160082817,
0.016999246552586555,
0.04451578110456467,
0.053937483578920364,
0.14036615192890167,
0.012288306839764118,
0.043343979865312576,
0.06162787228822708,
-0.07582785189151764,
-0.11869467049837112,
-0.007668847683817148,
0.02368328720331192,
0.05110655352473259,
0.10018150508403778,
-0.01253418531268835,
0.12183303385972977,
-0.15174031257629395,
0.12879320979118347,
0.1033463254570961,
-0.21814367175102234,
-0.012576576322317123,
0.1267150342464447,
0.09174585342407227,
0.09818091988563538,
-0.059709321707487106,
-0.06684055924415588,
0.10342862457036972,
0.05272597447037697,
0.04569939151406288,
-0.10132323205471039,
-0.11375537514686584,
0.024002548307180405,
-0.07508742809295654,
-0.06482065469026566,
0.22074149549007416,
0.002043013460934162,
-0.077552430331707,
-0.05416851490736008,
-0.02542480267584324,
-0.13439178466796875,
0.037463411688804626,
0.026990724727511406,
0.007187440060079098,
-0.03342354670166969,
0.01734171248972416,
0.030918588861823082,
-0.07319832593202591,
-0.15563130378723145,
0.0278549175709486,
0.08959373086690903,
0.05579907074570656,
0.025438548997044563,
-0.09638945758342743,
0.10486086457967758,
0.03412603586912155,
-0.060096655040979385,
-0.026456424966454506,
-0.01848994381725788,
-0.10327377915382385,
0.033815521746873856,
-0.05161210894584656,
-0.18332946300506592,
0.016271322965621948,
0.011402473784983158,
-0.05204034969210625,
0.05062827095389366,
0.027018088847398758,
0.0380423478782177,
0.022345690056681633,
0.19802336394786835,
0.05655137076973915,
-0.12004224956035614,
0.05333469808101654,
0.0435841903090477,
-0.03690819442272186,
-0.005171661265194416,
-0.06854286044836044,
-0.09769635647535324,
0.09404978156089783,
0.10367551445960999,
-0.138334259390831,
0.03589172288775444,
-0.07081279903650284,
-0.04324180260300636,
0.004275931511074305,
-0.15888772904872894,
0.002721194177865982,
0.026439141482114792,
-0.06687033176422119,
-0.054982639849185944,
0.09460922330617905,
-0.16937942802906036,
-0.14903360605239868,
-0.043275244534015656,
-0.07957577705383301,
-0.04017869010567665,
-0.16706758737564087,
-0.15680015087127686,
-0.008902661502361298,
-0.03939349576830864,
0.01924777962267399,
-0.0867023691534996,
-0.1573655754327774,
-0.02618982456624508,
0.018991101533174515,
0.004277936182916164,
-0.0025261305272579193,
-0.07775900512933731,
-0.009953104890882969,
-0.029788656160235405,
-0.039860401302576065,
0.014388899318873882,
-0.04774826392531395,
0.12130317091941833,
0.10131479054689407,
0.054393745958805084,
-0.02318398468196392,
0.06074899807572365,
-0.07921862602233887,
0.06472242623567581,
-0.11814825981855392,
0.09500966966152191,
-0.05940631031990051,
0.0793999582529068,
-0.0339047946035862,
-0.10479164123535156,
0.07657478004693985,
0.06272047758102417,
0.06631384044885635,
0.03417585417628288,
-0.1394185870885849,
-0.023904865607619286,
0.18685658276081085,
-0.1237390786409378,
-0.13796474039554596,
0.10302074253559113,
-0.03932394087314606,
0.08303101360797882,
0.0831676572561264,
0.1416729837656021,
0.1484951376914978,
-0.02478179894387722,
0.023798055946826935,
0.04988519102334976,
0.04548872634768486,
-0.13292355835437775,
0.0794093906879425,
0.06583656370639801,
-0.0875430479645729,
0.06185220927000046,
-0.015426438301801682,
0.09978674352169037,
-0.011126374825835228,
-0.02488320879638195,
-0.051011696457862854,
-0.07985159009695053,
-0.0056156073696911335,
0.008240116760134697,
0.06535662710666656,
-0.0832512304186821,
-0.06095663830637932,
0.0907265767455101,
0.1741383671760559,
-0.13154254853725433,
-0.0018408307805657387,
-0.08077193051576614,
0.036196086555719376,
-0.07588835060596466,
0.02888316474854946,
-0.16275399923324585,
0.036139342933893204,
0.07752900570631027,
-0.025082020089030266,
0.052099619060754776,
0.13273026049137115,
0.012929377146065235,
0.04283730685710907,
0.0013329728972166777,
-0.013503518886864185,
-0.12067861109972,
-0.05583258718252182,
-0.06268837302923203,
-0.06356671452522278,
-0.0886518582701683,
-0.06020966172218323,
-0.0369260199368,
-0.19495433568954468,
0.011449298821389675,
0.0010748024797067046,
0.0026411341968923807,
0.02822542004287243,
-0.013053586706519127,
0.028436334803700447,
0.07615344226360321,
-0.06073123216629028,
-0.03702530264854431,
0.03185390681028366,
0.024293215945363045,
-0.040482424199581146,
-0.05571412295103073,
-0.0814976766705513,
0.005736730061471462,
0.10716258734464645,
0.04144405946135521,
-0.07972703129053116,
0.023626720532774925,
-0.020802676677703857,
-0.04860098659992218,
0.009368529543280602,
-0.06486333906650543,
0.1434406042098999,
-0.005609281361103058,
0.19901973009109497,
-0.16482476890087128,
-0.0383974127471447,
-0.024066179990768433,
0.02404545620083809,
0.06352691352367401,
0.1389569640159607,
-0.015538714826107025,
-0.08430075645446777,
0.06561044603586197,
0.01862156391143799,
-0.10133127868175507,
0.23186776041984558,
-0.04687196761369705,
-0.093021921813488,
0.022451262921094894,
0.10229908674955368,
-0.01666913367807865,
0.16619715094566345,
-0.20313608646392822,
-0.028310120105743408,
0.017046915367245674,
0.008351463824510574,
0.06657420098781586,
-0.1257077008485794,
0.0029577629175037146,
0.009334312751889229,
-0.07317160815000534,
-0.06672662496566772,
-0.006741893012076616,
-0.0068491362035274506,
0.038278546184301376,
-0.008915402926504612,
-0.03260858729481697,
0.016895616427063942,
-0.03950607776641846,
-0.106607586145401,
0.2202586680650711,
-0.09637296199798584,
-0.218427836894989,
-0.20436367392539978,
0.11158905178308487,
-0.06185757741332054,
-0.012795254588127136,
0.03647848963737488,
-0.07865152508020401,
-0.05526128038764,
-0.0574600026011467,
0.17019911110401154,
-0.061687979847192764,
-0.01263453159481287,
-0.015343288891017437,
0.07615384459495544,
0.010453881695866585,
-0.20978836715221405,
0.034618526697158813,
-0.004519253503531218,
-0.013795613311231136,
0.005830034147948027,
-0.10181687027215958,
0.09077981114387512,
0.15557362139225006,
-0.08247087895870209,
0.020117156207561493,
0.00711906049400568,
0.18990948796272278,
-0.03808331862092018,
-0.05560564249753952,
0.14247608184814453,
-0.019485890865325928,
-0.010379078797996044,
0.0158530306071043,
-0.013701186515390873,
-0.09851299226284027,
0.06381900608539581,
-0.00956637691706419,
-0.02595406211912632,
-0.2752548158168793,
-0.00828233826905489,
-0.07893648743629456,
0.058361396193504333,
0.03686286881566048,
0.04171337932348251,
-0.08723119646310806,
0.02901594154536724,
0.060655009001493454,
0.15111538767814636,
-0.004223337396979332,
0.05295458436012268,
0.057057030498981476,
-0.0021641391795128584,
0.008256721310317516,
-0.09946198016405106,
0.012581153772771358,
0.07278639823198318,
0.11065100878477097,
0.27014654874801636,
-0.09980520606040955,
0.19594930112361908,
0.050160035490989685,
0.04692680388689041,
0.04942939803004265,
0.13414010405540466,
-0.1335555762052536,
0.03190084919333458,
0.0021621822379529476,
-0.008885829709470272,
-0.11128830164670944,
0.00902069453150034,
-0.0675060898065567,
0.09050199389457703,
-0.10642291605472565,
-0.056577589362859726,
0.009926951490342617,
0.1474466174840927,
0.042432449758052826,
-0.22436583042144775,
-0.1297587901353836,
0.020750805735588074,
-0.09546760469675064,
-0.10583185404539108,
0.06612876057624817,
0.24508601427078247,
-0.07705195993185043,
-0.04173292964696884,
-0.004087029490619898,
0.13380832970142365,
-0.03720070794224739,
-0.021672502160072327,
-0.037364520132541656,
0.06346090137958527,
0.016630234196782112,
0.13582821190357208,
-0.2956693768501282,
0.13129642605781555,
-0.009632235392928123,
0.06246572360396385,
-0.030119361355900764,
0.04934114217758179,
-0.038419291377067566,
0.07636594027280807,
0.03875143453478813,
-0.00984314363449812,
0.03368086367845535,
-0.15863950550556183,
0.014429164119064808,
0.04125816002488136,
0.016477521508932114,
0.057043083012104034,
0.0626043900847435,
-0.0029578746762126684,
0.05836348235607147,
-0.018489301204681396,
-0.12209568917751312,
-0.0717531144618988,
-0.06533460319042206,
-0.01890379749238491,
-0.03032270073890686,
-0.015269001945853233,
-0.04504483938217163,
-0.009791793301701546,
0.07683293521404266,
0.18398693203926086,
-0.09359102696180344,
-0.07720963656902313,
-0.07432379573583603,
0.05234861746430397,
0.10860952734947205,
-0.08234725147485733,
0.029371660202741623,
-0.003533486044034362,
0.04375448450446129,
-0.010142862796783447,
-0.0732463076710701,
0.05229228734970093,
-0.038950785994529724,
-0.06913765519857407,
-0.012424882501363754,
0.06229713559150696,
-0.00008154716488206759,
0.027489861473441124,
0.012341168709099293,
-0.09585622698068619,
-0.04403393343091011,
-0.12056327611207962,
-0.1275550127029419,
-0.04054791480302811,
0.015211760066449642,
0.04201384633779526,
-0.14535854756832123,
-0.05721036344766617,
0.0030853452626615763,
-0.040098004043102264,
0.13113775849342346,
0.15801866352558136,
-0.055735062807798386,
0.03031277470290661,
0.1486806571483612,
-0.06081647798418999,
-0.1895701289176941,
0.03251807019114494,
0.04520820453763008,
0.11916553974151611,
-0.04079049080610275,
-0.16207528114318848,
0.04849784076213837,
0.020683273673057556,
0.035959817469120026,
0.04963374137878418,
-0.30980709195137024,
-0.1244627833366394,
0.08391010016202927,
0.1593671590089798,
0.12157823890447617,
-0.12317510694265366,
-0.03839905932545662,
-0.06465256214141846,
-0.16147124767303467,
0.09193389117717743,
-0.050288259983062744,
0.13289068639278412,
-0.07476884126663208,
0.027256345376372337,
0.03476206958293915,
-0.045695699751377106,
0.07298798114061356,
0.032598935067653656,
0.12096932530403137,
-0.04313795268535614,
0.017150292173027992,
0.12369129806756973,
-0.03409362956881523,
0.18337798118591309,
-0.14540177583694458,
0.09813011437654495,
-0.23416228592395782,
-0.05877136439085007,
-0.0754297524690628,
0.0034111530985683203,
-0.03468434885144234,
-0.04692176356911659,
-0.07727345824241638,
0.03230603039264679,
-0.003136902116239071,
-0.00672521535307169,
0.04307103529572487,
-0.0315021350979805,
-0.01638171635568142,
0.10640843212604523,
0.10464105755090714,
-0.01972014456987381,
-0.06810510158538818,
0.05431210622191429,
0.051100388169288635,
0.11449342221021652,
-0.195863276720047,
0.030705681070685387,
0.10380823910236359,
0.015165573917329311,
0.12565451860427856,
0.04376942291855812,
-0.10588377714157104,
0.04257841780781746,
0.08732897788286209,
-0.0761280283331871,
-0.06243612989783287,
-0.020817698910832405,
-0.07835462689399719,
-0.06715874373912811,
0.05166437476873398,
0.09523142874240875,
-0.0510590523481369,
-0.019191080704331398,
-0.024722713977098465,
-0.018891140818595886,
-0.112869992852211,
0.18681757152080536,
0.07626983523368835,
0.08554013818502426,
-0.06630770862102509,
0.06289146840572357,
0.08512644469738007,
-0.0836234763264656,
0.008098091930150986,
0.18552324175834656,
-0.10228977352380753,
-0.04748924821615219,
0.07117615640163422,
0.22005701065063477,
-0.028768474236130714,
-0.05963427945971489,
-0.13934896886348724,
-0.07768189162015915,
0.03155319765210152,
0.16560842096805573,
0.10206849873065948,
0.09531819075345993,
-0.026515956968069077,
-0.0018551506800577044,
-0.10702230781316757,
0.09290950000286102,
0.06487536430358887,
0.04911443218588829,
-0.10642489045858383,
0.12985101342201233,
0.03980237990617752,
0.12206913530826569,
-0.026385681703686714,
-0.011305241845548153,
-0.1386546641588211,
0.06432199478149414,
-0.1129319816827774,
0.03414444252848625,
-0.009019437246024609,
0.05146399512887001,
-0.0241767056286335,
0.002134549431502819,
-0.03138785436749458,
0.06771896034479141,
-0.08225899189710617,
0.001949717290699482,
0.0036761504597961903,
0.05616704374551773,
-0.05182153731584549,
-0.019170669838786125,
0.03241809830069542,
-0.0919274389743805,
0.12304379791021347,
-0.03858610987663269,
-0.02910756506025791,
0.07976953685283661,
-0.05078088864684105,
0.040963511914014816,
0.015040330588817596,
0.04863974452018738,
0.021058281883597374,
0.01501861959695816,
0.07742194086313248,
0.03660399094223976,
0.053064387291669846,
0.024435944855213165,
0.11821053922176361,
-0.1393018513917923,
-0.08490733802318573,
-0.05415171757340431,
-0.11394436657428741,
-0.05666995793581009,
0.100596584379673,
0.04739058390259743,
0.10476493835449219,
0.09137453138828278,
-0.03135300055146217,
0.010746043175458908,
-0.1273558884859085,
-0.06596009433269501,
0.028599729761481285,
-0.03007775917649269,
-0.08210233598947525,
-0.05518733337521553,
0.03753077983856201,
-0.03207506611943245,
0.12286462634801865,
0.019524965435266495,
0.038453441113233566,
-0.020491719245910645,
-0.06330624967813492,
-0.015385239385068417,
0.02110002189874649,
0.21071754395961761,
-0.08532747626304626,
0.04237411543726921,
0.00005281466292217374,
0.015290734358131886,
0.007983089424669743,
0.11777859181165695,
0.12014882266521454,
0.166439950466156,
-0.03459477052092552,
0.10084211081266403,
0.018300440162420273,
0.000374231138266623,
-0.07341471314430237,
0.017213601619005203,
0.022961340844631195,
0.061910971999168396,
-0.04769651219248772,
0.18612967431545258,
0.0935843363404274,
-0.12319980561733246,
0.10985584557056427,
0.02496957778930664,
-0.13261906802654266,
-0.034953128546476364,
0.023158496245741844,
-0.03619171306490898,
-0.1481981873512268,
0.02391931042075157,
-0.12989971041679382,
-0.016605248674750328,
0.05264417827129364,
0.051978565752506256,
-0.0789811834692955,
0.17059852182865143,
0.034127891063690186,
-0.05868518352508545,
0.05476560443639755,
-0.0019145270343869925,
0.02663242258131504,
0.02098957635462284,
0.03616553917527199,
0.03647179901599884,
-0.03666731342673302,
0.0363340750336647,
0.024523189291357994,
-0.02394724264740944,
-0.017926780506968498,
-0.02043367736041546,
-0.0026214634999632835,
-0.016506260260939598,
0.019478028640151024,
0.05642738565802574,
0.16050660610198975,
0.03728361427783966,
-0.07441207766532898,
-0.017527271062135696,
0.1715027242898941,
-0.027010560035705566,
-0.09728393703699112,
-0.12757310271263123,
0.13132429122924805,
0.05232929438352585,
0.011008246801793575,
0.025639282539486885,
-0.08234512060880661,
-0.053575299680233,
0.2081649750471115,
0.05338634178042412,
-0.03159903734922409,
-0.02296963892877102,
0.007668890990316868,
-0.0018508993089199066,
-0.04189005121588707,
0.20322364568710327,
0.023250672966241837,
0.22705987095832825,
0.02358839474618435,
-0.007431072648614645,
-0.06929399073123932,
-0.040490102022886276,
0.00322463340125978,
0.11818352341651917,
-0.037892624735832214,
-0.03833388164639473,
-0.0822315439581871,
-0.002221507951617241,
-0.00199388200417161,
-0.08113017678260803,
0.09739825874567032,
-0.13637766242027283,
-0.09778021275997162,
-0.04917481169104576,
0.0493946298956871,
-0.058419179171323776,
0.016808584332466125,
-0.0253879614174366,
0.04403304308652878,
0.06884054839611053,
-0.032735493034124374,
-0.09948272258043289,
-0.16969694197177887,
0.09502563625574112,
-0.04897867143154144,
0.13317450881004333,
-0.01569821685552597,
0.15333066880702972,
0.0855412557721138,
0.025945449247956276,
-0.06333158910274506,
0.1153080090880394,
0.031871747225522995,
0.06002165377140045,
0.04899890348315239,
0.12095958739519119,
-0.05027179419994354,
0.13614234328269958,
-0.04954491928219795,
-0.029490340501070023,
-0.027735183015465736,
-0.07567306607961655,
-0.018553752452135086,
-0.1634761542081833,
-0.019636204466223717,
-0.09496936947107315,
0.09365838766098022,
0.19478611648082733,
-0.04397328943014145,
-0.03089415840804577,
-0.09310566633939743,
0.10917609930038452,
-0.01291828602552414,
0.06294375658035278,
-0.032492756843566895,
-0.17376695573329926,
0.0010421309852972627,
0.014148395508527756,
0.013655513525009155,
-0.2749030292034149,
-0.00591592350974679,
-0.038946691900491714,
-0.028356710448861122,
-0.0863049328327179,
0.15956395864486694,
0.08802530914545059,
0.04966261610388756,
-0.04070641100406647,
-0.15967267751693726,
-0.03750288113951683,
0.059156645089387894,
-0.13854917883872986,
-0.14489369094371796
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/java/large_model.ipynb).
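The pipeline also accepts a list of functions in one call, which is usually faster on GPU than looping over single inputs. The second function below is a made-up example for illustration:
```python
# A second, hypothetical tokenized java function, added only to show batched input.
more_code = "public static int max ( int a , int b ) { return a > b ? a : b ; }"
for result in pipeline([tokenized_code, more_code]):
    print(result["summary_text"])
```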
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
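A java-only split can be approximated by loading the per-language configuration of the public CodeSearchNet dataset; the sketch below assumes the `datasets` library and the `code_search_net` dataset name, which may differ from the exact preprocessing behind the CodeTrans fine-tuning data:
```python
from datasets import load_dataset

# Assumption for illustration: the public CodeSearchNet dataset with its "java" config,
# not necessarily the exact java-only split used to fine-tune this model.
java_ds = load_dataset("code_search_net", "java", split="train")
print(java_ds[0]["func_code_string"][:80])
```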
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_java_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08824047446250916,
0.07106105983257294,
-0.0011887062573805451,
0.10766629129648209,
0.043981071561574936,
0.02410716563463211,
0.028470054268836975,
0.10857737809419632,
-0.015162251889705658,
0.06396149098873138,
0.05299476161599159,
-0.06904889643192291,
0.05750822648406029,
0.19024717807769775,
0.02618899941444397,
-0.15481898188591003,
-0.031498175114393234,
0.02804369293153286,
-0.050652842968702316,
0.10297499597072601,
0.07951373606920242,
-0.08188334852457047,
0.07176727801561356,
-0.04988594725728035,
-0.12475427240133286,
0.05490032956004143,
-0.034033793956041336,
-0.021032094955444336,
0.09084132313728333,
0.05894278362393379,
0.10705763846635818,
-0.024881700053811073,
0.061984747648239136,
-0.2151550054550171,
0.0023158376570791006,
0.028476115316152573,
0.06488077342510223,
0.0352504588663578,
0.05638059973716736,
0.07725860923528671,
0.11399281024932861,
-0.010973187163472176,
0.042046189308166504,
0.06128665804862976,
-0.06504334509372711,
-0.07342015951871872,
-0.06114134192466736,
0.06289690732955933,
0.08754485100507736,
0.0965176522731781,
-0.008110318332910538,
0.003532680217176676,
-0.07828439772129059,
0.08465290069580078,
0.12404505908489227,
-0.20982669293880463,
-0.018834346905350685,
0.1167796403169632,
0.0887816920876503,
0.05115916207432747,
-0.08469979465007782,
-0.03856121376156807,
0.10503627359867096,
0.047034021466970444,
0.06697843223810196,
-0.09788810461759567,
-0.05556243658065796,
-0.006411084905266762,
-0.04325174540281296,
-0.04688023775815964,
0.16245399415493011,
0.0396273173391819,
-0.047917239367961884,
-0.11343216150999069,
-0.04566044360399246,
-0.19027955830097198,
0.045668475329875946,
0.010562279261648655,
0.015107251703739166,
-0.004973202012479305,
0.03232760354876518,
-0.01877591572701931,
-0.09317536652088165,
-0.11480047553777695,
0.03344414010643959,
0.011643162928521633,
0.055838692933321,
0.03217162936925888,
-0.029506148770451546,
0.08804492652416229,
0.00687572592869401,
-0.05419592931866646,
-0.016328485682606697,
0.018075203523039818,
-0.1098383441567421,
0.0163401048630476,
0.002733656670898199,
-0.06574185192584991,
0.003351238090544939,
0.06264573335647583,
-0.10680774599313736,
0.08075635880231857,
0.09252099692821503,
0.014997527934610844,
0.018845608457922935,
0.21153591573238373,
0.04263989254832268,
-0.145889014005661,
0.020205890759825706,
0.026644019410014153,
-0.002679215045645833,
0.00902215950191021,
-0.050207674503326416,
-0.04723372682929039,
0.015628118067979813,
0.07571901381015778,
-0.12292774021625519,
0.007006822619587183,
-0.06022942438721657,
-0.010386007837951183,
0.07990316301584244,
-0.12735353410243988,
0.04189940169453621,
0.015166761353611946,
-0.0443003848195076,
-0.03777478262782097,
0.07953278720378876,
-0.1257021576166153,
-0.12313181161880493,
0.04281868785619736,
-0.04298756644129753,
-0.03350798785686493,
-0.12145813554525375,
-0.10668276250362396,
-0.0013523754896596074,
-0.039425306022167206,
-0.002213777042925358,
-0.09368390589952469,
-0.09146501868963242,
-0.027849003672599792,
0.03523486480116844,
-0.01205715723335743,
-0.030060777440667152,
-0.04412280023097992,
0.006746986880898476,
-0.009274973534047604,
-0.024074390530586243,
0.03988582640886307,
-0.033483996987342834,
0.09025856107473373,
0.07453916221857071,
0.04995652660727501,
0.00813916977494955,
0.03271138295531273,
-0.09057772159576416,
0.08555008471012115,
-0.0973452627658844,
0.057548779994249344,
-0.01698150858283043,
0.06444532424211502,
-0.10414740443229675,
-0.07612119615077972,
0.013514584861695766,
0.04745754599571228,
0.06314043700695038,
0.02843611128628254,
-0.12481743097305298,
0.028769390657544136,
0.14624738693237305,
-0.11275433748960495,
-0.1290428340435028,
0.10551916807889938,
0.0005736952880397439,
0.03293994441628456,
0.05844772234559059,
0.1376628428697586,
0.15454250574111938,
-0.08063636720180511,
-0.025502588599920273,
0.07771827280521393,
0.05972503125667572,
-0.06440727412700653,
0.06133950129151344,
0.010196378454566002,
0.009863056242465973,
0.03377170115709305,
0.057042792439460754,
0.0515463687479496,
0.0034767251927405596,
-0.0345652736723423,
-0.0425897091627121,
-0.08828777819871902,
-0.05902048945426941,
-0.00867407862097025,
0.02098696678876877,
-0.057858243584632874,
-0.05079767107963562,
-0.0030772732570767403,
0.1649833619594574,
-0.09470454603433609,
0.029417388141155243,
-0.06925737857818604,
-0.03895147889852524,
-0.0813378170132637,
0.02997659519314766,
-0.11861009150743484,
0.030971329659223557,
0.06775858253240585,
-0.050994548946619034,
0.037169795483350754,
0.08434520661830902,
0.004938673693686724,
0.021803628653287888,
-0.05675486475229263,
-0.04431559145450592,
-0.0442015565931797,
-0.06700965762138367,
-0.10431588441133499,
-0.03247610107064247,
-0.09353014081716537,
-0.03186872974038124,
-0.06892307102680206,
-0.15939457714557648,
-0.0028181057423353195,
-0.0012851913925260305,
0.03328594192862511,
0.03147488832473755,
-0.03524031117558479,
0.03899287432432175,
0.056157346814870834,
-0.040692608803510666,
-0.08229851722717285,
0.017889590933918953,
0.0320037342607975,
-0.08433236926794052,
-0.026015838608145714,
-0.08842125535011292,
-0.06151339411735535,
0.06078384816646576,
0.09923932701349258,
-0.10595385730266571,
-0.0023553899955004454,
-0.026711169630289078,
-0.051777686923742294,
-0.06691434234380722,
-0.059545472264289856,
0.16119793057441711,
0.01401884388178587,
0.16007602214813232,
-0.13705343008041382,
-0.06642712652683258,
-0.025510359555482864,
0.009351659566164017,
0.023930255323648453,
0.1584234982728958,
0.0254217516630888,
-0.09065660089254379,
0.039469338953495026,
-0.024850863963365555,
-0.049669958651065826,
0.17276597023010254,
-0.01552106998860836,
-0.0769452378153801,
-0.0026914936024695635,
0.10912180691957474,
-0.013944568112492561,
0.16548237204551697,
-0.07049085944890976,
0.006542931776493788,
-0.004344636108726263,
0.021352168172597885,
0.040100473910570145,
-0.12446857243776321,
0.024770868942141533,
0.03847268596291542,
-0.07003218680620193,
-0.023965680971741676,
-0.02196495607495308,
-0.03807809203863144,
0.03925984725356102,
0.015546499751508236,
0.038245219737291336,
-0.02383207343518734,
-0.034502044320106506,
-0.10207092761993408,
0.1799120008945465,
-0.07362696528434753,
-0.21109391748905182,
-0.17463189363479614,
0.10479134321212769,
-0.027021773159503937,
-0.01950106769800186,
0.03954271227121353,
-0.0975128710269928,
-0.0633355975151062,
-0.10352525860071182,
0.11677300184965134,
-0.10637541115283966,
-0.007666715420782566,
-0.021740028634667397,
0.0628867819905281,
0.05385919660329819,
-0.16487663984298706,
0.024837935343384743,
-0.003899826668202877,
0.012925499118864536,
-0.012752651236951351,
-0.04235092177987099,
0.08248671144247055,
0.10331280529499054,
-0.0636647418141365,
0.02089543268084526,
0.002860824577510357,
0.1671789139509201,
-0.05401143431663513,
0.04861684516072273,
0.1862344890832901,
0.02094249613583088,
0.02577877789735794,
0.05626088008284569,
0.012851790525019169,
-0.0969662293791771,
0.06288411468267441,
0.058804795145988464,
-0.0323740616440773,
-0.22326014935970306,
-0.01367904618382454,
-0.0784846693277359,
0.0697295293211937,
0.11688800156116486,
0.05766972899436951,
-0.16254907846450806,
0.01481486763805151,
-0.0042137582786381245,
0.15225882828235626,
-0.03276422992348671,
0.05274880677461624,
0.014663700945675373,
0.005472386255860329,
-0.004055328201502562,
-0.10189855098724365,
0.0115758515894413,
0.07746223360300064,
0.11749466508626938,
0.19796422123908997,
-0.08549696207046509,
0.17999745905399323,
0.016805777326226234,
0.09877102822065353,
0.036825016140937805,
0.08756313472986221,
-0.13560836017131805,
0.008686083368957043,
0.009340927004814148,
-0.020226532593369484,
-0.055813491344451904,
0.04498428478837013,
-0.03421352431178093,
0.07725401967763901,
-0.056437212973833084,
0.002170443069189787,
0.019621463492512703,
0.2075292021036148,
0.06616053730249405,
-0.14633628726005554,
-0.12387464940547943,
0.026069531217217445,
-0.09317793697118759,
-0.11025995761156082,
0.0709744319319725,
0.2359153926372528,
-0.0613463930785656,
0.015987418591976166,
-0.007238148245960474,
0.13407520949840546,
-0.10466166585683823,
-0.018469253554940224,
0.03386295586824417,
0.059259239584207535,
0.00977677758783102,
0.12154196947813034,
-0.2661893665790558,
0.06968908756971359,
0.015901997685432434,
0.0885830819606781,
-0.015774108469486237,
0.06218600645661354,
-0.056134551763534546,
0.009341690689325333,
0.07864629477262497,
0.012098625302314758,
-0.052526094019412994,
-0.19502592086791992,
-0.03772952780127525,
0.02840331941843033,
0.04094608500599861,
-0.021847276017069817,
0.07835140824317932,
-0.02298913709819317,
0.038672126829624176,
-0.03689899668097496,
-0.1400708258152008,
-0.06208118423819542,
-0.13212937116622925,
-0.030375996604561806,
0.009936031885445118,
-0.04010908305644989,
-0.026910163462162018,
0.041487034410238266,
0.05461955443024635,
0.22324733436107635,
-0.16006283462047577,
-0.0792151540517807,
-0.09425156563520432,
0.060064222663640976,
0.13571158051490784,
-0.08656447380781174,
0.012243067845702171,
0.01491527073085308,
0.055998604744672775,
-0.04366770014166832,
-0.06668076664209366,
0.028666997328400612,
-0.05617036670446396,
-0.09206898510456085,
-0.03143070265650749,
0.11188267916440964,
-0.022129151970148087,
0.03895263746380806,
-0.001961018657311797,
-0.08015747368335724,
-0.033382151275873184,
-0.13242889940738678,
-0.06372473388910294,
0.00455377297475934,
0.028990188613533974,
-0.013562195934355259,
-0.12515795230865479,
0.07340636104345322,
0.011421513743698597,
-0.09979422390460968,
0.06976744532585144,
0.17141340672969818,
-0.0708998441696167,
0.038796354085206985,
0.09939250349998474,
-0.06280640512704849,
-0.17193736135959625,
-0.03971117362380028,
0.03715669736266136,
0.08308502286672592,
-0.022891845554113388,
-0.14429758489131927,
0.06188264489173889,
0.012454678304493427,
0.015195227228105068,
0.029015909880399704,
-0.2842688262462616,
-0.12954726815223694,
-0.005826987791806459,
0.0787675678730011,
0.04553023725748062,
-0.10171825438737869,
-0.050921306014060974,
-0.06461650878190994,
-0.07864421606063843,
0.050153981894254684,
0.07130587100982666,
0.11324966698884964,
-0.0458759069442749,
0.021635346114635468,
0.04155338928103447,
-0.030161475762724876,
0.06523074954748154,
-0.02001243270933628,
0.10232376307249069,
-0.017367547377943993,
0.021237999200820923,
0.041129473596811295,
-0.058274079114198685,
0.18921597301959991,
-0.1649182140827179,
0.09442891925573349,
-0.17887133359909058,
-0.044311534613370895,
-0.030440539121627808,
-0.002345070708543062,
-0.04153410345315933,
-0.05209007114171982,
-0.10397142171859741,
0.02791118063032627,
0.052176352590322495,
-0.030774429440498352,
0.0384979285299778,
-0.026937488466501236,
-0.04307103157043457,
0.0873158648610115,
0.07332242280244827,
-0.01728682592511177,
-0.12879636883735657,
0.03229295834898949,
0.019374312832951546,
0.09470793604850769,
-0.19191937148571014,
0.02707384154200554,
0.10584704577922821,
0.017285175621509552,
0.10636680573225021,
0.0035660313442349434,
-0.08665554970502853,
0.026539862155914307,
0.0701514482498169,
-0.06780155748128891,
-0.1013542041182518,
-0.01910775899887085,
-0.04718252643942833,
-0.09127246588468552,
0.026222389191389084,
0.0863712802529335,
-0.06217150390148163,
-0.01586620695888996,
-0.005638557951897383,
0.022341912612318993,
-0.07340096682310104,
0.17448848485946655,
0.018141113221645355,
0.08165240287780762,
-0.0615837424993515,
0.08672308176755905,
0.10424868762493134,
-0.11769042909145355,
0.022388316690921783,
0.1727554202079773,
-0.08091620355844498,
-0.026796488091349602,
0.08542276173830032,
0.12374066561460495,
-0.007337938994169235,
-0.05112510174512863,
-0.09659547358751297,
-0.08103471249341965,
0.02292807586491108,
0.026859721168875694,
0.061970412731170654,
0.08848889917135239,
-0.02722930908203125,
-0.0011338739423081279,
-0.1194416731595993,
0.10578332096338272,
0.06636888533830643,
0.05056228116154671,
-0.13402046263217926,
0.12029730528593063,
0.038068030029535294,
0.08244996517896652,
0.003781090024858713,
0.028824690729379654,
-0.1080218106508255,
0.03219453617930412,
-0.020248284563422203,
0.0312504842877388,
-0.003489892929792404,
0.048645224422216415,
-0.03505268320441246,
0.03490189090371132,
-0.027490708976984024,
0.05237702280282974,
-0.03606511652469635,
-0.03054082952439785,
-0.041912879794836044,
0.038875430822372437,
-0.06452932953834534,
-0.02072961814701557,
0.01238863542675972,
-0.0810776948928833,
0.09369227290153503,
-0.07425862550735474,
-0.011537596583366394,
-0.005140297580510378,
0.0033035685773938894,
0.062244489789009094,
0.021158017218112946,
0.04105421155691147,
-0.011464695446193218,
-0.01713511534035206,
0.027940239757299423,
0.019351501017808914,
-0.015583804808557034,
-0.005838179960846901,
0.07534506171941757,
-0.15560318529605865,
-0.08606263995170593,
-0.08376282453536987,
-0.08134030550718307,
-0.059847235679626465,
0.07850166410207748,
0.0919915959239006,
0.06862114369869232,
0.08828716725111008,
-0.03777824714779854,
0.009600106626749039,
-0.15761610865592957,
-0.0427963025867939,
0.050879571586847305,
-0.006224288139492273,
-0.12122467905282974,
-0.0366632305085659,
0.05216510221362114,
-0.042551204562187195,
0.10972747951745987,
-0.009686167351901531,
0.054922573268413544,
-0.0064656720496714115,
-0.06691819429397583,
-0.047667745500802994,
0.011918623931705952,
0.1671806275844574,
-0.11222372949123383,
0.0022334433160722256,
-0.010624518617987633,
0.007059760857373476,
0.02658681757748127,
0.1756371706724167,
0.09127971529960632,
0.1264331340789795,
0.040095485746860504,
0.07755524665117264,
-0.04452955350279808,
-0.031782567501068115,
-0.13013260066509247,
0.08781450986862183,
-0.030343489721417427,
0.05192716419696808,
-0.0484110526740551,
0.12781256437301636,
0.1036609336733818,
-0.14096102118492126,
0.10744807869195938,
0.00958411768078804,
-0.0903085246682167,
-0.04014908894896507,
-0.0964517667889595,
-0.0499090701341629,
-0.10196805745363235,
0.0042650881223380566,
-0.10603096336126328,
0.02484593540430069,
0.059910569339990616,
0.030026642605662346,
-0.03312927857041359,
0.16001659631729126,
-0.009416733868420124,
-0.05377281829714775,
0.04046330600976944,
0.045630525797605515,
0.031885720789432526,
0.09175725281238556,
0.03453094884753227,
0.06681642681360245,
-0.0666627362370491,
0.0617109015583992,
0.040226906538009644,
-0.008415888994932175,
0.0024596648290753365,
0.01590012013912201,
-0.011028437875211239,
-0.043677397072315216,
-0.011858191341161728,
0.0708344429731369,
0.15402251482009888,
0.04864515736699104,
-0.04589641094207764,
-0.04679447039961815,
0.21039816737174988,
-0.054790958762168884,
-0.05103226751089096,
-0.1252451092004776,
0.14174531400203705,
0.05446399003267288,
0.0133226802572608,
0.020307272672653198,
-0.0755005031824112,
-0.03198316693305969,
0.2201705425977707,
0.04406823590397835,
-0.032703500241041183,
-0.027745932340621948,
-0.0060067069716751575,
-0.007467108778655529,
-0.03437956050038338,
0.14581803977489471,
0.003491041250526905,
0.21980828046798706,
0.008883191272616386,
0.01369552779942751,
-0.03740391507744789,
-0.05078544467687607,
-0.017360640689730644,
0.20230534672737122,
-0.03702009841799736,
0.028823092579841614,
-0.09829673171043396,
-0.0190521739423275,
0.029727410525083542,
-0.11860966682434082,
0.1253812611103058,
-0.13332131505012512,
-0.07589229196310043,
0.009547372348606586,
0.07513116300106049,
-0.045196156948804855,
0.039034631103277206,
-0.017812741920351982,
0.06310710310935974,
0.0611409991979599,
-0.03023289144039154,
-0.09487167745828629,
-0.15586218237876892,
0.04065798968076706,
-0.022044779732823372,
0.1315365731716156,
0.01870523765683174,
0.06432921439409256,
0.07845787703990936,
0.0037943385541439056,
-0.08784446120262146,
0.09568388015031815,
0.03170197457075119,
-0.001264679478481412,
0.05175125226378441,
0.13335463404655457,
-0.03751710429787636,
0.1655387431383133,
0.004917070269584656,
-0.02935764193534851,
-0.029869576916098595,
-0.03430942818522453,
-0.0027672755531966686,
-0.1517149657011032,
0.0037994165904819965,
-0.0574691966176033,
0.1331525593996048,
0.19675396382808685,
-0.04625743255019188,
-0.0268776323646307,
-0.053672291338443756,
0.08695726096630096,
-0.02090378850698471,
0.09589537233114243,
0.00473918579518795,
-0.15846799314022064,
0.019866943359375,
-0.005561565048992634,
0.016833679750561714,
-0.19051408767700195,
-0.054004643112421036,
-0.031560592353343964,
-0.030124839395284653,
-0.09777987003326416,
0.14383062720298767,
0.07048807293176651,
0.03868873417377472,
-0.03932628408074379,
-0.1847158968448639,
-0.005135838873684406,
0.04810941964387894,
-0.12071530520915985,
-0.11863734573125839
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_java_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/java/large_model.ipynb).
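Summary length and search strategy can be adjusted through the standard generation keyword arguments, which the pipeline forwards to the underlying model; the values below are illustrative rather than tuned defaults:
```python
# Generation kwargs are passed through to model.generate(); these values are
# illustrative examples, not recommended settings.
pipeline([tokenized_code], max_length=32, num_beams=4)
```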
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_java_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10009463876485825,
0.06650538742542267,
-0.000787688884884119,
0.1080809235572815,
0.04100439324975014,
0.024272188544273376,
0.03394930437207222,
0.11013615876436234,
-0.027234187349677086,
0.05645623803138733,
0.045518409460783005,
-0.06411129236221313,
0.0694429948925972,
0.19530917704105377,
0.01742921769618988,
-0.13977070152759552,
-0.031610578298568726,
0.04427669942378998,
-0.06591898202896118,
0.1082656979560852,
0.0775763988494873,
-0.0859663262963295,
0.07655920833349228,
-0.050509799271821976,
-0.12453184276819229,
0.049173109233379364,
-0.02449238859117031,
-0.02239224500954151,
0.09546685963869095,
0.06994069367647171,
0.12080610543489456,
-0.02375432290136814,
0.06888799369335175,
-0.20746451616287231,
0.002205084078013897,
0.028956294059753418,
0.06382117420434952,
0.044313229620456696,
0.05065414682030678,
0.08795841038227081,
0.10704293102025986,
-0.01669815368950367,
0.033009178936481476,
0.058521319180727005,
-0.0657648965716362,
-0.05374206230044365,
-0.06795709580183029,
0.06788621842861176,
0.09174134582281113,
0.09943698346614838,
-0.005068641621619463,
0.03361901640892029,
-0.07689224183559418,
0.08359362185001373,
0.1254282295703888,
-0.22016514837741852,
-0.02110118605196476,
0.10286889225244522,
0.09089408814907074,
0.050146546214818954,
-0.08376394957304001,
-0.0405367910861969,
0.11178047955036163,
0.04533759504556656,
0.06861624121665955,
-0.09395916759967804,
-0.04765046760439873,
-0.004015498328953981,
-0.04725159332156181,
-0.04479502886533737,
0.18552808463573456,
0.0407889150083065,
-0.0533742792904377,
-0.10141725838184357,
-0.04297646880149841,
-0.19034360349178314,
0.04342981427907944,
0.0025859323795884848,
0.007864846847951412,
-0.009936008602380753,
0.020576156675815582,
-0.008111616596579552,
-0.09425141662359238,
-0.12047155201435089,
0.029585113748908043,
0.0024925251491367817,
0.055448561906814575,
0.03334292024374008,
-0.03992784768342972,
0.08813472837209702,
0.04590572044253349,
-0.04828057065606117,
-0.009611360728740692,
0.011689752340316772,
-0.1064949631690979,
0.0042151655070483685,
0.0010024223010987043,
-0.07651516795158386,
-0.009569037705659866,
0.05346282571554184,
-0.10170076787471771,
0.07229696959257126,
0.0951111912727356,
0.01680537313222885,
0.013967711478471756,
0.2069379687309265,
0.043740030378103256,
-0.15263450145721436,
0.023962724953889847,
0.029594402760267258,
-0.008127035573124886,
0.014848624356091022,
-0.05066457390785217,
-0.05163473263382912,
0.021853312849998474,
0.07216069847345352,
-0.1304122805595398,
0.018306247889995575,
-0.06259189546108246,
-0.015531539916992188,
0.07651250809431076,
-0.1292041540145874,
0.03343704342842102,
0.011948110535740852,
-0.055567558854818344,
-0.041566211730241776,
0.09140974283218384,
-0.1325703114271164,
-0.12411359697580338,
0.022260699421167374,
-0.04553678631782532,
-0.036585789173841476,
-0.12456035614013672,
-0.11350813508033752,
-0.0043463450856506824,
-0.017164042219519615,
-0.004145401529967785,
-0.09963104873895645,
-0.0850672721862793,
-0.020324228331446648,
0.03878394514322281,
-0.004466168582439423,
-0.027771102264523506,
-0.04570261389017105,
0.0025826634373515844,
-0.00704597495496273,
-0.025052623823285103,
0.025944793596863747,
-0.03215599060058594,
0.09765516966581345,
0.07349414378404617,
0.04992154240608215,
0.0000059032126955571584,
0.028707729652523994,
-0.08738366514444351,
0.07978186011314392,
-0.10414191335439682,
0.05562060698866844,
-0.013030579313635826,
0.06181707605719566,
-0.10750119388103485,
-0.07478000223636627,
0.010798018425703049,
0.05166004225611687,
0.07517561316490173,
0.038385290652513504,
-0.14823968708515167,
0.03338688984513283,
0.1467953473329544,
-0.1135871410369873,
-0.14226050674915314,
0.10292429476976395,
-0.00769878551363945,
0.04751714691519737,
0.06202142313122749,
0.13048341870307922,
0.15460240840911865,
-0.08653531968593597,
-0.031652700155973434,
0.07698722928762436,
0.048102930188179016,
-0.0734989121556282,
0.052135515958070755,
0.022160213440656662,
-0.011279027909040451,
0.022216683253645897,
0.060666780918836594,
0.05948810651898384,
-0.0022781207226216793,
-0.03643975034356117,
-0.034724246710538864,
-0.09598358720541,
-0.06159113720059395,
-0.007678365800529718,
0.023708375170826912,
-0.05379737168550491,
-0.04937385395169258,
0.005633668974041939,
0.1613837629556656,
-0.09798488020896912,
0.026873130351305008,
-0.0753767117857933,
-0.047386132180690765,
-0.07800421863794327,
0.02557513862848282,
-0.11183106154203415,
0.03388096019625664,
0.06522705405950546,
-0.039784979075193405,
0.04276095703244209,
0.08770627528429031,
0.0031126036774367094,
0.0169028639793396,
-0.05637074261903763,
-0.0417931042611599,
-0.03533509373664856,
-0.06818334758281708,
-0.10928709805011749,
-0.02886256016790867,
-0.09631086885929108,
-0.02314651571214199,
-0.06744738668203354,
-0.17192783951759338,
-0.002045154571533203,
-0.009398546069860458,
0.03128446638584137,
0.031043680384755135,
-0.02943131886422634,
0.03611967712640762,
0.05115059018135071,
-0.04342717304825783,
-0.08215779811143875,
0.016948753967881203,
0.037285398691892624,
-0.08646734058856964,
-0.025731436908245087,
-0.08683417737483978,
-0.06634397059679031,
0.07300439476966858,
0.09829981625080109,
-0.11951112747192383,
-0.0047627962194383144,
-0.029201576486229897,
-0.04845040664076805,
-0.05817513167858124,
-0.05730365589261055,
0.15741126239299774,
0.012860037386417389,
0.1580010950565338,
-0.13988611102104187,
-0.06751705706119537,
-0.029246637597680092,
0.013290636241436005,
0.030831066891551018,
0.1504487842321396,
0.030761057510972023,
-0.10644683986902237,
0.03330203890800476,
-0.035303667187690735,
-0.046563297510147095,
0.16661445796489716,
-0.0172440093010664,
-0.07218268513679504,
-0.00009811434574658051,
0.10426422208547592,
-0.0021035615354776382,
0.19027070701122284,
-0.047399476170539856,
0.004589795600622892,
-0.008072334341704845,
0.011541959829628468,
0.040089260786771774,
-0.12396527826786041,
0.023085201159119606,
0.03714950382709503,
-0.06650446355342865,
-0.017981063574552536,
-0.027549482882022858,
-0.0385449081659317,
0.042825471609830856,
0.013968836516141891,
0.033629823476076126,
-0.021415917202830315,
-0.03485677391290665,
-0.10511917620897293,
0.17664380371570587,
-0.07395131140947342,
-0.2170315980911255,
-0.17132535576820374,
0.11863862723112106,
-0.011126634664833546,
-0.01643422059714794,
0.028798066079616547,
-0.09080775082111359,
-0.0627245083451271,
-0.1046965941786766,
0.12385773658752441,
-0.09855857491493225,
-0.002959679113700986,
-0.02501904033124447,
0.0641573965549469,
0.054717376828193665,
-0.1681433767080307,
0.03344301879405975,
-0.00882337149232626,
0.016118671745061874,
-0.014350272715091705,
-0.06341049075126648,
0.08364448696374893,
0.10450848191976547,
-0.07393139600753784,
0.018899990245699883,
-0.0020771194249391556,
0.16589605808258057,
-0.05960511043667793,
0.05931389331817627,
0.16976825892925262,
0.01831885613501072,
0.02000420168042183,
0.05357475206255913,
0.00611415971070528,
-0.09691080451011658,
0.06958970427513123,
0.04756540432572365,
-0.026694055646657944,
-0.22195717692375183,
-0.01789174973964691,
-0.07606478035449982,
0.07293027639389038,
0.11812764406204224,
0.045138705521821976,
-0.15871338546276093,
0.021340250968933105,
-0.0059889438562095165,
0.16860170662403107,
-0.028808487579226494,
0.053685929626226425,
-0.00658561522141099,
0.013608557172119617,
-0.0053994613699615,
-0.10448119789361954,
0.008948528207838535,
0.07320282608270645,
0.11265108734369278,
0.19644659757614136,
-0.09551622718572617,
0.17494866251945496,
0.007700867485255003,
0.11395253986120224,
0.042472466826438904,
0.10661356151103973,
-0.13230884075164795,
0.013332265429198742,
0.008042129687964916,
-0.019079910591244698,
-0.07312282919883728,
0.04463103413581848,
-0.044332731515169144,
0.08586829900741577,
-0.06576436012983322,
0.009632994420826435,
0.016637876629829407,
0.19251079857349396,
0.07316833734512329,
-0.15796853601932526,
-0.12903665006160736,
0.01640307530760765,
-0.09022583812475204,
-0.11150144040584564,
0.07313191145658493,
0.23898844420909882,
-0.05511647090315819,
0.00862832646816969,
-0.011305841617286205,
0.1349593698978424,
-0.0989123061299324,
-0.021866902709007263,
0.03355420380830765,
0.0579972080886364,
0.004217684268951416,
0.11832550913095474,
-0.2676333785057068,
0.07617370784282684,
0.01958641968667507,
0.09273934364318848,
-0.01724756881594658,
0.054038893431425095,
-0.05326097831130028,
0.0019004227360710502,
0.07948503643274307,
0.013322866521775723,
-0.043063871562480927,
-0.19291909039020538,
-0.04078350588679314,
0.02693610079586506,
0.0394885279238224,
-0.006371417548507452,
0.08276981115341187,
-0.020186301320791245,
0.03967316821217537,
-0.029506564140319824,
-0.12845148146152496,
-0.07376907020807266,
-0.13136206567287445,
-0.03981095179915428,
0.004249850753694773,
-0.040652066469192505,
-0.0289629939943552,
0.04688546434044838,
0.059151969850063324,
0.22270117700099945,
-0.1592138260602951,
-0.07027487456798553,
-0.0905754491686821,
0.05862126871943474,
0.1369219273328781,
-0.08160529285669327,
0.01166874822229147,
0.02671620436012745,
0.05953274294734001,
-0.04485806077718735,
-0.0713300034403801,
0.0306636281311512,
-0.05447438359260559,
-0.07865490764379501,
-0.03492295742034912,
0.10821574926376343,
-0.006738279014825821,
0.0462992824614048,
0.01111722644418478,
-0.08988934010267258,
-0.029671205207705498,
-0.12631094455718994,
-0.07814387232065201,
-0.01451875176280737,
0.037555158138275146,
-0.012810858897864819,
-0.1351732760667801,
0.06111641228199005,
-0.002014594152569771,
-0.09453701227903366,
0.06377630680799484,
0.15216566622257233,
-0.07220853865146637,
0.029176000505685806,
0.07635834068059921,
-0.06303685903549194,
-0.17855308949947357,
-0.030192118138074875,
0.03883257880806923,
0.0857071503996849,
-0.025961972773075104,
-0.13555894792079926,
0.06113743036985397,
-0.0036157427821308374,
0.021858910098671913,
0.023580757901072502,
-0.2575070858001709,
-0.1266746073961258,
-0.0009604159276932478,
0.07920020818710327,
0.03821568191051483,
-0.09664922952651978,
-0.0476546473801136,
-0.0675845518708229,
-0.07496398687362671,
0.07222160696983337,
0.06938156485557556,
0.10759416222572327,
-0.04090544208884239,
0.02422681637108326,
0.04364478215575218,
-0.030485332012176514,
0.05507237836718559,
-0.017074935138225555,
0.10483862459659576,
-0.020092692226171494,
0.013011641800403595,
0.04793849214911461,
-0.05921802297234535,
0.18800301849842072,
-0.1708230972290039,
0.0995001271367073,
-0.1852114498615265,
-0.039216939359903336,
-0.032358139753341675,
0.0008733801660127938,
-0.04213717579841614,
-0.04869803786277771,
-0.11431930214166641,
0.04241914674639702,
0.058564022183418274,
-0.025981629267334938,
0.02919795550405979,
-0.01701766811311245,
-0.04419070854783058,
0.06878521293401718,
0.08870779722929001,
-0.006610153708606958,
-0.10689244419336319,
0.040141042321920395,
0.021142199635505676,
0.10127706825733185,
-0.1834372729063034,
0.02841620333492756,
0.1063365712761879,
0.012386000715196133,
0.09941744059324265,
0.008191927336156368,
-0.08742435276508331,
0.017443174496293068,
0.0687144473195076,
-0.07030145823955536,
-0.07384626567363739,
-0.02007526531815529,
-0.023758718743920326,
-0.08740691095590591,
0.028432529419660568,
0.08583462983369827,
-0.06677136570215225,
-0.011348075233399868,
-0.004935926757752895,
0.013261020183563232,
-0.07993508875370026,
0.17353777587413788,
0.020347004756331444,
0.08126670122146606,
-0.05188523977994919,
0.08050813525915146,
0.09645993262529373,
-0.11024687439203262,
0.028040913864970207,
0.1617898792028427,
-0.08458837121725082,
-0.024413080886006355,
0.10813616216182709,
0.13542744517326355,
-0.016784004867076874,
-0.05176176503300667,
-0.09558633714914322,
-0.08214050531387329,
0.017446840181946754,
0.05255487188696861,
0.0608677975833416,
0.09178191423416138,
-0.019849607720971107,
-0.00457002641633153,
-0.12860645353794098,
0.09910179674625397,
0.07271599769592285,
0.047364480793476105,
-0.1264299601316452,
0.14405572414398193,
0.03152913972735405,
0.07599808275699615,
0.0003625962417572737,
0.037902045994997025,
-0.1128506064414978,
0.0350000374019146,
-0.02427619695663452,
0.03868655115365982,
0.0008043860434554517,
0.045039981603622437,
-0.04045052081346512,
0.042200129479169846,
-0.028725439682602882,
0.04776237532496452,
-0.034487612545490265,
-0.024470999836921692,
-0.04082692787051201,
0.030847907066345215,
-0.05591122806072235,
-0.021474232897162437,
0.010741504840552807,
-0.08610466867685318,
0.0852426290512085,
-0.07173984497785568,
-0.009969866834580898,
-0.0025121651124209166,
0.013379828073084354,
0.05425434187054634,
0.010119249112904072,
0.048057787120342255,
-0.004874965641647577,
-0.005456794518977404,
0.023962585255503654,
0.024796441197395325,
-0.013153791427612305,
-0.00882062129676342,
0.08369090408086777,
-0.14245426654815674,
-0.08417481184005737,
-0.08697579801082611,
-0.0681748241186142,
-0.062100812792778015,
0.08339566737413406,
0.09020613133907318,
0.06755919754505157,
0.08692484349012375,
-0.03802699223160744,
0.0063073416240513325,
-0.16633515059947968,
-0.04135023429989815,
0.052859608083963394,
-0.004804133903235197,
-0.11476808041334152,
-0.036825116723775864,
0.06039554253220558,
-0.03786570578813553,
0.10603596270084381,
-0.01002019364386797,
0.04330487549304962,
-0.009271113201975822,
-0.07466047257184982,
-0.057142503559589386,
0.012687456794083118,
0.18286322057247162,
-0.1125587448477745,
0.006163707468658686,
-0.011270834133028984,
0.006550596095621586,
0.025904113426804543,
0.1692751944065094,
0.10505902022123337,
0.12684324383735657,
0.025069529190659523,
0.08359068632125854,
-0.047735899686813354,
-0.035682737827301025,
-0.10643069446086884,
0.07779932767152786,
-0.029360266402363777,
0.04255217686295509,
-0.03588186204433441,
0.1341255009174347,
0.08820827305316925,
-0.14194829761981964,
0.10556007921695709,
0.0006007258780300617,
-0.09820037335157394,
-0.03396718576550484,
-0.08892925083637238,
-0.04374891147017479,
-0.09989136457443237,
0.0019611832685768604,
-0.10735996067523956,
0.005774172488600016,
0.04689762741327286,
0.028495939448475838,
-0.03426686301827431,
0.1604510247707367,
-0.021675094962120056,
-0.0537983663380146,
0.0417020320892334,
0.04947285354137421,
0.023386036977171898,
0.07713276147842407,
0.03394006937742233,
0.06436016410589218,
-0.064149409532547,
0.060959458351135254,
0.03440907225012779,
-0.007509732153266668,
0.013805001974105835,
0.03609592840075493,
-0.010229576379060745,
-0.04225078970193863,
-0.018872195854783058,
0.0761382058262825,
0.15172231197357178,
0.04625660553574562,
-0.039227914065122604,
-0.04906502366065979,
0.19669856131076813,
-0.05735928192734718,
-0.05352411046624184,
-0.12536141276359558,
0.14435061812400818,
0.04411955550312996,
0.014389005489647388,
0.01981045864522457,
-0.075264573097229,
-0.021614454686641693,
0.23884017765522003,
0.04809729382395744,
-0.05066170543432236,
-0.03022715263068676,
-0.01073538139462471,
-0.007439516019076109,
-0.04486352205276489,
0.14854536950588226,
0.01008503045886755,
0.22277802228927612,
0.007640248164534569,
-0.003263970836997032,
-0.04067549109458923,
-0.04774961620569229,
0.0015691007720306516,
0.19800817966461182,
-0.034825801849365234,
0.029343407601118088,
-0.09799391031265259,
-0.018953166902065277,
0.02223844826221466,
-0.14220969378948212,
0.12330760806798935,
-0.13471488654613495,
-0.06947990506887436,
0.006110829301178455,
0.07023634761571884,
-0.05230846256017685,
0.04077931493520737,
-0.021708723157644272,
0.07475157827138901,
0.06142207235097885,
-0.02617720328271389,
-0.09894537925720215,
-0.14914706349372864,
0.043181754648685455,
-0.02857111021876335,
0.12780524790287018,
0.018157372251152992,
0.07429876178503036,
0.07577621191740036,
0.005741599947214127,
-0.08234445750713348,
0.08927404880523682,
0.029452800750732422,
0.004649786744266748,
0.04968802258372307,
0.12683485448360443,
-0.041146356612443924,
0.17437094449996948,
0.00260111503303051,
-0.03727349638938904,
-0.027705233544111252,
-0.03258661925792694,
-0.010242984630167484,
-0.15387126803398132,
0.0005360712530091405,
-0.05674917995929718,
0.13933080434799194,
0.19582219421863556,
-0.048670876771211624,
-0.015814056620001793,
-0.053406231105327606,
0.0864698514342308,
-0.013066049665212631,
0.08745740354061127,
0.005180382635444403,
-0.15455856919288635,
0.006948081310838461,
-0.03204597160220146,
0.009981953538954258,
-0.19409547746181488,
-0.04733438789844513,
-0.04134218767285347,
-0.04285025596618652,
-0.10072483867406845,
0.1436881721019745,
0.07253611832857132,
0.04347997531294823,
-0.04374157264828682,
-0.14232157170772552,
-0.004980516619980335,
0.04794207587838173,
-0.12153176218271255,
-0.12148264795541763
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
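Since whitespace-tokenized input performs best, a rough way to approximate the training format is sketched below; this regex splitter is a hypothetical stand-in for the parser-based preprocessing actually used by CodeTrans:

```python
import re

# Hypothetical helper (NOT the original CodeTrans preprocessing, which used
# proper language parsers): space-separate a raw javascript snippet so it
# roughly matches the whitespace-tokenized format seen in the example below.
def naive_tokenize(code: str) -> str:
    pattern = r"[A-Za-z_$][A-Za-z0-9_$]*|\d+|'[^']*'|[=!<>]=+|&&|\|\||\S"
    return " ".join(re.findall(pattern, code))

print(naive_tokenize("function add(a, b) { return a + b; }"))
# -> function add ( a , b ) { return a + b ; }
```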
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
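# Build a summarization pipeline around the multi-task CodeTrans checkpoint;
# device=0 pins it to the first CUDA GPU (use device=-1 to run on CPU).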
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
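The call returns one dict per input, with the generated documentation string under the standard `summary_text` key of `SummarizationPipeline` outputs.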
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/javascript/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14623047411441803,
-0.033762071281671524,
-0.0004096078628208488,
0.12788809835910797,
0.12676063179969788,
0.0255763940513134,
0.05189855024218559,
0.06248978525400162,
-0.02888994663953781,
0.024647925049066544,
0.05466973036527634,
-0.0005658566951751709,
0.02834307961165905,
0.19412364065647125,
0.012909176759421825,
-0.09339192509651184,
-0.0394066721200943,
0.04491733759641647,
-0.04280839487910271,
0.1362212598323822,
0.07647832483053207,
-0.06635113805532455,
0.05712641030550003,
-0.0578799732029438,
-0.24938833713531494,
0.06195899099111557,
0.018357135355472565,
-0.06086213141679764,
0.09127986431121826,
0.04963357746601105,
0.12317846715450287,
-0.0028193173930048943,
0.014113661833107471,
-0.14319144189357758,
0.013626388274133205,
0.00968816690146923,
0.02704685553908348,
0.013963687233626842,
0.03831466659903526,
0.02039065770804882,
0.15055866539478302,
-0.00048589977086521685,
0.030770951882004738,
0.07582830637693405,
-0.07709299027919769,
-0.10393857955932617,
-0.002346054883673787,
0.0010666182497516274,
0.04842589795589447,
0.11388256400823593,
-0.02042800560593605,
0.12188265472650528,
-0.1537599116563797,
0.13090120255947113,
0.10146284848451614,
-0.22662904858589172,
-0.018332822248339653,
0.09704894572496414,
0.07957369834184647,
0.0682053193449974,
-0.05308767035603523,
-0.05437459051609039,
0.10686884820461273,
0.04456239566206932,
0.03596150130033493,
-0.09777911752462387,
-0.08355077356100082,
0.01132901106029749,
-0.0717964768409729,
-0.07098453491926193,
0.22575469315052032,
0.022399114444851875,
-0.0811486765742302,
-0.04631175473332405,
-0.030739087611436844,
-0.1200210303068161,
0.02704271301627159,
0.04588950052857399,
-0.002248650649562478,
-0.03474407643079758,
-0.0301838181912899,
0.030652157962322235,
-0.07308743894100189,
-0.14693951606750488,
0.022864345461130142,
0.10049665719270706,
0.04562408849596977,
0.019824376329779625,
-0.11000681668519974,
0.10666403919458389,
0.04846268147230148,
-0.06957478821277618,
-0.025142276659607887,
-0.013953061774373055,
-0.09459137171506882,
0.021545758470892906,
-0.050149086862802505,
-0.1664704978466034,
0.02981199137866497,
0.03600216284394264,
-0.0338306725025177,
0.056287892162799835,
0.02725120633840561,
0.036108508706092834,
0.006809472106397152,
0.225591242313385,
0.08225970715284348,
-0.14400111138820648,
0.06293489038944244,
0.061854131519794464,
-0.024091603234410286,
-0.0019797165878117085,
-0.07950647920370102,
-0.09565187990665436,
0.09928188472986221,
0.10749514400959015,
-0.11970547586679459,
0.03816293925046921,
-0.07090175151824951,
-0.027590395882725716,
0.019099529832601547,
-0.15685106813907623,
0.004805666860193014,
0.04145605117082596,
-0.08040197938680649,
-0.04379289224743843,
0.10126877576112747,
-0.17642931640148163,
-0.14421169459819794,
-0.0656416118144989,
-0.07277112454175949,
-0.026118967682123184,
-0.1647949069738388,
-0.1528354436159134,
-0.006961041130125523,
-0.023982729762792587,
0.028662195429205894,
-0.11101541668176651,
-0.13020876049995422,
-0.036745015531778336,
0.012131058610975742,
0.02396625094115734,
0.004888972733169794,
-0.1015065610408783,
-0.019776593893766403,
-0.01715264283120632,
-0.03825787827372551,
-0.00022954080486670136,
-0.04483307898044586,
0.12926045060157776,
0.12046817690134048,
0.04429112374782562,
-0.014348359778523445,
0.05602992698550224,
-0.094426728785038,
0.06385433673858643,
-0.11022932827472687,
0.10601235181093216,
-0.031113438308238983,
0.08218470960855484,
-0.03674875944852829,
-0.10692229866981506,
0.05607287958264351,
0.05965743586421013,
0.07736247032880783,
0.06529591977596283,
-0.11609767377376556,
-0.05006967484951019,
0.19009003043174744,
-0.10261054337024689,
-0.1521109640598297,
0.10489042103290558,
-0.035293277353048325,
0.07163979858160019,
0.1012735590338707,
0.13818740844726562,
0.15365993976593018,
-0.04179538041353226,
0.0004562644462566823,
0.05637502297759056,
0.039796411991119385,
-0.12978728115558624,
0.08031119406223297,
0.04755151644349098,
-0.08992290496826172,
0.05515477806329727,
-0.01267822552472353,
0.13535919785499573,
-0.01998167484998703,
-0.021631786599755287,
-0.04810487478971481,
-0.08425208926200867,
0.025259723886847496,
0.02519049122929573,
0.060279157012701035,
-0.08277425915002823,
-0.08632448315620422,
0.0883619412779808,
0.16181647777557373,
-0.1353309005498886,
-0.0016389330849051476,
-0.09294536709785461,
0.05666099116206169,
-0.08861381560564041,
0.028250422328710556,
-0.16669994592666626,
0.005418038927018642,
0.06448900699615479,
-0.01307862251996994,
0.06864210218191147,
0.10508313030004501,
0.027113253250718117,
0.03461070731282234,
0.0017727244412526488,
-0.03222808986902237,
-0.12678542733192444,
-0.06737647205591202,
-0.056014612317085266,
-0.06370565295219421,
-0.09010210633277893,
-0.06153050810098648,
0.0073722414672374725,
-0.19213570654392242,
0.013347542844712734,
-0.006886119954288006,
-0.013029877096414566,
0.025772536173462868,
-0.010146165266633034,
0.023824600502848625,
0.0714775025844574,
-0.0652642548084259,
-0.03955911099910736,
0.03853869065642357,
0.022090621292591095,
-0.056190215051174164,
-0.08244403451681137,
-0.09484321624040604,
-0.0020215981639921665,
0.12314824014902115,
0.02204945683479309,
-0.0831746980547905,
0.02001318894326687,
-0.01823481172323227,
-0.03404780104756355,
0.017806222662329674,
-0.0768757089972496,
0.16348208487033844,
-0.009455355815589428,
0.2027481496334076,
-0.15818354487419128,
-0.035795487463474274,
-0.02740773744881153,
0.019634634256362915,
0.06163959950208664,
0.13591225445270538,
-0.007002150174230337,
-0.08079134672880173,
0.053572800010442734,
0.038152117282152176,
-0.09649613499641418,
0.24936343729496002,
-0.054070282727479935,
-0.08827867358922958,
0.03499174490571022,
0.10446158796548843,
-0.0165790393948555,
0.15426005423069,
-0.22319500148296356,
-0.04570399224758148,
0.00019756221445277333,
-0.008193719200789928,
0.06314413249492645,
-0.12354651838541031,
0.008132217451930046,
0.010924935340881348,
-0.06972309947013855,
-0.09228164702653885,
-0.0025134424213320017,
-0.008882022462785244,
0.04137415066361427,
-0.004845933523029089,
-0.04357552155852318,
0.020577218383550644,
-0.037731070071458817,
-0.12477194517850876,
0.22139036655426025,
-0.08292976766824722,
-0.1887095719575882,
-0.18550588190555573,
0.1207662895321846,
-0.06701904535293579,
-0.015874704346060753,
0.03322472423315048,
-0.09127990156412125,
-0.02958216704428196,
-0.055398374795913696,
0.18315498530864716,
-0.06784729659557343,
-0.006661545485258102,
-0.018614942207932472,
0.06978126615285873,
0.01063218992203474,
-0.20531877875328064,
0.027095433324575424,
-0.01656223274767399,
-0.03139941021800041,
-0.005552017595618963,
-0.11693745106458664,
0.11237192153930664,
0.1789647340774536,
-0.07251034677028656,
0.025428635999560356,
-0.006708805914968252,
0.1938963234424591,
-0.04904190078377724,
-0.052367426455020905,
0.1546265035867691,
-0.006608010269701481,
-0.009385163895785809,
0.00370929017663002,
-0.017869403585791588,
-0.08879853785037994,
0.0704866573214531,
-0.018253136426210403,
-0.033385682851076126,
-0.2659692168235779,
-0.015811167657375336,
-0.07858934998512268,
0.04265747591853142,
0.0420585460960865,
0.026870466768741608,
-0.10716842114925385,
0.028437426313757896,
0.046563632786273956,
0.1368676871061325,
-0.0014205375919118524,
0.05092587694525719,
0.05155026167631149,
0.01628870889544487,
0.01421431452035904,
-0.10075746476650238,
-0.002099382458254695,
0.05771641805768013,
0.08109522610902786,
0.26910415291786194,
-0.0990438461303711,
0.18644723296165466,
0.04392090439796448,
0.04095964506268501,
0.034669388085603714,
0.1289745271205902,
-0.10722048580646515,
0.031983669847249985,
0.014841971918940544,
-0.007698063272982836,
-0.11883717775344849,
0.00800609216094017,
-0.04165351763367653,
0.09317725896835327,
-0.13471388816833496,
-0.032989129424095154,
0.021916884928941727,
0.12648653984069824,
0.06285908818244934,
-0.2437068372964859,
-0.13444969058036804,
0.013540648855268955,
-0.08778204023838043,
-0.09629553556442261,
0.07095895707607269,
0.21782144904136658,
-0.0740303248167038,
-0.024946514517068863,
-0.005417310167104006,
0.12966568768024445,
-0.020360879600048065,
-0.02468407154083252,
-0.02435251697897911,
0.06223132833838463,
0.01458492036908865,
0.1291920393705368,
-0.2928767204284668,
0.12811720371246338,
-0.0030768008437007666,
0.0824253261089325,
-0.030003946274518967,
0.04518540948629379,
-0.016658784821629524,
0.06933858245611191,
0.04036586731672287,
-0.015145395882427692,
0.050455473363399506,
-0.16842065751552582,
0.021557023748755455,
0.046885088086128235,
0.032135915011167526,
0.054180942475795746,
0.07861083000898361,
-0.0016238169046118855,
0.04699019715189934,
-0.016798831522464752,
-0.13464781641960144,
-0.09342177957296371,
-0.06156936660408974,
-0.04063427075743675,
-0.04104142263531685,
-0.018439283594489098,
-0.0350380539894104,
-0.00578983873128891,
0.06690367311239243,
0.1763635128736496,
-0.08020681142807007,
-0.08218248933553696,
-0.06193864718079567,
0.06499606370925903,
0.08701631426811218,
-0.10035423189401627,
0.03839351236820221,
-0.007865078747272491,
0.026520829647779465,
-0.007246487308293581,
-0.09096645563840866,
0.057830363512039185,
-0.04247690364718437,
-0.05896738916635513,
-0.008424359373748302,
0.08504018932580948,
0.003478851169347763,
0.035243865102529526,
0.019949166104197502,
-0.10028737783432007,
-0.03943340480327606,
-0.12018530070781708,
-0.11060486733913422,
-0.05911625549197197,
0.00420958548784256,
0.05048465356230736,
-0.12650570273399353,
-0.07146642357110977,
-0.01607026532292366,
-0.00861887913197279,
0.135764941573143,
0.1545802801847458,
-0.05950772389769554,
-0.007337374612689018,
0.1305249184370041,
-0.05226367339491844,
-0.19320307672023773,
0.04832984507083893,
0.04885024204850197,
0.1171865239739418,
-0.05148344114422798,
-0.16860422492027283,
0.04640788212418556,
-0.012318133376538754,
0.03856029734015465,
0.06505318731069565,
-0.3232991397380829,
-0.12088806182146072,
0.09011055529117584,
0.14572110772132874,
0.13492204248905182,
-0.1266801506280899,
-0.033143386244773865,
-0.06291931867599487,
-0.12914137542247772,
0.07864903658628464,
-0.08790373057126999,
0.12637124955654144,
-0.0665893629193306,
0.022846614941954613,
0.03614630550146103,
-0.04421315714716911,
0.06202255189418793,
0.045651037245988846,
0.1159229427576065,
-0.03093440644443035,
-0.0010095685720443726,
0.14228208363056183,
-0.042185328900814056,
0.17624305188655853,
-0.14290353655815125,
0.09564562886953354,
-0.22580526769161224,
-0.054360151290893555,
-0.08720902353525162,
0.021504245698451996,
-0.029496703296899796,
-0.03320353478193283,
-0.08235669881105423,
0.017067501321434975,
-0.011206474155187607,
0.007516119163483381,
0.033074647188186646,
-0.037669502198696136,
-0.022663597017526627,
0.09084876626729965,
0.13453352451324463,
-0.009860444813966751,
-0.06399860978126526,
0.06480695307254791,
0.045777577906847,
0.10994981974363327,
-0.20295198261737823,
0.027723422273993492,
0.11107338219881058,
0.027357405051589012,
0.1153840646147728,
0.047670911997556686,
-0.10571104288101196,
0.06307993829250336,
0.08736278861761093,
-0.05995745211839676,
-0.06472238153219223,
-0.04078705981373787,
-0.11862171441316605,
-0.08767519146203995,
0.05219991132616997,
0.09800632297992706,
-0.029938092455267906,
-0.014202655293047428,
-0.03345847129821777,
-0.03208935633301735,
-0.11579287052154541,
0.17757469415664673,
0.0818248838186264,
0.07695034146308899,
-0.061974652111530304,
0.05047115683555603,
0.06315944343805313,
-0.07963351160287857,
0.0074191405437886715,
0.1667596995830536,
-0.10026445984840393,
-0.048022493720054626,
0.05952255055308342,
0.23629339039325714,
-0.03678088262677193,
-0.05541292205452919,
-0.1412465125322342,
-0.07296818494796753,
0.01947101578116417,
0.16411994397640228,
0.10364610701799393,
0.08269026130437851,
-0.02084093540906906,
0.006845054216682911,
-0.11390924453735352,
0.09191231429576874,
0.07938350737094879,
0.03160862252116203,
-0.09822966903448105,
0.14298246800899506,
0.0464220829308033,
0.12644335627555847,
-0.027041902765631676,
-0.015032483264803886,
-0.13660326600074768,
0.07591333985328674,
-0.09740214049816132,
0.029208248481154442,
-0.015522263944149017,
0.05147143825888634,
-0.03253206983208656,
0.0035006441175937653,
-0.04369167611002922,
0.061504580080509186,
-0.0840386226773262,
0.0016027223318815231,
0.017478106543421745,
0.05363403633236885,
-0.060195501893758774,
-0.012289593927562237,
0.021221483126282692,
-0.09822484105825424,
0.12458527833223343,
-0.024835100397467613,
-0.022761208936572075,
0.09763901680707932,
-0.0689445436000824,
0.023251943290233612,
0.017293566837906837,
0.046987999230623245,
0.008400127291679382,
0.034471213817596436,
0.09126650542020798,
0.04004345089197159,
0.06296790391206741,
0.022787323221564293,
0.10793957114219666,
-0.1261528581380844,
-0.06709152460098267,
-0.04817035421729088,
-0.10736341029405594,
-0.06252828985452652,
0.10984679311513901,
0.03036845289170742,
0.09892617911100388,
0.10704474151134491,
-0.03871320188045502,
0.01857263408601284,
-0.15243205428123474,
-0.06467214226722717,
0.027577871456742287,
-0.018628805875778198,
-0.09527571499347687,
-0.05974164977669716,
0.05259518325328827,
-0.03324604034423828,
0.10583499819040298,
0.02179311029613018,
0.06472360342741013,
-0.02580428309738636,
-0.038373108953237534,
0.008651436306536198,
0.011010763235390186,
0.18885841965675354,
-0.08192905783653259,
0.04259556531906128,
0.0035103086847811937,
0.018708249554038048,
0.03117658570408821,
0.11751412600278854,
0.14898261427879333,
0.14776021242141724,
-0.035383760929107666,
0.1197451651096344,
-0.0005285193328745663,
-0.0009480727603659034,
-0.08471571654081345,
0.00022761784202884883,
0.021574947983026505,
0.05678960680961609,
-0.0411040335893631,
0.1746189147233963,
0.11092688143253326,
-0.1054486408829689,
0.08881901949644089,
0.02670125849545002,
-0.12852537631988525,
-0.04729168489575386,
0.026864847168326378,
-0.03835511580109596,
-0.1495281308889389,
0.03772430494427681,
-0.11396334320306778,
-0.03568796068429947,
0.03928044065833092,
0.03882468864321709,
-0.08253129571676254,
0.19087044894695282,
0.0407010093331337,
-0.07111120223999023,
0.0675765797495842,
-0.016533933579921722,
0.024378392845392227,
0.04106365144252777,
0.03464841842651367,
0.03660275414586067,
-0.056931640952825546,
0.04558020085096359,
0.028408817946910858,
-0.04695456847548485,
-0.0015878853155300021,
-0.025608230382204056,
-0.013965466991066933,
-0.02515771985054016,
0.04119785502552986,
0.06327614933252335,
0.1633576601743698,
0.037743523716926575,
-0.06944802403450012,
-0.021567996591329575,
0.14756998419761658,
-0.024811653420329094,
-0.08801209181547165,
-0.12499767541885376,
0.13233397901058197,
0.048585906624794006,
0.005508482921868563,
0.011081112548708916,
-0.09466041624546051,
-0.03363196924328804,
0.21217943727970123,
0.06429562717676163,
-0.025865156203508377,
-0.01892983354628086,
-0.008798958733677864,
-0.0019045652588829398,
-0.031349051743745804,
0.20893773436546326,
0.019360851496458054,
0.2181035280227661,
0.01952037774026394,
-0.03321487829089165,
-0.07374273240566254,
-0.03754739090800285,
0.024739960208535194,
0.123537078499794,
-0.03068769909441471,
-0.03387731313705444,
-0.08147328346967697,
0.01381647028028965,
-0.010178755968809128,
-0.06332311034202576,
0.09778327494859695,
-0.13295821845531464,
-0.08060488849878311,
-0.0382763110101223,
0.03689340502023697,
-0.0368628203868866,
0.03894350305199623,
-0.03450493514537811,
0.029901213943958282,
0.06491336971521378,
-0.037379149347543716,
-0.12162192165851593,
-0.15714509785175323,
0.08251345157623291,
-0.0650782436132431,
0.13608959317207336,
-0.01933707855641842,
0.16199496388435364,
0.09478443115949631,
0.03758317977190018,
-0.049843598157167435,
0.11373413354158401,
0.03833136707544327,
0.07205750048160553,
0.066661536693573,
0.10597622394561768,
-0.043836839497089386,
0.14435315132141113,
-0.04380924999713898,
-0.010950183495879173,
-0.010484748519957066,
-0.07950295507907867,
-0.026438608765602112,
-0.18669122457504272,
-0.017563730478286743,
-0.09650919586420059,
0.09366040676832199,
0.18323931097984314,
-0.046260908246040344,
-0.02712177485227585,
-0.08205506950616837,
0.10065959393978119,
0.00244780327193439,
0.07382234185934067,
-0.04854549467563629,
-0.16694432497024536,
-0.0005644126795232296,
0.006972852163016796,
0.000764348660595715,
-0.2715403139591217,
0.0027653612196445465,
-0.051545530557632446,
-0.028230799362063408,
-0.094095878303051,
0.16312876343727112,
0.07078108191490173,
0.041460875421762466,
-0.0376589298248291,
-0.13377174735069275,
-0.030491897836327553,
0.0763767883181572,
-0.16219405829906464,
-0.14732937514781952
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask_finetune", skip_special_tokens=True),
device=0
)
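# A javascript function with tokens separated by single spaces, matching the
# whitespace-tokenized format this checkpoint works best with.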
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/javascript/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
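The original fine-tuning ran on the TPU/T5 training stack; purely as a hedged single-GPU sketch with the Hugging Face `Seq2SeqTrainer` (placeholder hyperparameters and a `train_dataset` you must build yourself — not the recipe used here), it might look like:

```python
from transformers import (AutoTokenizer, T5ForConditionalGeneration,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

# Placeholder sketch only -- NOT the original TPU recipe reported above.
name = "SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-js-finetune",
    max_steps=2_500,                # matches the step count reported above
    per_device_train_batch_size=8,  # assumed; the card reports an effective batch of 256
    learning_rate=1e-4,             # assumed placeholder
)

# train_dataset: a tokenized (javascript function, docstring) dataset prepared separately.
# trainer = Seq2SeqTrainer(model=model, args=args, train_dataset=train_dataset, tokenizer=tokenizer)
# trainer.train()
```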
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_javascript_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
88,
109
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10802479833364487,
0.05783552676439285,
-0.0015377799281850457,
0.10897736996412277,
0.05907665193080902,
0.021058032289147377,
0.021519044414162636,
0.10443426668643951,
-0.004974267911165953,
0.06694957613945007,
0.06203768402338028,
-0.08492853492498398,
0.04964873567223549,
0.19922637939453125,
0.031159285455942154,
-0.1483210027217865,
-0.053383491933345795,
0.018606441095471382,
-0.05335671454668045,
0.10699333995580673,
0.06363793462514877,
-0.0769127905368805,
0.0699901282787323,
-0.03642701730132103,
-0.12230813503265381,
0.044885411858558655,
-0.016161108389496803,
-0.018451256677508354,
0.09082786738872528,
0.053991418331861496,
0.10674197971820831,
-0.02096555195748806,
0.05963560566306114,
-0.2059621810913086,
0.0047501311637461185,
0.02970578894019127,
0.06431761384010315,
0.028143666684627533,
0.04434744268655777,
0.051878973841667175,
0.1219596266746521,
-0.01857270672917366,
0.03426465019583702,
0.0718158558011055,
-0.06561190634965897,
-0.0749322846531868,
-0.05536366626620293,
0.04012029618024826,
0.0929650068283081,
0.09646584093570709,
-0.010688979178667068,
-0.001330626429989934,
-0.0924324318766594,
0.08339845389127731,
0.1247795969247818,
-0.22881704568862915,
-0.019116494804620743,
0.09337152540683746,
0.07567707449197769,
0.030410325154662132,
-0.07925130426883698,
-0.031427882611751556,
0.10382086783647537,
0.04513085260987282,
0.062229570001363754,
-0.0990525409579277,
-0.020432882010936737,
-0.010175026021897793,
-0.04365304484963417,
-0.057100340723991394,
0.14851562678813934,
0.05477078631520271,
-0.056252334266901016,
-0.10994652658700943,
-0.05595027655363083,
-0.16568231582641602,
0.032253071665763855,
0.03182385861873627,
0.002762450836598873,
-0.0033529819920659065,
-0.0185571126639843,
-0.0246825460344553,
-0.09400125592947006,
-0.11147800087928772,
0.02650139294564724,
0.02565052919089794,
0.05000987648963928,
0.03544708341360092,
-0.033798862248659134,
0.09460937231779099,
0.004953675903379917,
-0.05825648084282875,
-0.02314453199505806,
0.012414713390171528,
-0.11053466796875,
0.017855875194072723,
0.003689913311973214,
-0.038529928773641586,
0.01300792582333088,
0.08011603355407715,
-0.08945896476507187,
0.08887981623411179,
0.09173931926488876,
0.01576082408428192,
0.01279863715171814,
0.22135087847709656,
0.07650609314441681,
-0.17172306776046753,
0.027141109108924866,
0.047019634395837784,
0.009805988520383835,
0.005702286027371883,
-0.0593906044960022,
-0.04834791645407677,
0.0038523725233972073,
0.07832479476928711,
-0.11532777547836304,
0.0071494062431156635,
-0.06030334159731865,
0.00349802034907043,
0.09522408992052078,
-0.12355025112628937,
0.04151468724012375,
0.029877159744501114,
-0.05806463956832886,
-0.028977591544389725,
0.0800139382481575,
-0.12893551588058472,
-0.11260143667459488,
0.014329087920486927,
-0.04233268275856972,
-0.023786436766386032,
-0.11791231483221054,
-0.10221706330776215,
0.002266307594254613,
-0.035314418375492096,
-0.001205156440846622,
-0.11157917231321335,
-0.08124095946550369,
-0.025014199316501617,
0.03018793836236,
0.005654614884406328,
-0.019300274550914764,
-0.06630730628967285,
0.00263533485122025,
0.0003871339140459895,
-0.028949322178959846,
0.019482241943478584,
-0.0342702679336071,
0.09935211390256882,
0.0939062237739563,
0.042228419333696365,
0.015970153734087944,
0.02308984100818634,
-0.10259189456701279,
0.08894047886133194,
-0.10635838657617569,
0.07131177186965942,
0.0010765352053567767,
0.06161816045641899,
-0.10182526707649231,
-0.08221174031496048,
-0.009773381985723972,
0.04645448178052902,
0.06603676825761795,
0.05565433204174042,
-0.1010928824543953,
-0.0016999697545543313,
0.15914417803287506,
-0.09564868360757828,
-0.13147105276584625,
0.11898208409547806,
-0.0029486147686839104,
0.012727665714919567,
0.0769505724310875,
0.1409306824207306,
0.16054873168468475,
-0.10067186504602432,
-0.03714670613408089,
0.09672385454177856,
0.05474899709224701,
-0.07004550844430923,
0.06542807817459106,
-0.004225379787385464,
0.010023337788879871,
0.029135875403881073,
0.06030267849564552,
0.08557714521884918,
-0.005385857541114092,
-0.02723885141313076,
-0.038361791521310806,
-0.08804305642843246,
-0.02319694310426712,
0.0017798281041905284,
0.008701224811375141,
-0.06327041983604431,
-0.07124975323677063,
-0.00260102073661983,
0.16373518109321594,
-0.10131815820932388,
0.02796981669962406,
-0.07860929518938065,
-0.026530113071203232,
-0.0835496112704277,
0.029972683638334274,
-0.12479443103075027,
0.006978604942560196,
0.057543519884347916,
-0.033314190804958344,
0.052480217069387436,
0.07745017111301422,
0.018416740000247955,
0.023335758596658707,
-0.052989132702350616,
-0.054308224469423294,
-0.05725227668881416,
-0.07886214554309845,
-0.09922315180301666,
-0.029390348121523857,
-0.09618185460567474,
-0.04104219004511833,
-0.030589988455176353,
-0.15744028985500336,
0.0019077828619629145,
-0.010035122744739056,
0.01385582983493805,
0.03306422382593155,
-0.03610244020819664,
0.03135763108730316,
0.051700614392757416,
-0.045189399272203445,
-0.08234579861164093,
0.023264039307832718,
0.03136806935071945,
-0.09733889997005463,
-0.0547032468020916,
-0.08907493203878403,
-0.05615725740790367,
0.07396427541971207,
0.08814521878957748,
-0.09551448374986649,
-0.01751544140279293,
-0.03018156811594963,
-0.03650955855846405,
-0.06093379110097885,
-0.07101033627986908,
0.17618761956691742,
0.018727634102106094,
0.16719137132167816,
-0.13356664776802063,
-0.05368378385901451,
-0.021417032927274704,
0.0005149621283635497,
0.025819608941674232,
0.147098571062088,
0.021443001925945282,
-0.08519705384969711,
0.030344130471348763,
-0.001993070589378476,
-0.045704614371061325,
0.1854342371225357,
-0.020302893593907356,
-0.07065781205892563,
0.00439984118565917,
0.11820841580629349,
-0.01899619586765766,
0.15807339549064636,
-0.08260463178157806,
-0.013380995020270348,
-0.018414590507745743,
0.017029309645295143,
0.03591761365532875,
-0.1259980946779251,
0.02372630499303341,
0.03366915509104729,
-0.07255720347166061,
-0.05763823539018631,
-0.020183231681585312,
-0.03653237968683243,
0.04336180165410042,
0.022196482867002487,
0.02165582962334156,
-0.01759525015950203,
-0.03309543803334236,
-0.11210890114307404,
0.18183821439743042,
-0.06538528203964233,
-0.18962256610393524,
-0.15867120027542114,
0.11139940470457077,
-0.029662078246474266,
-0.015618708916008472,
0.034982964396476746,
-0.1067623570561409,
-0.03945010527968407,
-0.0966508612036705,
0.1284387856721878,
-0.10731121152639389,
-0.0015757285291329026,
-0.01954967901110649,
0.06569747626781464,
0.04628762975335121,
-0.1629967838525772,
0.015404310077428818,
-0.01603025197982788,
-0.0037782485596835613,
-0.02841438539326191,
-0.06253542006015778,
0.0957472175359726,
0.1296728253364563,
-0.051859550178050995,
0.028768273070454597,
-0.010260161012411118,
0.16042090952396393,
-0.05384116247296333,
0.04358074814081192,
0.20526032149791718,
0.03810372203588486,
0.03201909363269806,
0.04047052934765816,
0.010496764443814754,
-0.08173119276762009,
0.06326854228973389,
0.05656748265028,
-0.048104070127010345,
-0.20887242257595062,
-0.017440257593989372,
-0.08346821367740631,
0.06472375243902206,
0.11652597039937973,
0.04251295328140259,
-0.18438288569450378,
0.012881549075245857,
-0.014285320416092873,
0.14363528788089752,
-0.02154420129954815,
0.05112844705581665,
0.016723044216632843,
0.014441585168242455,
0.003653408959507942,
-0.10310325026512146,
-0.003143679117783904,
0.06794154644012451,
0.10254921764135361,
0.21187636256217957,
-0.09238646179437637,
0.1657407432794571,
0.01314990408718586,
0.09058921784162521,
0.024303536862134933,
0.07562871277332306,
-0.1126910075545311,
0.009629691950976849,
0.008817524649202824,
-0.015802595764398575,
-0.07496652007102966,
0.04462572559714317,
-0.005441853776574135,
0.07099638134241104,
-0.0859035924077034,
0.027629749849438667,
0.03852257505059242,
0.19453473389148712,
0.08999763429164886,
-0.16892212629318237,
-0.1313932090997696,
0.023654427379369736,
-0.09412851184606552,
-0.09834858775138855,
0.0807361900806427,
0.22715169191360474,
-0.062165696173906326,
0.027474787086248398,
-0.013714338652789593,
0.12635229527950287,
-0.09447195380926132,
-0.016263289377093315,
0.035241130739450455,
0.06295708566904068,
0.009076620452105999,
0.11510823667049408,
-0.2539874017238617,
0.07280366867780685,
0.020135417580604553,
0.10717684775590897,
-0.015761809423565865,
0.05959733575582504,
-0.030764304101467133,
-0.005623938050121069,
0.07550179213285446,
0.0069281249307096004,
-0.04738973081111908,
-0.21049664914608002,
-0.03174147009849548,
0.03565065935254097,
0.057790838181972504,
-0.020121531561017036,
0.09369475394487381,
-0.01949019357562065,
0.03146371245384216,
-0.03602611646056175,
-0.15488110482692719,
-0.0760975182056427,
-0.12371081113815308,
-0.052023448050022125,
-0.0076741608791053295,
-0.0679716020822525,
-0.01906728930771351,
0.04461073502898216,
0.05804670974612236,
0.22149047255516052,
-0.15491031110286713,
-0.09170655161142349,
-0.0821034386754036,
0.07669202983379364,
0.11680763214826584,
-0.10399922728538513,
0.023604271933436394,
0.009880253113806248,
0.04364275559782982,
-0.04341109097003937,
-0.08083833754062653,
0.02986885793507099,
-0.05869102478027344,
-0.08825653791427612,
-0.03147279843688011,
0.13760460913181305,
-0.01585567183792591,
0.04367292299866676,
0.005887288600206375,
-0.08804831653833389,
-0.027067797258496284,
-0.13824228942394257,
-0.04645930603146553,
-0.010268107056617737,
0.025876015424728394,
-0.008847958408296108,
-0.09680993854999542,
0.07644333690404892,
0.0028222783003002405,
-0.06017269194126129,
0.07396699488162994,
0.1719031035900116,
-0.07369596511125565,
0.011787191964685917,
0.1022501066327095,
-0.04719170928001404,
-0.17191381752490997,
-0.02648886665701866,
0.03920649737119675,
0.07946415990591049,
-0.027855159714818,
-0.15731802582740784,
0.049319665879011154,
-0.00383642315864563,
0.014032607898116112,
0.03970872983336449,
-0.3052702248096466,
-0.11703227460384369,
-0.004649301990866661,
0.06422745436429977,
0.07218487560749054,
-0.10237114131450653,
-0.042621344327926636,
-0.05906497687101364,
-0.05664512887597084,
0.03949619084596634,
0.02955484949052334,
0.11272113025188446,
-0.03811343014240265,
0.03219658136367798,
0.04408961907029152,
-0.03434589132666588,
0.0533611997961998,
-0.006644133478403091,
0.10112261027097702,
-0.004789517726749182,
0.004678875673562288,
0.05262724310159683,
-0.07062229514122009,
0.18376904726028442,
-0.16729368269443512,
0.09691745042800903,
-0.16637037694454193,
-0.044417157769203186,
-0.04039573669433594,
0.011114158667623997,
-0.03040427900850773,
-0.043382227420806885,
-0.11059369146823883,
0.01716390810906887,
0.034882161766290665,
-0.014974783174693584,
0.04602282866835594,
-0.025245152413845062,
-0.051091842353343964,
0.07325614243745804,
0.08890940248966217,
-0.012444563210010529,
-0.12973925471305847,
0.03149353340268135,
0.015862174332141876,
0.08873176574707031,
-0.22316981852054596,
0.02630165033042431,
0.10914154350757599,
0.031604666262865067,
0.09672514349222183,
0.01436226349323988,
-0.08962613344192505,
0.056246623396873474,
0.06937222927808762,
-0.057633597403764725,
-0.10229610651731491,
-0.027843516319990158,
-0.07840287685394287,
-0.11323175579309464,
0.04449767246842384,
0.09386340528726578,
-0.036951709538698196,
-0.008949823677539825,
-0.01686696894466877,
0.010105278342962265,
-0.07477204501628876,
0.17366524040699005,
0.023127702996134758,
0.0735434889793396,
-0.05811965838074684,
0.07566006481647491,
0.07485783100128174,
-0.11103518307209015,
0.01956946961581707,
0.1558324545621872,
-0.0821760967373848,
-0.027874913066625595,
0.05167534947395325,
0.11710426211357117,
-0.004460704512894154,
-0.04374222084879875,
-0.10145489126443863,
-0.07904598116874695,
0.016948731616139412,
0.018616076558828354,
0.06278036534786224,
0.08163867890834808,
-0.03226451575756073,
0.012805125676095486,
-0.11522722989320755,
0.10112857073545456,
0.07732076197862625,
0.032130151987075806,
-0.13001295924186707,
0.1377057433128357,
0.045280132442712784,
0.09467276930809021,
0.00289473170414567,
0.019227076321840286,
-0.10557296127080917,
0.04419279098510742,
-0.0163886696100235,
0.02719920687377453,
-0.01732637919485569,
0.051838215440511703,
-0.04856681823730469,
0.03639070317149162,
-0.038098521530628204,
0.048928551375865936,
-0.04191359132528305,
-0.027410423383116722,
-0.031096886843442917,
0.03983225300908089,
-0.07312387973070145,
-0.012085073627531528,
0.005056499037891626,
-0.08928212523460388,
0.10404746979475021,
-0.05826586112380028,
-0.003202865831553936,
0.014271298423409462,
0.005272820591926575,
0.0459887720644474,
0.026713965460658073,
0.04394989460706711,
-0.014558066613972187,
0.016498049721121788,
0.044295720756053925,
0.01645406149327755,
0.00016015862638596445,
-0.008137963712215424,
0.07693228870630264,
-0.14283603429794312,
-0.07009228318929672,
-0.07512159645557404,
-0.0867164209485054,
-0.06959078460931778,
0.08372577279806137,
0.08500468730926514,
0.07581541687250137,
0.10043252259492874,
-0.04147478565573692,
0.016792630776762962,
-0.18400710821151733,
-0.044802211225032806,
0.05294891074299812,
-0.0017170453211292624,
-0.11944364756345749,
-0.04477924108505249,
0.06218940392136574,
-0.040547676384449005,
0.09926308691501617,
-0.009979662485420704,
0.08602381497621536,
-0.016662679612636566,
-0.05258061736822128,
-0.015225749462842941,
0.0008572535589337349,
0.13651564717292786,
-0.10568353533744812,
-0.0035323393531143665,
-0.007529895752668381,
0.0065111201256513596,
0.05224456638097763,
0.17296066880226135,
0.11943210661411285,
0.1079196110367775,
0.04887600988149643,
0.0973556861281395,
-0.056766897439956665,
-0.03122876212000847,
-0.155520498752594,
0.0734400674700737,
-0.020686659961938858,
0.04191560670733452,
-0.04515616595745087,
0.11936220526695251,
0.12049073725938797,
-0.1285398155450821,
0.08767517656087875,
0.013220250606536865,
-0.08621205389499664,
-0.051187384873628616,
-0.09234471619129181,
-0.04525800794363022,
-0.10753929615020752,
0.017724240198731422,
-0.09091920405626297,
0.00618381891399622,
0.059771690517663956,
0.021393222734332085,
-0.03138740360736847,
0.16275432705879211,
-0.0050642481073737144,
-0.0606621652841568,
0.04628158360719681,
0.034379709511995316,
0.03395741432905197,
0.12041810899972916,
0.01827407442033291,
0.06956570595502853,
-0.08562895655632019,
0.07140322774648666,
0.039304547011852264,
-0.025782037526369095,
0.01379796676337719,
0.005223241169005632,
-0.014641987159848213,
-0.057036858052015305,
0.009833795949816704,
0.07712595909833908,
0.1589604765176773,
0.05557538568973541,
-0.0485791340470314,
-0.045707304030656815,
0.18500779569149017,
-0.0545504093170166,
-0.03539058938622475,
-0.12723717093467712,
0.1587592363357544,
0.046588364988565445,
0.008169074542820454,
0.004677829798310995,
-0.08010956645011902,
-0.02175273932516575,
0.22216925024986267,
0.05307925492525101,
-0.03683328628540039,
-0.02603961154818535,
-0.025505607947707176,
-0.009962149895727634,
-0.02737034298479557,
0.1603628247976303,
-0.0017084983410313725,
0.2334602028131485,
0.011656765826046467,
-0.016926812008023262,
-0.036929547786712646,
-0.04427332431077957,
-0.004221007693558931,
0.2044205516576767,
-0.044435713440179825,
0.03849377855658531,
-0.10320143401622772,
-0.005497249308973551,
0.028040006756782532,
-0.09782146662473679,
0.1224132850766182,
-0.12815944850444794,
-0.06060153245925903,
0.026229454204440117,
0.056969307363033295,
-0.028384912759065628,
0.0632302314043045,
-0.03127434849739075,
0.054795511066913605,
0.04987470060586929,
-0.03752107918262482,
-0.11274164915084839,
-0.14736413955688477,
0.03922649472951889,
-0.016179943457245827,
0.1422259658575058,
0.014828084968030453,
0.07847272604703903,
0.0883442834019661,
0.018900997936725616,
-0.07211040705442429,
0.09261732548475266,
0.03833162412047386,
0.012855196371674538,
0.05431129038333893,
0.11319633573293686,
-0.03830413520336151,
0.17128360271453857,
0.018253076821565628,
-0.015067153610289097,
-0.018184833228588104,
-0.04110978543758392,
-0.014804390259087086,
-0.177187979221344,
0.004456283058971167,
-0.05895116925239563,
0.12748348712921143,
0.17892079055309296,
-0.04526463896036148,
-0.026032783091068268,
-0.040111083537340164,
0.0832800343632698,
-0.005441121757030487,
0.0998808816075325,
-0.014210261404514313,
-0.1517888456583023,
0.01335215289145708,
-0.0031443194020539522,
0.0020853711757808924,
-0.18397796154022217,
-0.05104131996631622,
-0.03834420442581177,
-0.021496383473277092,
-0.09344417601823807,
0.14682933688163757,
0.04986479878425598,
0.029203198850154877,
-0.035871025174856186,
-0.1783141791820526,
-0.0014124843291938305,
0.06560146808624268,
-0.13471664488315582,
-0.12033068388700485
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_javascript_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/javascript/large_model.ipynb).
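The pipeline returns one dictionary per input, with the generated documentation under the `summary_text` key. Here is a minimal sketch of reading the result, continuing the session from the snippet above (the exact wording of the output depends on the checkpoint):
```python
# `pipeline` and `tokenized_code` are defined in the snippet above.
result = pipeline([tokenized_code])
print(result[0]["summary_text"])  # one generated description per input function
```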
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
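As a reference for the schedule mentioned above, here is a minimal sketch of an inverse square root learning rate; the 10,000-step warmup is an assumption for illustration, since the card does not state the warmup length:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1 / sqrt(warmup_steps) during warmup, then decays as
    # 1 / sqrt(step); the schedule commonly paired with AdaFactor.
    return max(step, warmup_steps) ** -0.5
```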
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_javascript_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V3-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
87,
107
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V3-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12249606847763062,
0.021389996632933617,
-0.0010122028179466724,
0.09525052458047867,
0.04829844832420349,
0.021713461726903915,
0.045506224036216736,
0.0977255329489708,
-0.01427839882671833,
0.06651388108730316,
0.0398603230714798,
-0.0738399475812912,
0.05848955363035202,
0.20119233429431915,
0.026098068803548813,
-0.10790497809648514,
-0.03337797522544861,
0.039530545473098755,
-0.07993253320455551,
0.11118492484092712,
0.06448627263307571,
-0.08788823336362839,
0.0767061859369278,
-0.026433123275637627,
-0.14687161147594452,
0.025713833048939705,
-0.01779952272772789,
-0.012638558633625507,
0.09832228720188141,
0.06281764060258865,
0.12124001234769821,
-0.008054912090301514,
0.05843918398022652,
-0.19242608547210693,
0.005020422860980034,
0.023265235126018524,
0.0681973248720169,
0.04630795121192932,
0.04723142459988594,
0.07321692258119583,
0.08540754020214081,
-0.019101031124591827,
0.02542754076421261,
0.06424196809530258,
-0.06574870645999908,
-0.05700213834643364,
-0.08443097025156021,
0.06101207807660103,
0.08931641280651093,
0.10231988877058029,
-0.011566746048629284,
0.04821088910102844,
-0.09513210505247116,
0.08175381273031235,
0.12887801229953766,
-0.23882050812244415,
-0.024016866460442543,
0.09265024214982986,
0.08911854028701782,
0.0211331769824028,
-0.07608454674482346,
-0.03176700696349144,
0.11135382950305939,
0.03332866355776787,
0.04958462342619896,
-0.08141577988862991,
0.01166849210858345,
-0.008669804781675339,
-0.05078743398189545,
-0.04130557179450989,
0.18149742484092712,
0.06205703318119049,
-0.06602570414543152,
-0.09372127801179886,
-0.03249629959464073,
-0.17156845331192017,
0.03458655625581741,
0.014349608682096004,
-0.009443836286664009,
-0.012526045553386211,
-0.03470125421881676,
0.00969309825450182,
-0.0907636284828186,
-0.11004983633756638,
0.03123926930129528,
0.01987951621413231,
0.05719450116157532,
0.028158873319625854,
-0.06004000082612038,
0.09048450738191605,
0.06817549467086792,
-0.05855166167020798,
-0.0178123377263546,
0.008776926435530186,
-0.11896543949842453,
-0.013333193957805634,
-0.00591913890093565,
-0.05342122167348862,
0.002054112032055855,
0.08080928027629852,
-0.07329872250556946,
0.08370894938707352,
0.07836100459098816,
0.01871022768318653,
0.007227179128676653,
0.22197243571281433,
0.0717831701040268,
-0.1700236052274704,
0.032178692519664764,
0.03461843356490135,
0.003755418583750725,
0.01485785935074091,
-0.056255683302879333,
-0.04411127418279648,
0.026488834992051125,
0.07279547303915024,
-0.11840537190437317,
0.03294570744037628,
-0.06636177748441696,
-0.016069376841187477,
0.09525315463542938,
-0.1292116641998291,
0.027397409081459045,
0.0298675037920475,
-0.05976476892828941,
-0.03072984330356121,
0.09764626622200012,
-0.1412592977285385,
-0.10835479944944382,
0.001580005045980215,
-0.0493047870695591,
-0.029423527419567108,
-0.11302680522203445,
-0.10157831013202667,
0.002363696461543441,
-0.007805844768881798,
-0.0011199780274182558,
-0.10893400013446808,
-0.06620056927204132,
-0.021919474005699158,
0.03754349425435066,
0.010852592997252941,
-0.026097465306520462,
-0.04991128295660019,
-0.005258260760456324,
-0.0019410927779972553,
-0.025036543607711792,
0.008587263524532318,
-0.028079774230718613,
0.09822282195091248,
0.08903733640909195,
0.03910432010889053,
0.012154525145888329,
0.025605058297514915,
-0.09160027652978897,
0.08776617050170898,
-0.13546094298362732,
0.0745067298412323,
0.003982215188443661,
0.053055524826049805,
-0.10256712883710861,
-0.06779026240110397,
-0.005183704663068056,
0.045380350202322006,
0.08120796829462051,
0.05731076002120972,
-0.11965908110141754,
0.0053511313162744045,
0.1501438319683075,
-0.10120747238397598,
-0.15793000161647797,
0.11488698422908783,
-0.019171906635165215,
0.04284368082880974,
0.07586290687322617,
0.12610450387001038,
0.14461028575897217,
-0.10197421908378601,
-0.05009546875953674,
0.07106634974479675,
0.054020147770643234,
-0.061392102390527725,
0.04923216626048088,
0.015525386668741703,
-0.02020268701016903,
0.0118873817846179,
0.0555766262114048,
0.08528557419776917,
-0.014575674198567867,
-0.034969035536050797,
-0.02542903460562229,
-0.09441406279802322,
-0.036868415772914886,
-0.010698270983994007,
0.016884194687008858,
-0.05138831213116646,
-0.0733545795083046,
0.006691045127809048,
0.1642327904701233,
-0.10705909878015518,
0.02900187112390995,
-0.09860310703516006,
-0.03238869830965996,
-0.08088129013776779,
0.03094097413122654,
-0.12390337884426117,
-0.009252446703612804,
0.057090748101472855,
-0.04300180822610855,
0.05600884556770325,
0.06707014888525009,
0.0029938926454633474,
0.021329078823328018,
-0.04817476496100426,
-0.04575476795434952,
-0.053606726229190826,
-0.0747632384300232,
-0.10062415152788162,
-0.040442582219839096,
-0.10841745883226395,
-0.031183991581201553,
-0.023225177079439163,
-0.17659921944141388,
-0.002250517485663295,
0.005288447719067335,
0.02199218049645424,
0.031560108065605164,
-0.041931916028261185,
0.021246030926704407,
0.04593246430158615,
-0.05365968495607376,
-0.07866418361663818,
0.017201580107212067,
0.04709954932332039,
-0.09154848009347916,
-0.04163973778486252,
-0.09286391735076904,
-0.09187618643045425,
0.09124238789081573,
0.08328165113925934,
-0.12563742697238922,
-0.00983172282576561,
-0.0314740389585495,
-0.04608965292572975,
-0.04928826168179512,
-0.07063063234090805,
0.16381379961967468,
0.020676150918006897,
0.1530667543411255,
-0.13487239181995392,
-0.06465914100408554,
-0.024138152599334717,
0.017044175416231155,
0.036960482597351074,
0.133248969912529,
0.01617545075714588,
-0.09834247827529907,
0.02053232677280903,
-0.0118448780849576,
-0.0345173180103302,
0.16456659138202667,
-0.03345104306936264,
-0.057344455271959305,
0.0019243365386500955,
0.11546184122562408,
0.000918289297260344,
0.19926463067531586,
-0.07648442685604095,
-0.014016925357282162,
-0.015405391342937946,
0.007917110808193684,
0.036357682198286057,
-0.11832023411989212,
0.026625996455550194,
0.030171116814017296,
-0.0666416734457016,
-0.03453727439045906,
-0.020824585109949112,
-0.036584753543138504,
0.03743287920951843,
0.024943439289927483,
0.021593237295746803,
-0.0012191502610221505,
-0.033872753381729126,
-0.11578653007745743,
0.1745360940694809,
-0.05787752941250801,
-0.18572773039340973,
-0.15874095261096954,
0.09921547770500183,
-0.022264908999204636,
-0.017747141420841217,
0.02589278109371662,
-0.10006728023290634,
-0.043822597712278366,
-0.08085717260837555,
0.15621565282344818,
-0.088410384953022,
0.007999508641660213,
0.00425428431481123,
0.05115346983075142,
0.05727231130003929,
-0.16485118865966797,
0.025652296841144562,
-0.02354796603322029,
0.001757504534907639,
-0.02471163496375084,
-0.07410959899425507,
0.09040416777133942,
0.12753961980342865,
-0.05417444929480553,
0.024521740153431892,
-0.0021936858538538218,
0.16455265879631042,
-0.07179396599531174,
0.05617456138134003,
0.18025663495063782,
0.007137800566852093,
0.022595886141061783,
0.04097937420010567,
0.004051361698657274,
-0.09037788957357407,
0.0693487823009491,
0.04571739584207535,
-0.04482031241059303,
-0.21966780722141266,
-0.028029773384332657,
-0.07892756909132004,
0.05733419209718704,
0.10227922350168228,
0.03023749217391014,
-0.16171197593212128,
0.03041493520140648,
-0.01243415754288435,
0.16529187560081482,
-0.013777418993413448,
0.06057118996977806,
0.009165922179818153,
0.03522699698805809,
0.005866803228855133,
-0.10361072421073914,
0.0016902233473956585,
0.059092868119478226,
0.08600080758333206,
0.21589024364948273,
-0.09131821990013123,
0.16895383596420288,
0.018124226480722427,
0.12042748183012009,
0.03744563087821007,
0.09634900838136673,
-0.10479611158370972,
0.012823449447751045,
0.01290606614202261,
-0.02493247203528881,
-0.0693802684545517,
0.038173530250787735,
-0.04868697375059128,
0.08113203942775726,
-0.09140277653932571,
0.03630875423550606,
0.02919016405940056,
0.18596628308296204,
0.09663219749927521,
-0.18931730091571808,
-0.15186859667301178,
0.0029847468249499798,
-0.0830228328704834,
-0.09231267124414444,
0.06968938559293747,
0.21170088648796082,
-0.06375033408403397,
0.02434789203107357,
-0.01786436326801777,
0.12632279098033905,
-0.09243510663509369,
-0.010877148248255253,
0.056419726461172104,
0.06671290099620819,
0.0067446245811879635,
0.10345178842544556,
-0.24164406955242157,
0.07943113893270493,
0.02063017338514328,
0.09557440131902695,
-0.01822376623749733,
0.056341879069805145,
-0.024999091401696205,
-0.0026379921473562717,
0.07152603566646576,
0.008360377512872219,
-0.023599153384566307,
-0.20146654546260834,
-0.0403156504034996,
0.02958744578063488,
0.05608511343598366,
-0.015520536340773106,
0.09589418768882751,
-0.010811122134327888,
0.03775437921285629,
-0.026461949571967125,
-0.09898550063371658,
-0.09246379137039185,
-0.11490616947412491,
-0.05628625676035881,
-0.009901878423988819,
-0.045040328055620193,
-0.021083390340209007,
0.042853765189647675,
0.029418908059597015,
0.2463693916797638,
-0.12903092801570892,
-0.06594131141901016,
-0.07695255428552628,
0.0587446503341198,
0.11776623874902725,
-0.09934952110052109,
0.00800439715385437,
0.01959371194243431,
0.05791354551911354,
-0.04274696484208107,
-0.09013337641954422,
0.031603556126356125,
-0.06333857774734497,
-0.08510684221982956,
-0.03788353502750397,
0.12038914114236832,
0.011544618755578995,
0.040942464023828506,
0.02208182029426098,
-0.09494379907846451,
-0.016926107928156853,
-0.13083888590335846,
-0.0609482079744339,
-0.038031503558158875,
0.043101370334625244,
-0.009515682235360146,
-0.12311971932649612,
0.06415466964244843,
-0.018215173855423927,
-0.06151007488369942,
0.04742248356342316,
0.16478897631168365,
-0.0732545331120491,
0.010983649641275406,
0.0746249109506607,
-0.04533576965332031,
-0.18434269726276398,
-0.008371807634830475,
0.04741329699754715,
0.08082953095436096,
-0.024534933269023895,
-0.15293236076831818,
0.08227342367172241,
-0.022382380440831184,
0.0130622498691082,
0.01762206107378006,
-0.2421802431344986,
-0.1215205267071724,
0.011508246883749962,
0.05584929138422012,
0.04925426468253136,
-0.0908646211028099,
-0.046354446560144424,
-0.05346990376710892,
-0.0641460046172142,
0.07722585648298264,
0.04164619371294975,
0.10013089329004288,
-0.027911294251680374,
0.0313878171145916,
0.044766783714294434,
-0.030112747102975845,
0.047045618295669556,
0.019341392442584038,
0.10807180404663086,
-0.015565679408609867,
0.0033898178953677416,
0.059586670249700546,
-0.07264135032892227,
0.1776958554983139,
-0.1486741453409195,
0.09386280924081802,
-0.1644255816936493,
-0.026444626972079277,
-0.04584414139389992,
0.008852464146912098,
-0.03765431046485901,
-0.03457668423652649,
-0.13035748898983002,
0.03786024823784828,
0.045193035155534744,
-0.01321208756417036,
0.056107912212610245,
0.003963596187531948,
-0.04847736284136772,
0.056821275502443314,
0.09441269934177399,
-0.006941733881831169,
-0.12039625644683838,
0.03839688003063202,
0.01581376977264881,
0.09516049921512604,
-0.1724921613931656,
0.03314734622836113,
0.09921290725469589,
0.014294270426034927,
0.0909588560461998,
0.018766585737466812,
-0.10651563107967377,
0.03933192044496536,
0.0648108571767807,
-0.06779908388853073,
-0.053581781685352325,
-0.02933516725897789,
-0.05169972404837608,
-0.09925450384616852,
0.049085356295108795,
0.09776522219181061,
-0.034627318382263184,
-0.0013786597410216928,
-0.010503312572836876,
0.002439468167722225,
-0.08404406160116196,
0.16665104031562805,
0.014822907745838165,
0.08424300700426102,
-0.0551404245197773,
0.06942616403102875,
0.08519426733255386,
-0.10095011442899704,
0.02852647751569748,
0.13395968079566956,
-0.09606276452541351,
-0.016635380685329437,
0.08466620743274689,
0.14224542677402496,
-0.020008999854326248,
-0.04490484297275543,
-0.10053211450576782,
-0.08777840435504913,
0.010018086060881615,
0.056810978800058365,
0.07117465883493423,
0.09988103061914444,
-0.012094292789697647,
0.0068162851966917515,
-0.1356942504644394,
0.0957748219370842,
0.08732256293296814,
0.041903238743543625,
-0.12612226605415344,
0.15102829039096832,
0.03444124385714531,
0.09493046253919601,
-0.002647757064551115,
0.022820910438895226,
-0.11997606605291367,
0.038057196885347366,
-0.047673460096120834,
0.03759169951081276,
-0.015922747552394867,
0.04052566736936569,
-0.057370733469724655,
0.04677748307585716,
-0.03588604927062988,
0.04629397392272949,
-0.03998938575387001,
-0.024320941418409348,
-0.024192551150918007,
0.02839990332722664,
-0.06236904859542847,
-0.01029527373611927,
0.007049929816275835,
-0.10321586579084396,
0.10055084526538849,
-0.050938013941049576,
0.0032907072454690933,
0.0060247681103646755,
0.031039319932460785,
0.03322513774037361,
0.006741035263985395,
0.04593740403652191,
-0.0060571590438485146,
-0.009420621208846569,
0.026777740567922592,
0.02148645557463169,
-0.005628121551126242,
-0.0074888672679662704,
0.09819936007261276,
-0.12724219262599945,
-0.07359909266233444,
-0.08447045832872391,
-0.058711979538202286,
-0.06367240846157074,
0.08671291172504425,
0.07323853671550751,
0.07659363746643066,
0.0933116152882576,
-0.040733739733695984,
0.008650030940771103,
-0.1950002908706665,
-0.04654891416430473,
0.05126236379146576,
0.002822496695443988,
-0.11927390098571777,
-0.043007344007492065,
0.07085806131362915,
-0.04114767163991928,
0.08999449014663696,
-0.03034459985792637,
0.04616055265069008,
-0.01377030834555626,
-0.05312825366854668,
-0.05204092711210251,
0.005167593248188496,
0.14159637689590454,
-0.10920538753271103,
0.0074584633111953735,
-0.0018655764870345592,
0.005015167407691479,
0.039932090789079666,
0.15477538108825684,
0.14722709357738495,
0.11618363112211227,
0.024491719901561737,
0.09607836604118347,
-0.04168706759810448,
-0.03800524026155472,
-0.12182357162237167,
0.06755515933036804,
-0.0433785542845726,
0.033592406660318375,
-0.02510122023522854,
0.11950619518756866,
0.08067821711301804,
-0.1352938413619995,
0.09199404716491699,
-0.0023180521093308926,
-0.09317650645971298,
-0.04087980464100838,
-0.08595588058233261,
-0.033731408417224884,
-0.10709137469530106,
0.010199196636676788,
-0.09403562545776367,
-0.022353509441018105,
0.04968314245343208,
0.024298585951328278,
-0.032308775931596756,
0.17612627148628235,
-0.044933125376701355,
-0.05595216527581215,
0.0401502288877964,
0.039793916046619415,
0.006436560302972794,
0.10495594143867493,
0.026094181463122368,
0.06494578719139099,
-0.06049494817852974,
0.0730651468038559,
0.036016300320625305,
-0.0049352399073541164,
0.03162318840622902,
0.035765040665864944,
-0.02161497063934803,
-0.04434553533792496,
-0.005274009425193071,
0.09467397630214691,
0.13194167613983154,
0.04035342112183571,
-0.02989738993346691,
-0.05385364592075348,
0.1566709578037262,
-0.047902390360832214,
-0.03365306183695793,
-0.12560467422008514,
0.1520327627658844,
0.019255777820944786,
-0.0001101387242670171,
0.010784412734210491,
-0.07656347751617432,
-0.011588123627007008,
0.2490413784980774,
0.057690057903528214,
-0.04626084491610527,
-0.025486165657639503,
-0.016690362244844437,
-0.01106769684702158,
-0.04618794471025467,
0.15871979296207428,
0.0020334110595285892,
0.2288525104522705,
0.022595595568418503,
-0.025688068941235542,
-0.0428880900144577,
-0.049089960753917694,
0.022061504423618317,
0.1897554099559784,
-0.04730924218893051,
0.030398188158869743,
-0.10258538275957108,
-0.005715590436011553,
0.010022461414337158,
-0.14694306254386902,
0.13301357626914978,
-0.13504678010940552,
-0.06478647142648697,
0.013178360648453236,
0.05191563069820404,
-0.0455850213766098,
0.06421182304620743,
-0.03126358985900879,
0.06327120959758759,
0.05870002508163452,
-0.03215474262833595,
-0.09559798985719681,
-0.13919299840927124,
0.0544840507209301,
-0.014366435818374157,
0.12937231361865997,
0.013961400836706161,
0.09932917356491089,
0.08268369734287262,
0.020929330959916115,
-0.07464088499546051,
0.06707446277141571,
0.03424841910600662,
0.013096895068883896,
0.04616772010922432,
0.1112966313958168,
-0.04527561366558075,
0.1704721748828888,
0.022070204839110374,
-0.026041626930236816,
-0.026327650994062424,
-0.053998082876205444,
-0.01781054213643074,
-0.1717349737882614,
-0.010520088486373425,
-0.06259670853614807,
0.1407759189605713,
0.1879984587430954,
-0.05502995103597641,
-0.016089195385575294,
-0.03777045011520386,
0.09049932658672333,
0.006717635318636894,
0.08354926854372025,
-0.008467134088277817,
-0.1725684404373169,
0.01508721336722374,
-0.04073596000671387,
0.0029068179428577423,
-0.18786218762397766,
-0.05781106278300285,
-0.034067459404468536,
-0.03969854116439819,
-0.10127467662096024,
0.14800840616226196,
0.06000714749097824,
0.0372631773352623,
-0.046869684010744095,
-0.0961122140288353,
-0.010452907532453537,
0.057446353137493134,
-0.12862615287303925,
-0.13020646572113037
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
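As a rough illustration of what "tokenized" means here, the sketch below pads punctuation with spaces so each symbol becomes its own whitespace-separated token, approximating the spacing in the example that follows. This naive regex is only an illustration, not the preprocessing the authors used, and multi-character operators such as `->` or `::` would need extra handling:
```python
import re

def naive_tokenize(code: str) -> str:
    # Pad common PHP punctuation with spaces, then collapse repeated
    # whitespace into a single space-separated token stream.
    spaced = re.sub(r"([(){}\[\];,=<>!&|$.:-])", r" \1 ", code)
    return " ".join(spaced.split())

print(naive_tokenize("if(!is_array($table)){$table=json_decode($table,true);}"))
```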
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/php/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
50,
61,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12895157933235168,
-0.02787751704454422,
0.0016143502434715629,
0.11706048250198364,
0.11589261889457703,
0.02410510927438736,
0.09247796982526779,
0.07239913195371628,
-0.019304290413856506,
0.024785712361335754,
0.06641390174627304,
-0.006358504761010408,
0.029475461691617966,
0.1332578957080841,
-0.005022847559303045,
-0.13454598188400269,
-0.003927203360944986,
0.0860658511519432,
-0.1936059445142746,
0.0984513983130455,
0.08766625076532364,
-0.10612398386001587,
0.10613882541656494,
-0.002083360916003585,
-0.17830759286880493,
0.04394705593585968,
-0.05607057362794876,
-0.058452058583498,
0.09866108745336533,
0.056825604289770126,
0.14160503447055817,
0.0048618009313941,
0.057711921632289886,
-0.10244716703891754,
0.014712448231875896,
0.017828278243541718,
0.040345609188079834,
0.052293289452791214,
0.04044336453080177,
0.11109521985054016,
0.07802429050207138,
-0.04047805815935135,
0.04174336791038513,
0.03418470546603203,
-0.07760217040777206,
-0.032826103270053864,
-0.03520556539297104,
0.09511426836252213,
0.10658437013626099,
0.13563762605190277,
0.020161068066954613,
0.06844330579042435,
-0.07156068086624146,
0.06277991086244583,
0.0844835489988327,
-0.28162193298339844,
-0.030570585280656815,
0.07779545336961746,
0.05843218415975571,
-0.0015771541511639953,
-0.06021280214190483,
-0.01349510159343481,
0.07520681619644165,
0.0418381467461586,
0.09854345768690109,
-0.08689599484205246,
-0.04527384042739868,
-0.01281722728163004,
-0.08948980271816254,
-0.027071384713053703,
0.2729063332080841,
0.04882040619850159,
-0.020522000268101692,
-0.042940933257341385,
-0.05000836402177811,
-0.09508257359266281,
0.02117897756397724,
-0.021340398117899895,
0.0046911281533539295,
-0.0012561676558107138,
-0.011612302623689175,
-0.03454400226473808,
-0.11730630695819855,
-0.08659379929304123,
-0.026988856494426727,
0.07521875947713852,
0.0710170790553093,
0.041895028203725815,
-0.05817825347185135,
0.05280407890677452,
0.07600875943899155,
-0.04604295641183853,
0.04171734303236008,
-0.0023845378309488297,
-0.05868664011359215,
-0.009990501217544079,
-0.04656162112951279,
-0.20773674547672272,
0.009559308178722858,
0.018289538100361824,
-0.11754333227872849,
0.06856496632099152,
0.16576291620731354,
0.06717494130134583,
-0.008939342573285103,
0.1935357302427292,
-0.009340841323137283,
-0.09669861942529678,
0.024140862748026848,
0.02346082031726837,
-0.03490016236901283,
-0.005940040107816458,
-0.12036608904600143,
-0.037336550652980804,
0.030271431431174278,
-0.01214575208723545,
-0.12785016000270844,
0.08658213168382645,
-0.01494314894080162,
-0.04149318486452103,
0.07886361330747604,
-0.08233936876058578,
0.025013212114572525,
-0.010910078883171082,
-0.08394214510917664,
-0.023986270651221275,
0.11728895455598831,
-0.0693272203207016,
-0.13271664083003998,
-0.015067171305418015,
-0.05370315536856651,
-0.018153861165046692,
-0.10404873639345169,
-0.11569861322641373,
-0.002735058544203639,
-0.01305599045008421,
0.014150048606097698,
-0.16618448495864868,
-0.0738477036356926,
-0.03777289763092995,
0.09010527282953262,
0.009621215052902699,
-0.05855067819356918,
-0.046103693544864655,
0.016117464751005173,
0.009734131395816803,
-0.02955053187906742,
0.030493151396512985,
-0.04575277864933014,
0.05516733229160309,
0.05048327147960663,
0.06919196993112564,
-0.05112113803625107,
0.05445275083184242,
-0.07296525686979294,
0.0293488297611475,
-0.1486785113811493,
0.05795138701796532,
0.038883037865161896,
0.11492962390184402,
-0.10833817720413208,
-0.09662482887506485,
-0.045575860887765884,
0.059291642159223557,
0.08406243473291397,
0.0445193387567997,
-0.04342564940452576,
0.005558696109801531,
0.0492444708943367,
-0.08903578668832779,
-0.18837259709835052,
0.09743969142436981,
-0.02081099897623062,
0.1093141958117485,
0.04187075048685074,
0.1631333976984024,
0.12520788609981537,
-0.08424109220504761,
0.011990655213594437,
0.06565957516431808,
-0.02831367403268814,
-0.18803006410598755,
0.06238900125026703,
0.07888653874397278,
-0.09653901308774948,
-0.0006307660369202495,
0.04219338670372963,
0.11994750052690506,
-0.031590815633535385,
-0.039473749697208405,
-0.02370111458003521,
-0.1075398325920105,
-0.025465482845902443,
-0.005693973507732153,
0.10358118265867233,
-0.03283125162124634,
-0.06754288822412491,
0.039769526571035385,
0.08780055493116379,
-0.08457808196544647,
0.032857127487659454,
-0.08599820733070374,
-0.04508955404162407,
-0.10949453711509705,
0.016606666147708893,
-0.14034917950630188,
0.010846338234841824,
0.001592113170772791,
0.022039206698536873,
0.06385820358991623,
0.12482218444347382,
0.035963185131549835,
0.013659851625561714,
-0.02322179637849331,
-0.03295329958200455,
-0.022884635254740715,
-0.025408312678337097,
-0.12082752585411072,
-0.022484512999653816,
-0.06154555082321167,
-0.019307440146803856,
-0.04107888415455818,
-0.14809773862361908,
0.02478005737066269,
-0.07968274503946304,
0.00850842334330082,
0.003482673317193985,
0.018398672342300415,
0.03487032651901245,
0.06647325307130814,
-0.0347445011138916,
-0.06330530345439911,
0.07070969045162201,
0.07105781883001328,
-0.08254669606685638,
0.048039983958005905,
-0.07843515276908875,
0.022082775831222534,
0.09454785287380219,
-0.09144473075866699,
-0.10201133787631989,
-0.004112658556550741,
-0.0364132858812809,
-0.05789440870285034,
0.009514473378658295,
-0.01415734551846981,
0.2524799108505249,
-0.01902083307504654,
0.1715286374092102,
-0.09194445610046387,
-0.03407294675707817,
-0.009754220955073833,
-0.014418027363717556,
0.04807456210255623,
0.10712263733148575,
0.0623919740319252,
-0.14344410598278046,
0.041968412697315216,
0.025471890345215797,
-0.05961891636252403,
0.08384441584348679,
-0.020540891215205193,
-0.041849952191114426,
0.02868707664310932,
0.053259383887052536,
-0.005027837585657835,
0.1346554011106491,
-0.1297147572040558,
-0.033973097801208496,
-0.003717780578881502,
0.02737518586218357,
0.05565125122666359,
-0.15226943790912628,
0.02428440749645233,
0.04836728423833847,
-0.012456605210900307,
-0.02963804081082344,
-0.009585759602487087,
-0.042234256863594055,
0.03909621015191078,
0.0531722828745842,
-0.01616162247955799,
0.019595744088292122,
0.01828608848154545,
-0.08465898036956787,
0.18380975723266602,
-0.03310323506593704,
-0.24350520968437195,
-0.12181752175092697,
0.06920313835144043,
-0.0011145953321829438,
-0.0064241099171340466,
0.06767124682664871,
-0.09340915083885193,
-0.055745579302310944,
-0.07814516127109528,
0.10547591000795364,
-0.09505365788936615,
0.0359041653573513,
0.032490503042936325,
0.0353078730404377,
0.07597845047712326,
-0.12570582330226898,
0.012606015428900719,
-0.02340119145810604,
0.02800101228058338,
0.0017289023380726576,
-0.07055464386940002,
0.10131551325321198,
0.14522476494312286,
-0.0765850618481636,
0.03952283039689064,
-0.02039012685418129,
0.16139726340770721,
-0.04880616441369057,
0.012769398279488087,
0.17835023999214172,
0.007544779218733311,
0.015013844706118107,
0.042789675295352936,
0.01828951947391033,
-0.08021237701177597,
0.05779729411005974,
0.03673158958554268,
-0.0503007248044014,
-0.19767515361309052,
-0.05929502472281456,
-0.07741108536720276,
-0.015055674128234386,
0.15380117297172546,
0.07341035455465317,
-0.0806409940123558,
0.06012003496289253,
0.03957536071538925,
0.1626511961221695,
-0.053434498608112335,
0.04139493778347969,
0.07414964586496353,
0.04691467806696892,
0.009761812165379524,
-0.10612189769744873,
-0.05240205302834511,
0.05239277705550194,
0.09814821183681488,
0.20697160065174103,
-0.07551161199808121,
0.10898967832326889,
0.03173748031258583,
0.052320390939712524,
0.02117720991373062,
0.13722600042819977,
-0.09066611528396606,
-0.007262549363076687,
-0.008627153001725674,
0.017639273777604103,
-0.041454751044511795,
0.0422469861805439,
-0.058840248733758926,
0.02589583583176136,
-0.1283627301454544,
0.03413769230246544,
0.036963921040296555,
0.18999524414539337,
0.037737827748060226,
-0.2613094747066498,
-0.1296660304069519,
-0.04866085574030876,
-0.07987596094608307,
-0.07785902917385101,
0.06311041861772537,
0.18399178981781006,
-0.006686035543680191,
-0.004598850384354591,
-0.029128611087799072,
0.1551510989665985,
-0.09266743063926697,
-0.0013314266689121723,
0.06780049949884415,
0.04069770127534866,
0.01516010519117117,
0.09890294075012207,
-0.23080074787139893,
0.13125064969062805,
-0.003463353728875518,
0.07702624052762985,
-0.036881256848573685,
0.0005885206628590822,
-0.018553774803876877,
0.07218563556671143,
0.05639307200908661,
0.024056635797023773,
-0.06266055256128311,
-0.19616228342056274,
-0.06433630734682083,
0.028838306665420532,
-0.013221165165305138,
0.025946134701371193,
0.0757940486073494,
-0.01432312373071909,
0.04677115008234978,
0.010415052995085716,
-0.043283071368932724,
-0.08700267970561981,
-0.10259903967380524,
-0.045968204736709595,
0.06560268253087997,
-0.07168304920196533,
-0.01154184341430664,
-0.005642290227115154,
-0.003463710891082883,
0.16605165600776672,
-0.025081103667616844,
-0.06606753170490265,
-0.11127182096242905,
0.038397617638111115,
0.08100178092718124,
-0.06778325140476227,
0.03398444503545761,
0.03464686498045921,
0.0065269023180007935,
-0.005772794131189585,
-0.0811336413025856,
0.06797885149717331,
-0.05891270935535431,
0.0005890062311664224,
-0.009677384048700333,
0.04865340515971184,
-0.009816472418606281,
0.020303089171648026,
0.014358052983880043,
-0.07572892308235168,
-0.07695204019546509,
-0.11367811262607574,
-0.051360033452510834,
-0.11276889592409134,
0.09686349332332611,
-0.052955497056245804,
-0.08285012096166611,
0.16256721317768097,
0.009447788819670677,
-0.03267719969153404,
0.1412559449672699,
0.02754313126206398,
-0.05057404935359955,
-0.05330248177051544,
0.07424818724393845,
-0.03663546219468117,
-0.2375195026397705,
-0.036208558827638626,
0.048721104860305786,
0.0626235157251358,
-0.07614150643348694,
-0.13728587329387665,
0.09479106962680817,
-0.02228706143796444,
0.02284901589155197,
-0.0023908393923193216,
-0.2695856988430023,
-0.11743244528770447,
0.0492570698261261,
0.08539119362831116,
0.2065657079219818,
-0.10941293090581894,
0.00027884001610800624,
-0.041565630584955215,
-0.058799032121896744,
0.12611404061317444,
-0.0529007688164711,
0.12949392199516296,
-0.03362414985895157,
0.05438278242945671,
0.007490838412195444,
-0.03964865207672119,
0.055084165185689926,
0.023952670395374298,
0.0815388485789299,
-0.04327321797609329,
0.0023995465599000454,
0.04947249963879585,
-0.1016010046005249,
0.19395893812179565,
-0.1451576203107834,
0.05544128641486168,
-0.16966871917247772,
-0.06659197062253952,
-0.013937106356024742,
0.014102461747825146,
0.029684418812394142,
-0.03007395938038826,
-0.08333233743906021,
0.01705743744969368,
0.04374168440699577,
-0.016068032011389732,
0.0964205265045166,
0.026469528675079346,
-0.04514792189002037,
0.07064052671194077,
0.06874089688062668,
-0.08216489851474762,
-0.15156617760658264,
0.047437313944101334,
0.014346405863761902,
0.14206427335739136,
-0.2663978338241577,
0.035579487681388855,
0.1044517531991005,
-0.04168986156582832,
0.08728841692209244,
0.04186692833900452,
-0.030426735058426857,
0.015170512720942497,
0.07300636917352676,
-0.08089540153741837,
-0.0683353915810585,
-0.028290877118706703,
-0.034975565969944,
-0.04580974951386452,
0.04813804477453232,
0.09574525058269501,
-0.10966990143060684,
0.017935944721102715,
-0.010674911551177502,
-0.03901055455207825,
-0.11339624226093292,
0.18634147942066193,
0.0400586873292923,
0.05710410326719284,
-0.04792340099811554,
0.10194668173789978,
0.10076175630092621,
-0.114203542470932,
0.044637929648160934,
0.16029104590415955,
-0.10430125892162323,
-0.0833410695195198,
0.08931519836187363,
0.2089645117521286,
-0.028836704790592194,
-0.10362981259822845,
-0.13373582065105438,
-0.08774201571941376,
0.037244897335767746,
0.04319321736693382,
0.06352347135543823,
0.039652056992053986,
-0.04396883025765419,
0.003958798944950104,
-0.13819317519664764,
0.04878980666399002,
0.08545085042715073,
0.04636697471141815,
-0.1547413021326065,
0.10961393266916275,
0.052690088748931885,
0.11284957826137543,
-0.03803602606058121,
0.027881132438778877,
-0.13483071327209473,
0.04735850170254707,
-0.032015953212976456,
0.04235275089740753,
-0.01305242907255888,
0.02419252321124077,
-0.030512381345033646,
-0.00845771748572588,
-0.06103506311774254,
0.06829949468374252,
-0.034496650099754333,
-0.013972844928503036,
-0.010484467260539532,
0.03928441181778908,
-0.015986643731594086,
-0.045765362679958344,
-0.015250712633132935,
-0.04141810163855553,
0.07464566826820374,
-0.03372412919998169,
-0.06806603074073792,
-0.00935703981667757,
-0.038933128118515015,
0.010716850869357586,
0.05921252816915512,
0.05070831626653671,
0.03713947907090187,
-0.002586320275440812,
0.03395325690507889,
0.03236294165253639,
0.027788333594799042,
-0.02876465581357479,
0.11328219622373581,
-0.0851551815867424,
-0.08636155724525452,
-0.12713269889354706,
-0.06581444293260574,
-0.048699986189603806,
0.032119471579790115,
0.11683809757232666,
0.1163896769285202,
0.12938539683818817,
-0.06411794573068619,
-0.0023022564128041267,
-0.14287127554416656,
-0.011604176834225655,
0.07688962668180466,
-0.04484948143362999,
-0.061200547963380814,
-0.10274439305067062,
0.059532713145017624,
-0.027321206405758858,
0.14074614644050598,
0.01938214711844921,
0.022944990545511246,
-0.005507045425474644,
0.017534717917442322,
-0.05651724711060524,
-0.017114538699388504,
0.16066764295101166,
-0.0947382003068924,
0.012004995718598366,
-0.01276540756225586,
0.07137078791856766,
0.10542697459459305,
0.15315674245357513,
0.10807871073484421,
0.1265508532524109,
0.05146123096346855,
0.10865145176649094,
-0.03073299489915371,
-0.01769225113093853,
-0.13742853701114655,
0.06954912096261978,
-0.04502912610769272,
0.05399596691131592,
-0.057691894471645355,
0.0956542119383812,
0.11382721364498138,
-0.06638267636299133,
0.06446576118469238,
-0.0038053314201533794,
-0.08320064097642899,
-0.02001124434173107,
-0.050988681614398956,
-0.06662426143884659,
-0.15422210097312927,
-0.020416615530848503,
-0.059253890067338943,
-0.08258645236492157,
0.08633478730916977,
0.018244938924908638,
-0.018684227019548416,
0.2125380039215088,
0.01355562824755907,
-0.019674556329846382,
0.02509177103638649,
-0.005146968178451061,
0.021778224036097527,
0.011200337670743465,
-0.0005499729886651039,
0.034602098166942596,
-0.02348787896335125,
0.06744956970214844,
0.005080887582153082,
-0.03028898313641548,
0.05749033764004707,
0.02652614749968052,
-0.032473690807819366,
-0.05630522221326828,
0.016605153679847717,
0.05943647399544716,
0.09466090798377991,
0.014366340823471546,
-0.037022899836301804,
-0.042802345007658005,
0.15651562809944153,
-0.05107584223151207,
-0.057372063398361206,
-0.11780747026205063,
0.06912431120872498,
0.053139578551054,
-0.009734696708619595,
-0.004734812770038843,
-0.05149881914258003,
-0.06020939722657204,
0.26376667618751526,
0.12833397090435028,
-0.09183269739151001,
-0.02160823531448841,
-0.016445329412817955,
-0.005243778694421053,
-0.03373509272933006,
0.16349808871746063,
0.07625977694988251,
0.1341608613729477,
-0.014865118078887463,
-0.08762620389461517,
-0.08366783708333969,
-0.005147822201251984,
-0.034552592784166336,
0.10940942168235779,
0.039705097675323486,
0.03082367591559887,
-0.07150716334581375,
0.027698440477252007,
-0.007458881940692663,
-0.08979840576648712,
0.1229909211397171,
-0.1121787428855896,
-0.06889088451862335,
-0.004289449658244848,
0.054799679666757584,
-0.030567344278097153,
0.06878598034381866,
-0.0346294604241848,
0.07363957166671753,
-0.00043084006756544113,
-0.016280492767691612,
-0.09822763502597809,
-0.07214334607124329,
0.042232539504766464,
0.01330997608602047,
0.15677213668823242,
-0.006203449796885252,
0.07065930217504501,
0.07450143992900848,
0.03472743555903435,
-0.09217949211597443,
0.08212681114673615,
-0.01830301806330681,
0.022332647815346718,
0.05171693488955498,
0.002141528530046344,
-0.08095800131559372,
0.10737266391515732,
-0.0060535818338394165,
-0.11045999079942703,
-0.045448292046785355,
-0.061353035271167755,
0.008728791028261185,
-0.11469356715679169,
-0.030163392424583435,
-0.05494474247097969,
0.1281716525554657,
0.15318778157234192,
-0.025883955880999565,
-0.026425715535879135,
-0.04500610753893852,
0.06385351717472076,
0.00039512236253358424,
0.01627729833126068,
-0.05002114549279213,
-0.16427680850028992,
-0.010083219967782497,
-0.06961897015571594,
0.006127727683633566,
-0.17836280167102814,
-0.0414440892636776,
-0.08264508843421936,
-0.02391940727829933,
-0.07397248595952988,
0.09946847707033157,
0.09852239489555359,
0.02588210441172123,
-0.06287294626235962,
-0.08052235841751099,
-0.01990547403693199,
0.08522745966911316,
-0.13712000846862793,
-0.13479341566562653
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
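The pipeline above expects pre-tokenized input, as in the `tokenized_code` example. As a minimal sketch of what such tokenization can look like, the regex below space-separates a raw php snippet; this is a rough approximation for illustration, not the exact preprocessing used to build the training data:

```python
import re

def rough_php_tokenize(code: str) -> str:
    # Approximate tokenizer: split identifiers, numbers, and single
    # punctuation characters into space-separated tokens, mimicking the
    # style of the tokenized example above (string literals are not handled).
    tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_]*|\d+|\S", code)
    return " ".join(tokens)

raw_code = "public static function update($table) { $updater = new self($table); }"
print(rough_php_tokenize(raw_code))
# public static function update ( $ table ) { $ updater = new self ( $ table ) ; }
```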
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/php/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
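For reference, an AdaFactor configuration with the built-in inverse square root schedule can be set up via the `transformers` implementation as sketched below; treat the flags as an approximation of the pre-training setup rather than the exact configuration:

```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask_finetune"
)
# relative_step=True with warmup_init=True enables Adafactor's internal
# inverse square root learning rate schedule (lr must then be None).
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
```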
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 8000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
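As a rough, hypothetical sketch of a single fine-tuning step on a (tokenized php function, docstring) pair; the original TPU training loop is not reproduced here, and the example pair below is made up:

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

model_name = "SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

# One hypothetical training pair: tokenized php source -> target docstring.
inputs = tokenizer(
    "public static function update ( $ table ) { $ updater = new self ( $ table ) ; }",
    return_tensors="pt", truncation=True, max_length=512,
)
labels = tokenizer("Update the given table .", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # seq2seq cross-entropy loss
loss.backward()
```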
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_php_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 8000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
77
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11992999911308289,
0.04222533851861954,
-0.0005899046664126217,
0.11045823246240616,
0.08216582238674164,
0.023288389667868614,
0.02190401591360569,
0.10464490205049515,
-0.06261198967695236,
0.053923387080430984,
0.09555315971374512,
-0.08587992191314697,
0.02402283251285553,
0.1340024471282959,
0.04150029644370079,
-0.19877035915851593,
-0.024353601038455963,
0.06989000737667084,
-0.12285400927066803,
0.11245400458574295,
0.07059861719608307,
-0.10227516293525696,
0.07165269553661346,
-0.022032273933291435,
-0.10282523185014725,
0.038002029061317444,
-0.03752822428941727,
-0.012950563803315163,
0.09561967849731445,
0.03691311180591583,
0.10256576538085938,
-0.022605258971452713,
0.056365445256233215,
-0.1340123414993286,
0.006057476159185171,
0.06317584216594696,
0.05595376342535019,
0.05393067002296448,
0.07524115592241287,
0.11708971112966537,
0.05358656123280525,
-0.021768899634480476,
0.027894871309399605,
0.06542997062206268,
-0.05118970200419426,
-0.056944385170936584,
-0.05784950032830238,
0.08621115237474442,
0.1007634848356247,
0.11784664541482925,
0.010729190893471241,
-0.008986524306237698,
-0.07298947870731354,
0.05550696700811386,
0.09898698329925537,
-0.21447952091693878,
-0.039099615067243576,
0.047559674829244614,
0.0503089614212513,
0.01239523384720087,
-0.06686904281377792,
-0.039080120623111725,
0.06364931166172028,
0.054745715111494064,
0.08731891959905624,
-0.090709388256073,
-0.0039038383401930332,
-0.034018877893686295,
-0.08273709565401077,
-0.04250907152891159,
0.1871393620967865,
0.07889872789382935,
-0.026408914476633072,
-0.08996386080980301,
-0.0967966690659523,
-0.18036909401416779,
0.010844619944691658,
-0.005803321022540331,
0.022784458473324776,
0.006341324187815189,
-0.02023916505277157,
-0.04256182536482811,
-0.10566892474889755,
-0.12109862267971039,
-0.007080833427608013,
-0.02128690667450428,
0.07759203016757965,
0.039447903633117676,
0.002398795448243618,
0.09479521214962006,
-0.0056051164865493774,
-0.01703667640686035,
0.021123025566339493,
0.0308785829693079,
-0.07999774068593979,
0.02184564620256424,
-0.017848625779151917,
-0.15678349137306213,
0.011552856303751469,
0.06664784997701645,
-0.11767267435789108,
0.07624942064285278,
0.17198234796524048,
0.01927068643271923,
-0.04529657959938049,
0.179799884557724,
-0.005632374435663223,
-0.09018521010875702,
-0.00014271483814809471,
0.01662413775920868,
-0.03487647324800491,
0.033513881266117096,
-0.05986114218831062,
-0.041649073362350464,
0.03683300316333771,
0.015353051014244556,
-0.069085493683815,
0.04051902890205383,
-0.0016225811559706926,
-0.020024238154292107,
0.12392608076334,
-0.08993016183376312,
0.03349345922470093,
0.007256032899022102,
-0.03416341543197632,
0.013410667888820171,
0.05837427079677582,
-0.1260206401348114,
-0.16141986846923828,
0.03795928880572319,
-0.0460992269217968,
-0.02798718400299549,
-0.10279112309217453,
-0.08575695008039474,
0.01663334295153618,
-0.030893750488758087,
-0.025449803099036217,
-0.13462130725383759,
-0.07893377542495728,
-0.04342178255319595,
0.05775528401136398,
-0.0016684344736859202,
-0.039056699723005295,
-0.014728455804288387,
0.03169107064604759,
0.009284766390919685,
-0.017744315788149834,
0.07896135747432709,
-0.04248231649398804,
0.0763046070933342,
0.03679777681827545,
0.039629094302654266,
-0.037364114075899124,
0.0391954705119133,
-0.07810290902853012,
0.0639103576540947,
-0.08288763463497162,
0.03846832737326622,
0.05353354290127754,
0.05668955296278,
-0.1311400979757309,
-0.10697393119335175,
-0.06809074431657791,
0.02521316520869732,
0.05968501791357994,
0.06471727788448334,
-0.04986042156815529,
0.033468857407569885,
0.15678438544273376,
-0.07583881169557571,
-0.11563074588775635,
0.1274852603673935,
0.019656671211123466,
0.03048556111752987,
0.051341719925403595,
0.1340494155883789,
0.1487104445695877,
-0.08398941159248352,
0.01697065494954586,
0.1054132953286171,
0.02692972868680954,
-0.14605000615119934,
0.07380963116884232,
-0.031459465622901917,
-0.0010655896039679646,
0.023094244301319122,
0.045541368424892426,
0.06109225004911423,
-0.013036856427788734,
-0.035565055906772614,
-0.03767833858728409,
-0.08414019644260406,
-0.04245326295495033,
0.010968014597892761,
0.03866772726178169,
-0.04399285465478897,
-0.06809127330780029,
0.016118910163640976,
0.12171551585197449,
-0.09619849175214767,
0.049425363540649414,
-0.03416699916124344,
-0.042082395404577255,
-0.10024446994066238,
0.02585672400891781,
-0.1354806125164032,
0.022706136107444763,
0.022058585658669472,
-0.014764717780053616,
0.018897967413067818,
0.08622942864894867,
0.03593217581510544,
-0.004032474476844072,
-0.07371992617845535,
-0.04361017048358917,
-0.01322544738650322,
-0.044941727072000504,
-0.13316427171230316,
0.008864669129252434,
-0.09394753724336624,
0.003781192936003208,
-0.02408771403133869,
-0.12319127470254898,
0.02495604194700718,
0.0010689218761399388,
0.006010930985212326,
0.010884658433496952,
-0.027756836265325546,
0.025175971910357475,
0.056728824973106384,
-0.005474933888763189,
-0.08039408177137375,
0.06912865489721298,
0.0719466358423233,
-0.04989083483815193,
-0.039519648998975754,
-0.056168489158153534,
-0.006529077887535095,
0.06766394525766373,
-0.0017013524193316698,
-0.12040574103593826,
-0.03217661380767822,
-0.041784029453992844,
-0.05485614761710167,
-0.048524804413318634,
-0.06059245020151138,
0.15860125422477722,
0.017608433961868286,
0.1770557314157486,
-0.10574255138635635,
-0.045953404158353806,
0.013011296279728413,
0.005026472732424736,
0.06113624572753906,
0.14329242706298828,
0.06429637223482132,
-0.052656207233667374,
0.028089361265301704,
0.015258662402629852,
-0.06325512379407883,
0.08194271475076675,
-0.03732803091406822,
-0.09440363943576813,
0.02399003505706787,
0.0899040549993515,
-0.001807678141631186,
0.1152171790599823,
-0.13060857355594635,
-0.009073816239833832,
-0.004212623927742243,
0.0461849607527256,
0.031171346083283424,
-0.15790149569511414,
0.05750807002186775,
0.04822031781077385,
-0.04489501938223839,
-0.0011352322762832046,
-0.04164022579789162,
-0.05863935127854347,
0.03970084339380264,
0.06415440142154694,
0.02433924935758114,
0.010168886743485928,
0.005398350767791271,
-0.09307075291872025,
0.1949101686477661,
-0.030002834275364876,
-0.19998635351657867,
-0.13034817576408386,
0.0723067969083786,
-0.023389920592308044,
-0.022034190595149994,
0.04461667314171791,
-0.09743566811084747,
-0.030159329995512962,
-0.09537075459957123,
0.05657707527279854,
-0.13424977660179138,
0.053817085921764374,
-0.06445363163948059,
0.05208781361579895,
0.09945748001337051,
-0.11062649637460709,
0.018277721479535103,
-0.014949169009923935,
-0.00927033368498087,
-0.0156641136854887,
-0.025397786870598793,
0.10345230251550674,
0.1555594801902771,
-0.07044407725334167,
0.03554835915565491,
-0.010824360884726048,
0.11968263238668442,
-0.051322467625141144,
0.06729244440793991,
0.21067644655704498,
0.06313513964414597,
0.03722458332777023,
0.032070040702819824,
0.04273224622011185,
-0.05058842897415161,
0.04259662330150604,
0.06335148215293884,
-0.05375506356358528,
-0.17536449432373047,
-0.046655699610710144,
-0.08106078207492828,
0.0472867526113987,
0.17729486525058746,
0.07766278833150864,
-0.11084886640310287,
0.059727396816015244,
-0.010743558406829834,
0.1225966289639473,
-0.07322146743535995,
0.05029217526316643,
0.07967399805784225,
0.004934258759021759,
0.0031342918518930674,
-0.10001108050346375,
-0.025319797918200493,
0.08293420076370239,
0.08846662193536758,
0.17293910682201385,
-0.0965685024857521,
0.16621489822864532,
0.025645621120929718,
0.11265040189027786,
0.005499939899891615,
0.12548400461673737,
-0.08811422437429428,
0.006492683198302984,
-0.00040724797872826457,
-0.012741565704345703,
-0.021343044936656952,
0.03779514878988266,
-0.03986969590187073,
0.03685114160180092,
-0.1260375827550888,
-0.021311521530151367,
0.03244367241859436,
0.25585275888442993,
0.08812376111745834,
-0.2091231495141983,
-0.1418740153312683,
-0.05483604967594147,
-0.10203275084495544,
-0.10758395493030548,
0.08191374689340591,
0.16302946209907532,
-0.04561012610793114,
0.002025207504630089,
-0.040279027074575424,
0.14458680152893066,
-0.11568859219551086,
-0.0038804071955382824,
0.10568952560424805,
0.056215379387140274,
0.01479878555983305,
0.11110559105873108,
-0.19820259511470795,
0.09512921422719955,
0.01566898263990879,
0.08309517800807953,
-0.041725095361471176,
0.032896988093853,
-0.040807243436574936,
0.01685045100748539,
0.08155529201030731,
0.019345808774232864,
0.019648022949695587,
-0.1509820967912674,
-0.0656658411026001,
0.023304680362343788,
0.02872840315103531,
0.0100006815046072,
0.07772242277860641,
-0.03905682638287544,
0.023633718490600586,
-0.020224185660481453,
-0.1088893786072731,
-0.03316761553287506,
-0.13572682440280914,
-0.04439840465784073,
0.04375468194484711,
-0.040805913507938385,
-0.032927919179201126,
0.03025410696864128,
0.06213865429162979,
0.22056736052036285,
-0.11118035763502121,
-0.08352786302566528,
-0.11041731387376785,
0.05262181535363197,
0.11745117604732513,
-0.07817957550287247,
0.05183401703834534,
-0.003613261738792062,
0.013054098933935165,
0.0028001470491290092,
-0.05048953741788864,
0.05826188251376152,
-0.057757992297410965,
-0.0680866539478302,
-0.03335803002119064,
0.10332279652357101,
-0.0459919311106205,
0.035410575568675995,
-0.022239379584789276,
-0.07794421911239624,
-0.0765928328037262,
-0.12612441182136536,
-0.0464259535074234,
-0.07028338313102722,
0.05689237266778946,
-0.05261719226837158,
-0.07081972807645798,
0.17358875274658203,
0.03574232757091522,
-0.05460534617304802,
0.10647861659526825,
0.06443513929843903,
-0.06490520387887955,
-0.031841639429330826,
0.12157300859689713,
-0.024196799844503403,
-0.22049203515052795,
-0.05886928364634514,
0.0352725051343441,
0.02375389076769352,
-0.09219832718372345,
-0.13420814275741577,
0.08954180777072906,
0.04315188154578209,
0.011924296617507935,
0.03588762506842613,
-0.29370835423469543,
-0.11038774996995926,
0.0008087715250439942,
0.05893684923648834,
0.09049396961927414,
-0.1194334626197815,
-0.03163347765803337,
-0.02701667882502079,
-0.038651492446660995,
0.026652608066797256,
0.006207679398357868,
0.12050231546163559,
-0.04837718978524208,
-0.007912986911833286,
0.007961217314004898,
-0.05052774026989937,
0.04244553670287132,
-0.012579221278429031,
0.06542930752038956,
-0.013584643602371216,
0.0404423326253891,
0.06292113661766052,
-0.07581271231174469,
0.16883288323879242,
-0.0961507186293602,
0.07920517772436142,
-0.14699454605579376,
-0.06586090475320816,
-0.04361095279455185,
-0.009378736838698387,
0.003604794619604945,
-0.05336661636829376,
-0.07548686861991882,
-0.004254690837115049,
0.05665120109915733,
-0.037681709975004196,
0.0009543303749524057,
-0.0011316542513668537,
-0.06910963356494904,
0.14499402046203613,
0.0343012735247612,
-0.10972607135772705,
-0.25228217244148254,
0.021678172051906586,
-0.007011158391833305,
0.101739302277565,
-0.20773637294769287,
0.017773369327187538,
0.09492548555135727,
0.02519976533949375,
0.05619640275835991,
0.03236433491110802,
-0.019943583756685257,
0.017259681597352028,
0.05093243718147278,
-0.07356912642717361,
-0.13149122893810272,
-0.038628414273262024,
-0.10377633571624756,
-0.1389378160238266,
0.05632227659225464,
0.07000303268432617,
-0.08292483538389206,
0.013451791368424892,
-0.008636683225631714,
-0.016992829740047455,
-0.08214517682790756,
0.22751884162425995,
0.035614147782325745,
0.06314892321825027,
-0.064546599984169,
0.07384262979030609,
0.09391094744205475,
-0.18349100649356842,
0.0024905449245125055,
0.16573470830917358,
-0.10371341556310654,
-0.05115676298737526,
0.09082958102226257,
0.0014699044404551387,
-0.014655518345534801,
-0.07985107600688934,
-0.12266764789819717,
-0.07535172253847122,
0.062021758407354355,
-0.05668063089251518,
0.05972951278090477,
0.060759056359529495,
-0.03717915713787079,
0.011801219545304775,
-0.13909296691417694,
0.07595039159059525,
0.07251624017953873,
0.04899521917104721,
-0.1530582457780838,
0.14708586037158966,
0.04066184535622597,
0.09516768902540207,
-0.0031373021192848682,
0.027088182047009468,
-0.070029616355896,
0.03672857955098152,
-0.02204800769686699,
0.0011963024735450745,
-0.012881607748568058,
0.018515046685934067,
-0.03741386532783508,
0.049260180443525314,
-0.029843894764780998,
0.058849744498729706,
-0.02203834243118763,
-0.04660314694046974,
-0.028974004089832306,
0.03489984944462776,
-0.02924538403749466,
0.0014220430748537183,
-0.01873638480901718,
-0.05227792635560036,
0.06296219676733017,
-0.056316718459129333,
-0.04546898230910301,
-0.04357235133647919,
0.01810457929968834,
0.025389818474650383,
0.03186141699552536,
0.04792051762342453,
-0.01780693419277668,
0.029477572068572044,
0.034200653433799744,
0.0346980020403862,
-0.00797983631491661,
-0.01675899140536785,
0.06229885667562485,
-0.1427294909954071,
-0.04709991440176964,
-0.12495057284832001,
-0.04058466851711273,
-0.0666469931602478,
0.03498607873916626,
0.08369787037372589,
0.08346366882324219,
0.0977487713098526,
-0.06340204924345016,
0.014753269031643867,
-0.19109486043453217,
-0.019885100424289703,
0.05441352352499962,
-0.009369290433824062,
-0.10570138692855835,
-0.06113245338201523,
0.0672837421298027,
-0.01514518354088068,
0.1243438571691513,
-0.010814870707690716,
0.06219784542918205,
0.01010952703654766,
-0.01087053120136261,
-0.017635324969887733,
-0.018193578347563744,
0.18414267897605896,
-0.0792209804058075,
-0.04900680482387543,
-0.016164081171154976,
0.05845167860388756,
0.062875896692276,
0.23493322730064392,
0.07456857711076736,
0.1318865865468979,
0.061282720416784286,
0.0878838524222374,
-0.0878811776638031,
0.015747519209980965,
-0.15598438680171967,
0.10708114504814148,
-0.020549725741147995,
0.11412478983402252,
-0.06693162769079208,
0.10897594690322876,
0.09651422500610352,
-0.09366475045681,
0.07402873784303665,
0.01730446144938469,
-0.07517789304256439,
-0.025086192414164543,
-0.09797529131174088,
-0.0648389533162117,
-0.14527961611747742,
-0.026763418689370155,
-0.053465306758880615,
-0.033620789647102356,
0.1237446591258049,
0.01869102008640766,
-0.007517021149396896,
0.2048092931509018,
0.004640218801796436,
-0.047137148678302765,
0.0636257603764534,
0.02549813687801361,
0.037332355976104736,
0.09061706066131592,
0.0012917878339067101,
0.06323159486055374,
-0.09145757555961609,
0.08283959329128265,
0.02056916058063507,
-0.020270582288503647,
0.03245645388960838,
0.0385577417910099,
-0.02098613791167736,
-0.057233985513448715,
0.022710565477609634,
0.07484451681375504,
0.1622024029493332,
0.019082453101873398,
-0.063778355717659,
-0.04769653454422951,
0.1767035871744156,
-0.08138548582792282,
-0.055582620203495026,
-0.10675260424613953,
0.14204980432987213,
0.06815875321626663,
-0.01806032657623291,
0.03449179232120514,
-0.074767105281353,
-0.017552044242620468,
0.2683001756668091,
0.11143510043621063,
-0.028534911572933197,
-0.033146485686302185,
0.037499796599149704,
-0.021832138299942017,
-0.014255668967962265,
0.1702527105808258,
0.02671007253229618,
0.21414561569690704,
-0.016909081488847733,
-0.01052749902009964,
-0.028999347239732742,
-0.044121645390987396,
-0.07427380979061127,
0.14900453388690948,
0.01950804702937603,
0.058243848383426666,
-0.07794470340013504,
0.02021438628435135,
0.05696776509284973,
-0.1169198602437973,
0.15287071466445923,
-0.07477502524852753,
-0.0710272416472435,
0.01813318207859993,
0.020311379805207253,
-0.003622101154178381,
0.059039365500211716,
-0.03910320624709129,
0.09314059466123581,
0.05024610087275505,
-0.03209789842367172,
-0.1171904131770134,
-0.1169431209564209,
0.052953921258449554,
0.03760857880115509,
0.14477737247943878,
0.02030458301305771,
0.04482000321149826,
0.08261089771986008,
0.0007560205413028598,
-0.11404656618833542,
0.08270340412855148,
0.016477320343255997,
-0.02217799797654152,
0.07046841084957123,
0.027238473296165466,
-0.03792576119303703,
0.07205304503440857,
-0.0027079894207417965,
-0.066769078373909,
-0.032395847141742706,
-0.02452300488948822,
-0.005279672332108021,
-0.17059184610843658,
-0.01024786476045847,
-0.03713495284318924,
0.1263410449028015,
0.19363650679588318,
-0.04332192242145538,
-0.02640300616621971,
-0.06237085163593292,
0.03536223620176315,
0.008008080534636974,
0.030063113197684288,
-0.008708707988262177,
-0.12991301715373993,
0.006732027977705002,
-0.03152388334274292,
0.0005168701754882932,
-0.19375988841056824,
-0.05909371376037598,
-0.02603868581354618,
-0.03309965133666992,
-0.07736263424158096,
0.12119397521018982,
0.033281926065683365,
0.02431744523346424,
-0.03912454470992088,
-0.0800013542175293,
-0.03438667953014374,
0.062316253781318665,
-0.13486720621585846,
-0.114332415163517
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
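The snippet above assumes a CUDA device (`device=0`). As a small usage sketch, the same pipeline can also run on CPU and summarize several functions in one call; the handling of the return value below assumes the standard pipeline output format (a list of dicts with a `summary_text` key):

```python
# device=-1 selects the CPU; reuses the imports and tokenized_code from above.
cpu_pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_php_transfer_learning_finetune", skip_special_tokens=True),
    device=-1,
)
for result in cpu_pipeline([tokenized_code, tokenized_code]):
    print(result["summary_text"])
```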
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/php/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 18,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score); a minimal sketch for computing BLEU appears after the table.
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
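As a hedged illustration of how BLEU scores like those above can be computed with sacrebleu (the hypotheses and references below are made up, and the original evaluation setup may differ):

```python
import sacrebleu  # pip install sacrebleu

# Hypothetical model outputs and ground-truth docstrings.
hypotheses = ["Update the given table .", "Return true if the table exists ."]
references = [["Updates the given table .", "Returns true if the table exists ."]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(round(bleu.score, 2))
```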
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_php_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 18,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 18,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 18,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
109
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 18,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1182592436671257,
0.07826052606105804,
-0.0014159457059577107,
0.12145854532718658,
0.04586518555879593,
0.02164592035114765,
0.02154499478638172,
0.09264113008975983,
-0.038111113011837006,
0.06550318002700806,
0.06469960510730743,
-0.07739424705505371,
0.04547645524144173,
0.1692025363445282,
0.031921908259391785,
-0.15401281416416168,
-0.037801939994096756,
0.03910402953624725,
-0.09630617499351501,
0.11094555258750916,
0.07115347683429718,
-0.08697375655174255,
0.06228964030742645,
-0.03618558868765831,
-0.08913370966911316,
0.05519716069102287,
-0.031054271385073662,
-0.01470449473708868,
0.09584732353687286,
0.060618020594120026,
0.1132318377494812,
-0.03720946982502937,
0.05680792033672333,
-0.20069250464439392,
0.006012898404151201,
0.029192740097641945,
0.05294708535075188,
0.032736826688051224,
0.05315886810421944,
0.07610804587602615,
0.08512210100889206,
-0.009835595265030861,
0.036681223660707474,
0.061912454664707184,
-0.056684281677007675,
-0.08197814226150513,
-0.05462663248181343,
0.07496973127126694,
0.07533938437700272,
0.11117824912071228,
-0.003939865157008171,
0.001979174092411995,
-0.08410291373729706,
0.0701141208410263,
0.12390455603599548,
-0.22445513308048248,
-0.02139260433614254,
0.09901034832000732,
0.07313647866249084,
0.039319220930337906,
-0.08402738720178604,
-0.03928586095571518,
0.09702493995428085,
0.04392918571829796,
0.0521964393556118,
-0.0938783809542656,
-0.005659176968038082,
0.005592985078692436,
-0.051378339529037476,
-0.06178473308682442,
0.16905848681926727,
0.054756876081228256,
-0.042308155447244644,
-0.11027539521455765,
-0.05894957855343819,
-0.16961044073104858,
0.02095496654510498,
0.014864870347082615,
0.02238350920379162,
0.006004771217703819,
-0.005608587060123682,
-0.022006254643201828,
-0.09629901498556137,
-0.11919253319501877,
0.029919831082224846,
0.011624458245933056,
0.06675191968679428,
0.027491316199302673,
-0.014257539063692093,
0.09130153059959412,
0.03191152960062027,
-0.03954571112990379,
-0.009555162861943245,
0.0296018123626709,
-0.08684705942869186,
0.039882611483335495,
-0.01640424318611622,
-0.0670914351940155,
0.011136761866509914,
0.09188845008611679,
… (embedding vector values elided)
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
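Because the model works best on tokenized input, the raw source should be split into space-separated tokens first. Below is a minimal sketch of one way to do this with Python's standard `tokenize` module; the exact preprocessing used to build the CodeTrans training data may differ, so treat it as an approximation.
```python
import io
import tokenize

def tokenize_python(source: str) -> str:
    """Join the tokens of a Python snippet with single spaces."""
    # Layout tokens carry no content for the model, so they are dropped.
    skip = {tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
            tokenize.DEDENT, tokenize.ENDMARKER, tokenize.COMMENT}
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return " ".join(tok.string for tok in tokens if tok.type not in skip)

print(tokenize_python("def e(message, exit_code=None):\n    sys.exit(exit_code)\n"))
# -> def e ( message , exit_code = None ) : sys . exit ( exit_code )
```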
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/python/large_model.ipynb).
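The same generation can also be run without the pipeline wrapper. A minimal sketch follows; the `max_length` and `num_beams` values are illustrative assumptions, not the settings used for the reported results.
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

model_name = "SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
inputs = tokenizer(tokenized_code, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```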
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
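The inverse square root schedule follows the standard T5 pre-training recipe: the learning rate is held constant for a warmup period and then decays as 1/sqrt(step). A minimal sketch is below; whether CodeTrans used the same 10,000-step warmup as the original T5 setup is an assumption.
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant (1 / sqrt(warmup_steps)) during warmup, then decaying
    # proportionally to 1 / sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))
```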
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
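Scores of this kind can be reproduced for a set of generated docstrings with a standard BLEU toolkit. The sketch below uses `sacrebleu` with hypothetical outputs; the tokenization and smoothing behind the reported numbers may differ, so it is illustrative only.
```python
import sacrebleu  # pip install sacrebleu

hypotheses = ["Log the message and exit with the given code ."]
references = [["Print the log message and exit with the provided code ."]]
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```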
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
… (embedding vector values elided)
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/python/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
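A minimal sketch of an analogous fine-tuning run with the Hugging Face `Seq2SeqTrainer` is shown below. The tiny in-memory dataset, batch size, and learning rate are illustrative assumptions and do not reproduce the TPU setup described above.
```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

model_name = "SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative stand-in for the python-only fine-tuning dataset.
train_dataset = Dataset.from_dict({
    "code": ["def add ( a , b ) : return a + b"],
    "doc": ["Add two numbers and return the result."],
})

def preprocess(batch):
    inputs = tokenizer(batch["code"], max_length=512, truncation=True)
    labels = tokenizer(batch["doc"], max_length=512, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-python-finetune",
    max_steps=500,                  # matches the 500 fine-tuning steps above
    per_device_train_batch_size=1,  # illustrative; the original used batch size 256
    learning_rate=1e-4,             # assumption; the original used AdaFactor
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset.map(preprocess, batched=True),
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```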
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_python_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
… (embedding vector values elided)
0.04359695315361023,
-0.015368849039077759,
-0.0103539964184165,
0.030985817313194275,
0.0097615085542202,
-0.007769196294248104,
0.004753876477479935,
0.08109022676944733,
-0.1530642956495285,
-0.08686917275190353,
-0.08358810842037201,
-0.08187689632177353,
-0.0569116435945034,
0.07092073559761047,
0.0737759917974472,
0.07057660073041916,
0.10324589163064957,
-0.03956449404358864,
0.020933154970407486,
-0.14883625507354736,
-0.048567995429039,
0.04599272459745407,
0.0008085332810878754,
-0.10084398090839386,
-0.03635181859135628,
0.055817488580942154,
-0.03489520400762558,
0.10741203278303146,
-0.015395410358905792,
0.050900641828775406,
-0.007967465557157993,
-0.04903301224112511,
-0.029491448774933815,
0.0026657001581043005,
0.1764499545097351,
-0.10216254740953445,
0.0019039511680603027,
-0.01863470859825611,
0.002334228018298745,
0.033811938017606735,
0.17745886743068695,
0.1032596305012703,
0.11492766439914703,
0.02867179363965988,
0.06909362971782684,
-0.04952368885278702,
-0.030476460233330727,
-0.15011759102344513,
0.04639069363474846,
-0.0295700766146183,
0.05131791532039642,
-0.043244585394859314,
0.1380321979522705,
0.10230474919080734,
-0.12505853176116943,
0.10336428135633469,
0.016846921294927597,
-0.09153932332992554,
-0.05065545812249184,
-0.08636181801557541,
-0.041978269815444946,
-0.10225013643503189,
0.008043617941439152,
-0.09764230251312256,
0.01943385601043701,
0.07843253761529922,
0.034456364810466766,
-0.026466967537999153,
0.1622169464826584,
-0.01974114216864109,
-0.06185963749885559,
0.03243717551231384,
0.04562745988368988,
0.026529565453529358,
0.10528234392404556,
0.027239102870225906,
0.05152084678411484,
-0.07480016350746155,
0.07564932852983475,
0.037016429007053375,
-0.021118367090821266,
0.018831809982657433,
0.009487492963671684,
-0.008345648646354675,
-0.04660532996058464,
-0.0038201631978154182,
0.07051762193441391,
0.17778471112251282,
0.04608435556292534,
-0.04862426966428757,
-0.058199916034936905,
0.20540784299373627,
-0.058524634689092636,
-0.057192474603652954,
-0.12889471650123596,
0.17726141214370728,
0.037913694977760315,
0.01272543240338564,
0.01463292259722948,
-0.081184521317482,
-0.029237400740385056,
0.22605760395526886,
0.0560445673763752,
-0.027460457757115364,
-0.02194279246032238,
-0.007053275126963854,
-0.0068483962677419186,
-0.029805948957800865,
0.15290002524852753,
0.00022055633598938584,
0.24290983378887177,
0.0074605792760849,
0.004788858350366354,
-0.03936867415904999,
-0.044826239347457886,
-0.03187005966901779,
0.19361957907676697,
-0.03473785147070885,
0.026770832017064095,
-0.1040356308221817,
-0.0030356012284755707,
0.028861423954367638,
-0.10865405946969986,
0.11942779272794724,
-0.12520092725753784,
-0.07652491331100464,
0.013888879679143429,
0.057585764676332474,
-0.042261458933353424,
0.04859938099980354,
-0.02598538249731064,
0.051714759320020676,
0.043177589774131775,
-0.03247063234448433,
-0.10815523564815521,
-0.16327495872974396,
0.053085390478372574,
0.0009763787966221571,
0.13705366849899292,
0.021004823967814445,
0.06384052336215973,
0.08226671069860458,
0.004080177750438452,
-0.07678834348917007,
0.08322133123874664,
0.03660789504647255,
-0.011291968636214733,
0.053717996925115585,
0.1300140768289566,
-0.04384630545973778,
0.15608440339565277,
0.006377088837325573,
-0.018500516191124916,
-0.028492042794823647,
-0.03039800003170967,
0.00954246986657381,
-0.14960810542106628,
-0.004270779434591532,
-0.0650014653801918,
0.12984579801559448,
0.2062687873840332,
-0.04787268862128258,
-0.021315976977348328,
-0.05220138654112816,
0.07361838221549988,
-0.010100251995027065,
0.08716288208961487,
0.0009094581473618746,
-0.16783994436264038,
0.008619287051260471,
-0.018946312367916107,
0.005169103387743235,
-0.19494329392910004,
-0.05395255237817764,
-0.03417773172259331,
-0.023218778893351555,
-0.10485062748193741,
0.15692122280597687,
0.061050690710544586,
0.02441406063735485,
-0.038448724895715714,
-0.16521646082401276,
-0.014240099117159843,
0.05161065235733986,
-0.12535962462425232,
-0.11558862775564194
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/python/large_model.ipynb).
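The pipeline expects the space-separated token format shown in `tokenized_code` above. The exact tokenizer used to build the CodeTrans training data is not documented here; as a minimal sketch, Python's standard `tokenize` module can approximate that format from raw source:
```python
# A minimal sketch, assuming the stdlib tokenizer is an acceptable
# approximation of the space-separated format used in training.
import io
import tokenize

def whitespace_tokenize(source: str) -> str:
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Drop structural tokens that carry no surface text.
        if tok.type in (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
                        tokenize.DEDENT, tokenize.ENDMARKER):
            continue
        tokens.append(tok.string)
    return " ".join(tokens)

raw = (
    "def e(message, exit_code=None):\n"
    "    print_log(message, YELLOW, BOLD)\n"
    "    if exit_code is not None:\n"
    "        sys.exit(exit_code)\n"
)
print(whitespace_tokenize(raw))
# def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )
```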
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
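For reference, a minimal sketch of that schedule is shown below, assuming the standard T5 formulation `lr = 1 / sqrt(max(step, warmup_steps))`; the 10,000-step warmup is an assumption, as the card does not state it:
```python
# A minimal sketch of the inverse square root schedule, assuming the standard
# T5 form lr = 1 / sqrt(max(step, warmup_steps)); the 10,000-step warmup is
# an assumption, not a value stated in this card.
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1))        # 0.01, constant during warmup
print(inverse_sqrt_lr(240_000))  # ~0.00204 at the end of pre-training
```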
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
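The numbers above are corpus-level BLEU. A minimal sketch of how such a score can be computed, assuming NLTK as a stand-in for the original evaluation script (which is not shown here):
```python
# A minimal sketch of corpus-level BLEU with NLTK; the original evaluation
# script is not shown here, so this is illustrative only.
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# One list of references per hypothesis; tokens are whitespace-split words.
references = [[["prints", "a", "message", "and", "optionally", "exits", "."]]]
hypotheses = [["prints", "a", "message", "then", "optionally", "exits", "."]]

score = corpus_bleu(
    references,
    hypotheses,
    smoothing_function=SmoothingFunction().method4,  # smoothing for short texts
)
print(f"BLEU: {100 * score:.2f}")
```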
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_python_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10289273411035538,
0.0724584087729454,
-0.0008943566936068237,
0.10911059379577637,
0.04860978573560715,
0.02915826253592968,
0.033914972096681595,
0.1109130010008812,
-0.031466078013181686,
0.058023352175951004,
0.049680545926094055,
-0.05311482772231102,
0.06741348654031754,
0.18842127919197083,
0.022871380671858788,
-0.15103092789649963,
-0.03524332121014595,
0.038758765906095505,
-0.0730607882142067,
0.11302974075078964,
0.07066098600625992,
-0.07711288332939148,
0.07587452232837677,
-0.045489367097616196,
-0.11716165393590927,
0.04474018141627312,
-0.01957353763282299,
-0.017844930291175842,
0.09834263473749161,
0.05976055935025215,
0.12763532996177673,
-0.038440633565187454,
0.07537231594324112,
-0.2065848857164383,
-0.0009102404583245516,
0.03266679123044014,
0.060011863708496094,
0.040305543690919876,
0.0502791590988636,
0.07065287232398987,
0.12942463159561157,
-0.023201774805784225,
0.051436975598335266,
0.05541461333632469,
-0.06608543545007706,
-0.06225523352622986,
-0.06853578984737396,
0.07188159227371216,
0.08498737215995789,
0.10688231885433197,
-0.006295912899076939,
0.04818875715136528,
-0.07399889081716537,
0.08859087526798248,
0.1191471517086029,
-0.2290695607662201,
-0.01967921294271946,
0.0947563499212265,
0.0817214772105217,
0.04140898585319519,
-0.07516686618328094,
-0.044048067182302475,
0.10417628288269043,
0.04865552857518196,
0.056875355541706085,
-0.08882305026054382,
-0.0285949669778347,
-0.018360858783125877,
-0.061628587543964386,
-0.05348166078329086,
0.1746348887681961,
0.030835522338747978,
-0.060739051550626755,
-0.09824027866125107,
-0.049511272460222244,
-0.1942761391401291,
0.039430830627679825,
0.022276602685451508,
0.0013785954797640443,
-0.00930614024400711,
0.007544942665845156,
-0.002715711947530508,
-0.09223370254039764,
-0.10692847520112991,
0.026014631614089012,
0.02449057064950466,
0.06313913315534592,
0.035646628588438034,
-0.03491235896945,
0.09788156300783157,
0.04257075861096382,
-0.038058824837207794,
-0.006847538519650698,
0.0160023532807827,
-0.10896622389554977,
0.0005081877461634576,
0.0032254832331091166,
-0.0610152930021286,
-0.017445707693696022,
0.08518794924020767,
-0.09235502779483795,
0.07280468195676804,
0.09626852720975876,
0.008909206837415695,
0.014239785261452198,
0.20567519962787628,
0.046399399638175964,
-0.1406792551279068,
0.0246815737336874,
0.026279354467988014,
-0.007126238662749529,
0.015099257230758667,
-0.048207949846982956,
-0.04654872417449951,
0.024723125621676445,
0.07357293367385864,
-0.10894612222909927,
0.026039833202958107,
-0.06224525719881058,
-0.01587465964257717,
0.05052865669131279,
-0.12480634450912476,
0.035517118871212006,
0.012353929691016674,
-0.056332796812057495,
-0.02394937165081501,
0.09677878022193909,
-0.13601703941822052,
-0.12430805712938309,
0.046490564942359924,
-0.04619511961936951,
-0.034080296754837036,
-0.11560478061437607,
-0.12503105401992798,
-0.008514763787388802,
-0.012743194587528706,
-0.00028603331884369254,
-0.10497865825891495,
-0.08395890146493912,
-0.013771609403192997,
0.044076573103666306,
0.002979113720357418,
-0.03207849711179733,
-0.041338276118040085,
0.004168527666479349,
-0.0021093229297548532,
-0.016442598775029182,
0.0063504669815301895,
-0.03639348968863487,
0.10336541384458542,
0.07776697725057602,
0.04001034423708916,
-0.008062140084803104,
0.022678963840007782,
-0.08336243778467178,
0.07649069279432297,
-0.08337435126304626,
0.05071103572845459,
-0.011279481463134289,
0.06968891620635986,
-0.10521074384450912,
-0.08362392336130142,
0.01326773501932621,
0.048759184777736664,
0.07372311502695084,
0.04496714472770691,
-0.13012000918388367,
0.028145765885710716,
0.15188340842723846,
-0.10873878747224808,
-0.13342443108558655,
0.11187927424907684,
-0.008044395595788956,
0.039355285465717316,
0.07298669219017029,
0.11701994389295578,
0.16535161435604095,
-0.11117719113826752,
-0.03836406394839287,
0.0834261029958725,
0.052786946296691895,
-0.06226380169391632,
0.053919717669487,
0.013683426193892956,
-0.018540358170866966,
0.018049687147140503,
0.06238047778606415,
0.06596404314041138,
-0.004325691610574722,
-0.03578175604343414,
-0.03598954156041145,
-0.09484004974365234,
-0.039469100534915924,
-0.010957279242575169,
0.017096666619181633,
-0.053585685789585114,
-0.06982932239770889,
0.006630662363022566,
0.17546431720256805,
-0.09610846638679504,
0.023992860689759254,
-0.07938925921916962,
-0.024582868441939354,
-0.06974750757217407,
0.018673943355679512,
-0.11259008944034576,
0.01809355430305004,
0.05300221964716911,
-0.02334550768136978,
0.058547209948301315,
0.07717244327068329,
0.003661323571577668,
0.018737124279141426,
-0.051169488579034805,
-0.03987162932753563,
-0.03805813938379288,
-0.0711999163031578,
-0.11638740450143814,
-0.018138399347662926,
-0.09610673040151596,
-0.01990109123289585,
-0.04458368569612503,
-0.177369624376297,
0.0019059708574786782,
-0.0020225911866873503,
0.024136822670698166,
0.02581067755818367,
-0.03403822332620621,
0.025644203647971153,
0.05180446431040764,
-0.0472540408372879,
-0.0818425714969635,
0.01455072220414877,
0.033551208674907684,
-0.08442626148462296,
-0.05303264409303665,
-0.10402057319879532,
-0.071367546916008,
0.08185067027807236,
0.1043258085846901,
-0.12857244908809662,
0.0028489616233855486,
-0.024239199236035347,
-0.03960993513464928,
-0.057348039001226425,
-0.06060713902115822,
0.16468043625354767,
0.013606438413262367,
0.16222049295902252,
-0.12855948507785797,
-0.060366492718458176,
-0.03296544402837753,
0.00849685538560152,
0.03620533272624016,
0.14660605788230896,
0.02617446705698967,
-0.0882672667503357,
0.033998169004917145,
-0.046724990010261536,
-0.04885819926857948,
0.14809943735599518,
-0.015156841836869717,
-0.07498481869697571,
0.007348094135522842,
0.1124265193939209,
0.007863556034862995,
0.18915420770645142,
-0.034745123237371445,
0.009127168916165829,
-0.01032216101884842,
0.01353801041841507,
0.038052093237638474,
-0.1273416429758072,
0.025570448487997055,
0.047400206327438354,
-0.05805876478552818,
-0.04459453001618385,
-0.035471584647893906,
-0.03306708112359047,
0.045837245881557465,
0.01499680895358324,
0.033141542226076126,
-0.026234086602926254,
-0.027649054303765297,
-0.10942825675010681,
0.18726520240306854,
-0.07276502251625061,
-0.2188970446586609,
-0.16861169040203094,
0.08135486394166946,
-0.006003840826451778,
-0.021359870210289955,
0.027533210813999176,
-0.09203970432281494,
-0.05914606526494026,
-0.09686044603586197,
0.14098966121673584,
-0.12426651269197464,
0.0012003807350993156,
-0.03986054286360741,
0.056515563279390335,
0.058793507516384125,
-0.16835832595825195,
0.035603366792201996,
-0.010896624065935612,
0.004465833306312561,
-0.006180244032293558,
-0.06769542396068573,
0.07496964186429977,
0.11160864681005478,
-0.07433857768774033,
0.013647885993123055,
-0.007500295527279377,
0.18097902834415436,
-0.05902419611811638,
0.04196590185165405,
0.16181263327598572,
0.010876261629164219,
0.029469136148691177,
0.046861182898283005,
0.009535465389490128,
-0.10066342353820801,
0.06764765828847885,
0.04534480348229408,
-0.03152959421277046,
-0.212846577167511,
-0.03233307972550392,
-0.07964427024126053,
0.07858949899673462,
0.141008660197258,
0.04489253833889961,
-0.16368742287158966,
0.02571709081530571,
-0.013698591850697994,
0.1650649756193161,
-0.033976681530475616,
0.06339666992425919,
0.00012550306564662606,
0.016299985349178314,
-0.004307394381612539,
-0.10279277712106705,
0.003937528934329748,
0.07623829692602158,
0.10948989540338516,
0.19691906869411469,
-0.09508463740348816,
0.16087406873703003,
-0.0031298967078328133,
0.12111736834049225,
0.03988759592175484,
0.11302094906568527,
-0.13057193160057068,
0.015545137226581573,
0.016510477289557457,
-0.0136018767952919,
-0.07186184078454971,
0.043263524770736694,
-0.025352759286761284,
0.07257076352834702,
-0.07691314071416855,
0.00559934601187706,
0.009972241707146168,
0.18641474843025208,
0.09226597100496292,
-0.16975657641887665,
-0.1269504725933075,
0.0067343758419156075,
-0.09608910232782364,
-0.11072323471307755,
0.07061731815338135,
0.20299792289733887,
-0.06589534878730774,
0.018806297332048416,
-0.02277691662311554,
0.1358206421136856,
-0.10647871345281601,
-0.02882135845720768,
0.027493027970194817,
0.053179290145635605,
-0.0026635185349732637,
0.11121191829442978,
-0.2523258924484253,
0.07687469571828842,
0.018446166068315506,
0.09381601959466934,
-0.01763729564845562,
0.05730915069580078,
-0.051430266350507736,
-0.0026406561955809593,
0.07707861810922623,
0.01272776909172535,
-0.02939966879785061,
-0.2027241736650467,
-0.054206427186727524,
0.024473663419485092,
0.04840533062815666,
-0.006003255490213633,
0.09701020270586014,
-0.029000332579016685,
0.046142056584358215,
-0.027746113017201424,
-0.1479332596063614,
-0.05595938116312027,
-0.1413431465625763,
-0.04209202900528908,
-0.01629306748509407,
-0.04644564539194107,
-0.024500148370862007,
0.045947544276714325,
0.05631301924586296,
0.22750593721866608,
-0.16242916882038116,
-0.07818576693534851,
-0.0959913581609726,
0.0664188489317894,
0.1351260095834732,
-0.08328527957201004,
0.02696829102933407,
0.022250046953558922,
0.04454371705651283,
-0.044494032859802246,
-0.08048389852046967,
0.030143288895487785,
-0.05655299872159958,
-0.07055018842220306,
-0.029036756604909897,
0.11421757191419601,
-0.00429309718310833,
0.053728919476270676,
0.008714481256902218,
-0.08127913624048233,
-0.039928629994392395,
-0.12157963216304779,
-0.0895979180932045,
-0.027554964646697044,
0.03392031416296959,
0.006319308187812567,
-0.12990856170654297,
0.07097359001636505,
-0.018171513453125954,
-0.0773993730545044,
0.07672832906246185,
0.16585096716880798,
-0.07374796271324158,
0.02198617346584797,
0.05085824429988861,
-0.056409433484077454,
-0.18006746470928192,
-0.032949987798929214,
0.041259799152612686,
0.07996606826782227,
-0.01857590116560459,
-0.14961378276348114,
0.05965188145637512,
-0.0016759736463427544,
0.025955164805054665,
0.04192071035504341,
-0.2694435715675354,
-0.12542736530303955,
0.012629744596779346,
0.07067788392305374,
0.031634967774152756,
-0.10043104737997055,
-0.03646332025527954,
-0.059906426817178726,
-0.05510493367910385,
0.0719388872385025,
0.07749919593334198,
0.10957314074039459,
-0.03669087961316109,
0.048920512199401855,
0.049477219581604004,
-0.026157474145293236,
0.05629269778728485,
-0.02651222050189972,
0.09288293123245239,
-0.019623825326561928,
0.017512256279587746,
0.037799108773469925,
-0.06017389893531799,
0.18526691198349,
-0.17710183560848236,
0.10205385088920593,
-0.2053503692150116,
-0.039657313376665115,
-0.025851722806692123,
0.0021013154182583094,
-0.04012228175997734,
-0.05028663948178291,
-0.12364668399095535,
0.032573748379945755,
0.06245456263422966,
-0.027898747473955154,
0.02120664156973362,
-0.009443948976695538,
-0.038058169186115265,
0.04588785022497177,
0.08519060909748077,
0.014799590222537518,
-0.1405690312385559,
0.040138129144907,
0.021766524761915207,
0.09296678006649017,
-0.18168242275714874,
0.020720224827528,
0.11357135325670242,
0.024474751204252243,
0.09098976850509644,
0.011025089770555496,
-0.08855196088552475,
0.018437154591083527,
0.07130910456180573,
-0.06804736703634262,
-0.08436416834592819,
-0.011518317274749279,
-0.02467348985373974,
-0.10371488332748413,
0.02169591560959816,
0.09103967249393463,
-0.06424349546432495,
-0.012891304679214954,
-0.0028263963758945465,
0.009109930135309696,
-0.0834929570555687,
0.18050086498260498,
0.02036520093679428,
0.07686681300401688,
-0.05371160805225372,
0.06729793548583984,
0.09045535326004028,
-0.0966508612036705,
0.025106552988290787,
0.15817858278751373,
-0.08226248621940613,
-0.021820824593305588,
0.10746867954730988,
0.09889783710241318,
-0.03465797007083893,
-0.03944169357419014,
-0.08326653391122818,
-0.07345570623874664,
0.011722101829946041,
0.044670525938272476,
0.06428166478872299,
0.08639460057020187,
-0.027467792853713036,
-0.0006295610219240189,
-0.13520681858062744,
0.09132975339889526,
0.08007091283798218,
0.046971190720796585,
-0.13759319484233856,
0.16209080815315247,
0.031324297189712524,
0.07200979441404343,
0.0023477065842598677,
0.033066634088754654,
-0.11412615329027176,
0.041472285985946655,
-0.016330452635884285,
0.03386453539133072,
0.0014710263349115849,
0.047395192086696625,
-0.034341584891080856,
0.04517163336277008,
-0.02627359889447689,
0.039980098605155945,
-0.04129457473754883,
-0.027591943740844727,
-0.036326032131910324,
0.010841012932360172,
-0.054187629371881485,
-0.012494482100009918,
0.013972950167953968,
-0.09334422647953033,
0.08522652089595795,
-0.058080703020095825,
-0.0060259192250669,
0.008954438380897045,
0.019076108932495117,
0.04748990759253502,
0.01095971092581749,
0.04978252202272415,
-0.007130080368369818,
0.00014707808441016823,
0.02705795131623745,
0.017265932634472847,
-0.005161491222679615,
0.0003391492646187544,
0.09101583808660507,
-0.14259685575962067,
-0.08602340519428253,
-0.08549627661705017,
-0.06925791501998901,
-0.059123553335666656,
0.07605747133493423,
0.07204409688711166,
0.06903126090765,
0.1015079915523529,
-0.03922651335597038,
0.015866369009017944,
-0.15737752616405487,
-0.04731312394142151,
0.04782775044441223,
0.001238497905433178,
-0.0942491963505745,
-0.03572797402739525,
0.062238723039627075,
-0.03118697553873062,
0.10158496350049973,
-0.014270736835896969,
0.04000435397028923,
-0.010639232583343983,
-0.05978161469101906,
-0.0403403677046299,
0.0037622209638357162,
0.18938544392585754,
-0.10418681800365448,
0.005691425409168005,
-0.01895085535943508,
0.0024515201803296804,
0.03162660077214241,
0.16831649839878082,
0.11686154454946518,
0.1185314804315567,
0.01400084514170885,
0.07404609769582748,
-0.051479045301675797,
-0.03435290977358818,
-0.12712523341178894,
0.03771622106432915,
-0.0305631160736084,
0.04238863289356232,
-0.03211059048771858,
0.14482399821281433,
0.08595329523086548,
-0.1269969791173935,
0.10232162475585938,
0.008092576637864113,
-0.09930755198001862,
-0.04361043870449066,
-0.07849244028329849,
-0.036208730190992355,
-0.09894505143165588,
0.005570271518081427,
-0.10007583349943161,
0.002981362398713827,
0.06283527612686157,
0.03308359533548355,
-0.028458718210458755,
0.16253499686717987,
-0.03221254050731659,
-0.062390509992837906,
0.03397933766245842,
0.04950842261314392,
0.0182990450412035,
0.08886561542749405,
0.028465408831834793,
0.048916786909103394,
-0.07152006030082703,
0.07467345148324966,
0.03154415264725685,
-0.018211983144283295,
0.02777743898332119,
0.029266357421875,
-0.00756290415301919,
-0.0454690046608448,
-0.011198540218174458,
0.07663769274950027,
0.17223595082759857,
0.04423791170120239,
-0.04235011711716652,
-0.05946413427591324,
0.1941760927438736,
-0.06090650334954262,
-0.060687512159347534,
-0.12839730083942413,
0.17887477576732635,
0.03097492828965187,
0.011946722865104675,
0.015894019976258278,
-0.08042071014642715,
-0.019538674503564835,
0.24443559348583221,
0.05821564421057701,
-0.04399988427758217,
-0.02450629509985447,
-0.012761353515088558,
-0.006862394977360964,
-0.04121987894177437,
0.1541721373796463,
0.009057922288775444,
0.24572357535362244,
0.006446836516261101,
-0.01042266096919775,
-0.042355526238679886,
-0.04134924337267876,
-0.011532852426171303,
0.1881079375743866,
-0.03404294326901436,
0.0269901305437088,
-0.10311247408390045,
-0.004697192460298538,
0.021753834560513496,
-0.13374675810337067,
0.11847875267267227,
-0.12691637873649597,
-0.06969349831342697,
0.010303246788680553,
0.05032411962747574,
-0.0490393191576004,
0.04903450608253479,
-0.02868781052529812,
0.06321003288030624,
0.04432188346982002,
-0.02692939154803753,
-0.11055655032396317,
-0.15663056075572968,
0.055297620594501495,
-0.005214873701334,
0.13275891542434692,
0.01986338011920452,
0.07542155683040619,
0.07842820137739182,
0.0068224212154746056,
-0.07377821207046509,
0.07716459035873413,
0.034583140164613724,
-0.0035730968229472637,
0.0518738329410553,
0.12405966222286224,
-0.04640207067131996,
0.16623175144195557,
0.0030864630825817585,
-0.026790734380483627,
-0.02787117287516594,
-0.03037971630692482,
0.003926802892237902,
-0.15265879034996033,
-0.007758545223623514,
-0.06265203654766083,
0.13636735081672668,
0.20627380907535553,
-0.05106892064213753,
-0.010718630626797676,
-0.05293086916208267,
0.0745784267783165,
-0.005025885999202728,
0.07797157764434814,
0.001281142234802246,
-0.16503176093101501,
-0.002914660843089223,
-0.04084451124072075,
0.0007676874520257115,
-0.199607253074646,
-0.04731836915016174,
-0.04290987178683281,
-0.03713672235608101,
-0.10686364024877548,
0.15549059212207794,
0.0641147717833519,
0.031932104378938675,
-0.04301123321056366,
-0.12124136835336685,
-0.01419626735150814,
0.050556015223264694,
-0.1256515085697174,
-0.11831307411193848
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/ruby/base_model.ipynb).
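As with the other CodeTrans cards, the pipeline expects space-separated tokens like the `tokenized_code` example above. The tokenizer used to prepare the ruby training data is not documented here; a rough, regex-based approximation in Python:
```python
# A rough, regex-based sketch of the space-separated token format for ruby
# source; the real preprocessing pipeline is not documented here, so treat
# this as illustrative only.
import re

# Identifiers (including ruby's trailing ? and !), numbers, two-character
# operators, then any other single non-space character.
TOKEN = re.compile(r"[A-Za-z_][A-Za-z0-9_?!]*|\d+|\|\||&&|<=|>=|==|\S")

def rough_ruby_tokenize(source: str) -> str:
    return " ".join(TOKEN.findall(source))

raw = "def add(severity, progname, &block)\n  return true if io.nil? || severity < level\nend"
print(rough_ruby_tokenize(raw))
# def add ( severity , progname , & block ) return true if io . nil? || severity < level end
```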
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. (We trained for 260,000 steps in total.)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. (We trained for 260,000 steps in total.)
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
133
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10495331883430481,
-0.021562468260526657,
0.0003362499119248241,
0.126540869474411,
0.10784327238798141,
0.004584750160574913,
0.1169055625796318,
0.08237740397453308,
-0.08195937424898148,
0.028175290673971176,
0.09702577441930771,
0.06693840771913528,
0.03340931236743927,
0.12459132075309753,
0.012176916934549809,
-0.11608993262052536,
-0.030695652589201927,
0.05214698240160942,
-0.12140081077814102,
0.13864560425281525,
0.10722538828849792,
-0.09460602700710297,
0.09525666385889053,
-0.058232683688402176,
-0.2506906986236572,
0.05277485400438309,
-0.03428677096962929,
-0.02685127779841423,
0.10747135430574417,
0.04457532986998558,
0.1290149986743927,
0.012526450678706169,
0.026954196393489838,
-0.14227966964244843,
0.008818240836262703,
0.008146335370838642,
0.04113198444247246,
0.04144803434610367,
0.07262599468231201,
0.08125758916139603,
0.16077718138694763,
-0.03779109939932823,
0.035205282270908356,
0.05739979073405266,
-0.06335295736789703,
-0.07925409078598022,
-0.0761805847287178,
0.09075330942869186,
0.10186589509248734,
0.09775492548942566,
-0.008112093433737755,
0.14697152376174927,
-0.06937573850154877,
0.1225898414850235,
0.1334180384874344,
-0.24644659459590912,
-0.051242999732494354,
0.11465917527675629,
0.0662383958697319,
0.02561350353062153,
-0.04006452485918999,
-0.0349842868745327,
0.12864267826080322,
0.0563734732568264,
-0.007966994307935238,
-0.0988788977265358,
-0.07104723155498505,
0.02446037344634533,
-0.10070895403623581,
-0.059523019939661026,
0.20322374999523163,
0.0171968974173069,
-0.0885126069188118,
-0.046787653118371964,
-0.02150280401110649,
-0.1211584135890007,
0.07469210773706436,
0.0012807176681235433,
0.0031055458821356297,
-0.03964214026927948,
-0.0539865642786026,
0.0175374373793602,
-0.10260035842657089,
-0.14655622839927673,
0.03716958686709404,
0.034550439566373825,
0.06783635914325714,
0.030863311141729355,
-0.0812007263302803,
0.09609847515821457,
0.06162382289767265,
-0.04025592654943466,
-0.015920421108603477,
-0.04003545269370079,
-0.0763421282172203,
0.004952495452016592,
-0.05925114080309868,
-0.17728158831596375,
0.04551566392183304,
0.08103156089782715,
-0.05919777974486351,
0.06500257551670074,
0.07402069121599197,
0.051120102405548096,
-0.019493786618113518,
0.21002237498760223,
0.06302427500486374,
-0.07496846467256546,
-0.0006245857803151011,
0.008751831948757172,
-0.054884105920791626,
0.019874779507517815,
-0.07832016795873642,
-0.06025603413581848,
0.11015743762254715,
0.04819155111908913,
-0.1077660396695137,
0.045401543378829956,
-0.02348928153514862,
-0.024914482608437538,
-0.00008079707185970619,
-0.0898144543170929,
-0.008802331984043121,
-0.005166308488696814,
-0.07708214968442917,
-0.06815627217292786,
0.06648878753185272,
-0.10027306526899338,
-0.09235228598117828,
-0.027334116399288177,
-0.04406711831688881,
-0.014652825891971588,
-0.1464030146598816,
-0.13201159238815308,
-0.007466534152626991,
-0.05217243731021881,
0.036005061119794846,
-0.13450850546360016,
-0.15101027488708496,
-0.05701124295592308,
0.05497242882847786,
0.001447026152163744,
0.0068997167982161045,
-0.05440767481923103,
-0.001486142398789525,
-0.0167859960347414,
-0.016669785603880882,
0.034287791699171066,
-0.005295692477375269,
0.11944414675235748,
0.12057894468307495,
0.027546580880880356,
-0.025447996333241463,
0.02681512013077736,
-0.07809711247682571,
0.05173380672931671,
-0.12459735572338104,
0.0603247731924057,
0.007092316169291735,
0.10703904926776886,
-0.10423283278942108,
-0.10649800300598145,
0.051256079226732254,
0.04520081356167793,
0.08611979335546494,
0.06940246373414993,
-0.13452990353107452,
0.01638348400592804,
0.11495863646268845,
-0.1072564423084259,
-0.16857247054576874,
0.12009639292955399,
-0.035504695028066635,
0.08564820140600204,
0.06142432242631912,
0.1336718052625656,
0.10806657373905182,
-0.018688248470425606,
-0.022282835096120834,
0.05554764345288277,
0.019636956974864006,
-0.08889210969209671,
0.1122899204492569,
0.05708226561546326,
-0.11501417309045792,
0.006345327477902174,
0.017860954627394676,
0.050278328359127045,
-0.05286384001374245,
-0.021198799833655357,
-0.04390111565589905,
-0.10164927691221237,
0.025126516819000244,
-0.0060307951644063,
0.09216278791427612,
-0.07255446910858154,
-0.07895011454820633,
-0.004609218332916498,
0.13309773802757263,
-0.0842173621058464,
0.005095612723380327,
-0.11429222673177719,
-0.004782587289810181,
-0.09231357276439667,
-0.005775844678282738,
-0.16984783113002777,
0.07216718047857285,
0.05282846838235855,
0.025748280808329582,
0.0795026645064354,
0.10824625939130783,
0.027005556970834732,
0.034934915602207184,
-0.0038781098555773497,
-0.04427005723118782,
-0.07170815020799637,
-0.0577428974211216,
-0.11054156720638275,
-0.05895491689443588,
-0.09245122969150543,
-0.04082927107810974,
-0.0019741228315979242,
-0.21389086544513702,
0.02103053778409958,
-0.030578473582863808,
0.008381925523281097,
0.013789713382720947,
-0.009102039970457554,
0.011129670776426792,
0.06879609823226929,
-0.0772651731967926,
-0.051256269216537476,
0.03744085505604744,
0.04197772219777107,
-0.08860078454017639,
-0.06725006550550461,
-0.06556962430477142,
-0.053301021456718445,
0.06649961322546005,
-0.03278456628322601,
-0.11942422389984131,
0.03250137344002724,
-0.026707442477345467,
-0.04648628085851669,
0.014846064150333405,
-0.057443246245384216,
0.22444690763950348,
-0.02090720273554325,
0.15817901492118835,
-0.1180134266614914,
-0.03180505335330963,
-0.022756099700927734,
-0.01572030782699585,
0.0683135986328125,
0.16106866300106049,
0.002404573140665889,
-0.1641363799571991,
0.048857156187295914,
0.0013878220925107598,
-0.08794715255498886,
0.09776867181062698,
-0.040862489491701126,
-0.04511082544922829,
0.002760381205007434,
0.10515278577804565,
0.0030979544389992952,
0.11917412281036377,
-0.12155622243881226,
-0.025363001972436905,
-0.0023751435801386833,
-0.0064237359911203384,
0.0438871905207634,
-0.1573394238948822,
0.015526258386671543,
0.05277734994888306,
-0.0120614655315876,
-0.06276385486125946,
-0.004551181104034185,
-0.04606325924396515,
0.02932077646255493,
0.012402115389704704,
-0.02437053993344307,
0.02412080019712448,
-0.014800077304244041,
-0.12643012404441833,
0.2237427979707718,
-0.04393874108791351,
-0.17305143177509308,
-0.15910877287387848,
0.07557376474142075,
-0.020710699260234833,
-0.00539221428334713,
0.07444269210100174,
-0.11757536232471466,
-0.009280838072299957,
-0.05817050486803055,
0.16048480570316315,
-0.1198248341679573,
0.03277235105633736,
0.016815347597002983,
0.06807709485292435,
0.0187817569822073,
-0.14336182177066803,
0.025963885709643364,
-0.0025670721661299467,
-0.0075814505107700825,
0.029218966141343117,
-0.11093684285879135,
0.0925801545381546,
0.11308234184980392,
-0.07873973250389099,
0.023127073422074318,
-0.02411196567118168,
0.2113547921180725,
-0.042137011885643005,
-0.04812350124120712,
0.2067137509584427,
-0.039337921887636185,
0.0049019502475857735,
0.02621438354253769,
-0.00271986098960042,
-0.07550954073667526,
0.07775497436523438,
-0.013820038177073002,
-0.033207111060619354,
-0.2900180518627167,
0.005445197224617004,
-0.04118352755904198,
0.079653799533844,
0.06781250983476639,
0.057072851806879044,
-0.07237474620342255,
0.06345504522323608,
0.04594014585018158,
0.16004423797130585,
-0.05028413236141205,
0.07564777135848999,
-0.0567503497004509,
0.023661721497774124,
-0.005153454840183258,
-0.10920946300029755,
-0.021421004086732864,
0.044084060937166214,
0.05040459707379341,
0.2562134563922882,
-0.03564152494072914,
0.12434032559394836,
0.04966862499713898,
0.05604546517133713,
0.021535860374569893,
0.15129348635673523,
-0.11378993839025497,
0.012791414745151997,
0.0007322232704609632,
-0.01566420868039131,
-0.10109898447990417,
0.05004832521080971,
-0.05625966936349869,
0.07249245792627335,
-0.11412358283996582,
0.02239835448563099,
0.00848687905818224,
0.17420843243598938,
0.06833437830209732,
-0.25392869114875793,
-0.12411877512931824,
-0.025589479133486748,
-0.08159598708152771,
-0.08011472225189209,
0.0376104936003685,
0.16372908651828766,
-0.08637665957212448,
-0.008547613397240639,
-0.05338625982403755,
0.14055019617080688,
-0.05429203808307648,
0.002037342870607972,
0.006602148525416851,
-0.010267253965139389,
0.017319418489933014,
0.13345618546009064,
-0.3142387270927429,
0.17292597889900208,
-0.003218169556930661,
0.07002884894609451,
-0.03808854892849922,
0.03849264234304428,
-0.04307227581739426,
0.04143298417329788,
0.09104891121387482,
-0.018686795607209206,
-0.09252190589904785,
-0.17037910223007202,
0.0005023623816668987,
0.02142902836203575,
0.08402150869369507,
0.01935625448822975,
0.10847917199134827,
-0.014886838383972645,
0.06654428690671921,
-0.015592443756759167,
-0.04458288103342056,
-0.12338937819004059,
-0.0628131628036499,
-0.02493835985660553,
-0.0548429973423481,
-0.03128952533006668,
-0.0445728562772274,
0.008659468963742256,
0.0668695867061615,
0.15264618396759033,
-0.1620282083749771,
-0.05928925797343254,
-0.06572504341602325,
0.07244359701871872,
0.08360516279935837,
-0.08131784945726395,
0.004892465192824602,
0.018110008910298347,
0.012070409022271633,
-0.002396214986220002,
-0.07064913213253021,
0.033369626849889755,
-0.053750261664390564,
-0.07907379418611526,
-0.03294353932142258,
0.024101167917251587,
-0.008850493468344212,
0.05163811519742012,
0.01908850111067295,
-0.06306008249521255,
-0.07196706533432007,
-0.0895988941192627,
-0.06658241897821426,
-0.08571789413690567,
0.04309980198740959,
0.00993178877979517,
-0.0851903185248375,
0.09321057051420212,
-0.04671167582273483,
-0.06793949753046036,
0.1474798321723938,
0.14512182772159576,
-0.056339867413043976,
-0.0421656034886837,
0.08212874084711075,
-0.076184943318367,
-0.19434039294719696,
-0.004625917412340641,
0.07848106324672699,
0.08079338073730469,
-0.030324259772896767,
-0.15575531125068665,
0.06031671538949013,
-0.014994672499597073,
0.030193999409675598,
-0.0060038259252905846,
-0.2856297194957733,
-0.10955427587032318,
0.04662393778562546,
0.10079418122768402,
0.11276978999376297,
-0.14208419620990753,
-0.03905247151851654,
-0.06608237326145172,
-0.08364160358905792,
0.03659522160887718,
-0.03916379064321518,
0.11668454855680466,
-0.05420466512441635,
0.03422880545258522,
0.03779495134949684,
-0.05814507231116295,
0.0846414864063263,
-0.009339320473372936,
0.1198035478591919,
-0.03933877497911453,
-0.04853637143969536,
0.06890501081943512,
-0.07029932737350464,
0.1476733237504959,
-0.1680935174226761,
0.10085232555866241,
-0.2260635644197464,
-0.04222425818443298,
-0.021085264161229134,
-0.010649454779922962,
-0.029742740094661713,
-0.04704893007874489,
-0.0885864868760109,
0.025610540062189102,
0.007683455478399992,
-0.00656109768897295,
0.10202468186616898,
0.004520423244684935,
0.014307104051113129,
0.09774298220872879,
0.09459881484508514,
0.07112041860818863,
-0.11725950241088867,
0.04241784289479256,
0.04195419326424599,
0.101618193089962,
-0.23314549028873444,
0.008157328702509403,
0.13442958891391754,
0.02229282446205616,
0.11295226216316223,
0.04317887872457504,
-0.07977821677923203,
0.050624359399080276,
0.06734774261713028,
-0.06875105202198029,
-0.09300758689641953,
-0.03306182473897934,
0.032269205898046494,
-0.07390149682760239,
0.07426238805055618,
0.09455385059118271,
-0.08636654913425446,
-0.018383659422397614,
-0.027231808751821518,
-0.020058536902070045,
-0.14149905741214752,
0.21078816056251526,
0.03948479890823364,
0.10416863113641739,
-0.04209088906645775,
0.06955493986606598,
0.08596008270978928,
-0.05733555555343628,
0.024473827332258224,
0.13902930915355682,
-0.08277903497219086,
-0.04894765466451645,
0.1179840937256813,
0.22123567759990692,
-0.08474807441234589,
-0.08751369267702103,
-0.15949511528015137,
-0.05420275405049324,
0.002964226296171546,
0.1186913326382637,
0.11069254577159882,
0.09013663232326508,
-0.006514783948659897,
0.0031991491559892893,
-0.12087257951498032,
0.06630889326334,
0.07554890960454941,
0.07016141712665558,
-0.17544598877429962,
0.14047984778881073,
0.034536223858594894,
0.1467466652393341,
-0.019008886069059372,
0.036837104707956314,
-0.1346355825662613,
0.0689249262213707,
-0.06837072968482971,
0.05519101396203041,
0.029302991926670074,
0.015318615362048149,
0.01096048578619957,
0.028008168563246727,
-0.02564510889351368,
0.06715347617864609,
-0.06814488768577576,
-0.00378306838683784,
0.0017366806278005242,
0.04978260025382042,
-0.02607671543955803,
-0.029426122084259987,
0.012043727561831474,
-0.06611280143260956,
0.08563901484012604,
-0.052175380289554596,
-0.04878411069512367,
0.02675713039934635,
-0.048625148832798004,
0.06436491012573242,
-0.001204608241096139,
0.0372772216796875,
0.00012169327965239063,
-0.009757531806826591,
0.03752315044403076,
-0.004790544044226408,
0.03311355039477348,
0.008062060922384262,
0.10728558897972107,
-0.11479181796312332,
-0.07604702562093735,
-0.07721613347530365,
-0.0797928124666214,
-0.061063673347234726,
0.07948075234889984,
0.01472540944814682,
0.12262186408042908,
0.12553581595420837,
-0.05242402106523514,
0.028902804479002953,
-0.10547973215579987,
-0.031027453020215034,
0.06262935698032379,
-0.028014495968818665,
-0.11460991948843002,
-0.08479137718677521,
0.05841584876179695,
-0.03157841041684151,
0.07685748487710953,
0.01018171664327383,
0.029516233131289482,
-0.01703825034201145,
-0.038176774978637695,
-0.028246082365512848,
-0.005631429608911276,
0.21369647979736328,
-0.11260507255792618,
0.03876890614628792,
-0.01687087118625641,
0.029445743188261986,
0.019026311114430428,
0.14118196070194244,
0.1605495810508728,
0.21411727368831635,
-0.009646840393543243,
0.09233809262514114,
0.00340630323626101,
0.02723860926926136,
-0.11855699121952057,
-0.009557499550282955,
0.04654265567660332,
0.050935566425323486,
-0.060852210968732834,
0.18630611896514893,
0.11490515619516373,
-0.09008892625570297,
0.06119876354932785,
-0.01631079986691475,
-0.08768745511770248,
-0.04663201421499252,
0.01133981067687273,
-0.06949324160814285,
-0.14486026763916016,
0.023401958867907524,
-0.08458391577005386,
-0.03626955300569534,
0.09880473464727402,
0.02939571812748909,
-0.04833073914051056,
0.15461023151874542,
0.0061088246293365955,
-0.026743220165371895,
0.04492419213056564,
-0.025484761223196983,
0.008612739853560925,
0.04991384595632553,
0.017342036589980125,
0.04224872961640358,
-0.05211436748504639,
0.06933121383190155,
0.0016862457850947976,
-0.03780039772391319,
0.018009847030043602,
0.016928577795624733,
-0.03785461187362671,
-0.021620307117700577,
0.015839016065001488,
0.07684409618377686,
0.13304580748081207,
0.008973180316388607,
-0.02974424883723259,
-0.04916786774992943,
0.17510688304901123,
-0.05921100080013275,
-0.058629631996154785,
-0.09175360947847366,
0.21065612137317657,
0.03498167172074318,
0.010975648649036884,
0.01588292047381401,
-0.09525749832391739,
-0.03227877989411354,
0.21734008193016052,
0.08744356036186218,
0.01212245598435402,
-0.044797807931900024,
0.004659832920879126,
0.006143931765109301,
-0.06299285590648651,
0.18346212804317474,
0.03675176203250885,
0.15232355892658234,
-0.031192904338240623,
-0.03633669763803482,
-0.0646427795290947,
-0.019423414021730423,
-0.008665299043059349,
0.08297803997993469,
0.007593102287501097,
-0.00007478840416297317,
-0.06122911721467972,
0.05999215319752693,
-0.03295639529824257,
-0.156979039311409,
0.11471813172101974,
-0.09748680144548416,
-0.08709131926298141,
0.0036775795742869377,
0.11830584704875946,
-0.033661797642707825,
0.07651124149560928,
-0.049660783261060715,
0.04543079435825348,
-0.002326450077816844,
-0.023962508887052536,
-0.08205030858516693,
-0.12427936494350433,
0.08364046365022659,
-0.04951219633221626,
0.15240897238254547,
-0.019832447171211243,
0.16779665648937225,
0.11597487330436707,
-0.001365993870422244,
-0.07865238934755325,
0.0696575865149498,
0.03721487522125244,
0.032939061522483826,
0.07564380019903183,
0.12744887173175812,
-0.03793856501579285,
0.16593992710113525,
-0.026764966547489166,
-0.08937935531139374,
-0.02633081004023552,
0.015552590601146221,
0.009545852430164814,
-0.17885446548461914,
0.02252156101167202,
-0.09233272820711136,
0.12151230126619339,
0.18902349472045898,
-0.05914776027202606,
0.00692686578258872,
-0.0686013400554657,
0.07144785672426224,
0.012273168191313744,
0.06349401921033859,
-0.04257172346115112,
-0.1939169466495514,
0.01815830171108246,
-0.023560816422104836,
0.011286759749054909,
-0.25465577840805054,
-0.031863968819379807,
-0.04195649176836014,
-0.03553304821252823,
-0.08091439306735992,
0.1277943104505539,
0.11451154947280884,
0.01947171613574028,
-0.044551681727170944,
-0.1764240860939026,
-0.07199840992689133,
0.07610602676868439,
-0.13283681869506836,
-0.15629564225673676
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the ruby function/method.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/ruby/large_model.ipynb).
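Since the model works best on tokenized input, raw ruby source should be spaced out in the same style as the example above before being passed to the pipeline. Below is a minimal sketch of one such heuristic; it is an assumption for illustration, not the preprocessing actually used to build the training corpus:
```python
import re

def rough_tokenize_ruby(source: str) -> str:
    # Heuristic only: separate identifiers, numbers, and punctuation with
    # spaces, mimicking the spaced-out style of the example above.
    # This is an assumption, not the CodeSearchNet preprocessing.
    tokens = re.findall(r"[A-Za-z_][A-Za-z0-9_]*[?!]?|\d+|[^\sA-Za-z0-9_]", source)
    return " ".join(tokens)

raw = "def add(a, b)\n  a + b\nend"
print(rough_tokenize_ruby(raw))
# => "def add ( a , b ) a + b end"
```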
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
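For reference, an inverse square root schedule holds the learning rate constant through warm-up and then decays it proportionally to 1/sqrt(step). A minimal sketch, assuming an illustrative peak rate and warm-up length (the card does not state the actual values):
```python
def inverse_sqrt_lr(step: int, peak_lr: float = 0.01, warmup_steps: int = 10_000) -> float:
    # Constant during warm-up, then decays as 1/sqrt(step).
    # peak_lr and warmup_steps are illustrative assumptions.
    return peak_lr * min(1.0, (warmup_steps / max(step, 1)) ** 0.5)

for step in (1_000, 10_000, 40_000, 250_000):
    print(step, round(inverse_sqrt_lr(step), 5))
```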
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing ruby code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
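Scores like those in the table are obtained by comparing generated docstrings against reference docstrings with corpus-level BLEU. A minimal sketch using sacrebleu, assuming it is installed (`pip install sacrebleu`); the two strings are placeholders, not corpus data:
```python
import sacrebleu

# Placeholder strings, not corpus data.
hypotheses = ["writes a log message if severity is at or above the level"]
references = [["write a message to the log if severity is at least the configured level"]]

score = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {score.score:.2f}")
```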
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_ruby_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08953233063220978,
0.07327625155448914,
-0.0021869547199457884,
0.127711221575737,
0.051502954214811325,
0.02193749137222767,
0.04921646788716316,
0.11496589332818985,
-0.037982795387506485,
0.06875196099281311,
0.061658184975385666,
-0.04380218684673309,
0.04829785227775574,
0.15776224434375763,
0.017640599980950356,
-0.15554210543632507,
-0.045057572424411774,
0.013169865123927593,
-0.07569819688796997,
0.1047927513718605,
0.08444325625896454,
-0.07777979969978333,
0.07223953306674957,
-0.05081033334136009,
-0.1411772519350052,
0.043864328414201736,
-0.024600407108664513,
0.0009313329355791211,
0.09877564013004303,
0.04886409267783165,
0.10878810286521912,
-0.018178649246692657,
0.05817160755395889,
-0.20863699913024902,
0.0026901536621153355,
0.026968171820044518,
0.06566661596298218,
0.037538234144449234,
0.07518620043992996,
0.07301810383796692,
0.10265753418207169,
-0.026611898094415665,
0.04593522846698761,
0.06279993802309036,
-0.06302638351917267,
-0.07252753525972366,
-0.08989223092794418,
0.0914304330945015,
0.06775107234716415,
0.08336988091468811,
-0.006549705285578966,
0.022176304832100868,
-0.07148046046495438,
0.08926532417535782,
0.1408882737159729,
-0.24836906790733337,
-0.021535305306315422,
0.11625640094280243,
0.07045452296733856,
0.05113653093576431,
-0.06885954737663269,
-0.02876809611916542,
0.09967243671417236,
0.044454313814640045,
0.035995904356241226,
-0.09571173042058945,
-0.030516192317008972,
-0.0012157905148342252,
-0.058007191866636276,
-0.05227960646152496,
0.12188215553760529,
0.03264877572655678,
-0.06411635875701904,
-0.10726926475763321,
-0.045468270778656006,
-0.1967257708311081,
0.038997016847133636,
0.017041178420186043,
0.0131001565605402,
-0.0021772137843072414,
0.007122998125851154,
-0.02416078932583332,
-0.08297104388475418,
-0.12068517506122589,
0.04796682298183441,
0.027341295033693314,
0.07035797089338303,
0.04341721162199974,
-0.02172180637717247,
0.08717050403356552,
0.021098731085658073,
-0.040393173694610596,
-0.01979711279273033,
0.005360555835068226,
-0.1146775558590889,
0.031226936727762222,
-0.010628730058670044,
-0.03926241025328636,
0.011104132980108261,
0.0923173800110817,
-0.09419559687376022,
0.08687296509742737,
0.08438695967197418,
0.023245908319950104,
-0.00395679846405983,
0.20652741193771362,
0.06227429583668709,
-0.14011603593826294,
0.02411670610308647,
0.015431673265993595,
-0.002866300055757165,
0.009622125886380672,
-0.06012898311018944,
-0.05754859000444412,
0.03716709092259407,
0.07137562334537506,
-0.11749795079231262,
0.006693542934954166,
-0.06163371354341507,
-0.011556107550859451,
0.08830522745847702,
-0.1080293282866478,
0.04378150403499603,
0.018075676634907722,
-0.058257367461919785,
-0.04311513528227806,
0.057170066982507706,
-0.11421925574541092,
-0.11238107085227966,
0.039246637374162674,
-0.02846030332148075,
-0.02408403903245926,
-0.10973828285932541,
-0.10605931282043457,
-0.013261513784527779,
-0.06309260427951813,
0.009760314598679543,
-0.10644116997718811,
-0.10121305286884308,
-0.029183654114603996,
0.032026201486587524,
-0.018733737990260124,
-0.041144296526908875,
-0.03790942206978798,
0.011704148724675179,
-0.008909435011446476,
-0.015535985119640827,
0.020137213170528412,
-0.022698329761624336,
0.0907769724726677,
0.08076610416173935,
0.03433062508702278,
0.0021781413815915585,
0.022214222699403763,
-0.07896463572978973,
0.07778415083885193,
-0.09708268195390701,
0.07505348324775696,
-0.007641254458576441,
0.06028695032000542,
-0.10736767202615738,
-0.08046268671751022,
0.01797596737742424,
0.04529012739658356,
0.06798673421144485,
0.03953845053911209,
-0.13678552210330963,
0.029459470883011818,
0.16096250712871552,
-0.10710886865854263,
-0.11522451043128967,
0.1264798641204834,
-0.01094450056552887,
0.0019578703213483095,
0.0679452121257782,
0.1404307782649994,
0.14437668025493622,
-0.06782867759466171,
-0.037261586636304855,
0.06483614444732666,
0.05245869234204292,
-0.06241978332400322,
0.07756523042917252,
0.020474527031183243,
-0.007470983080565929,
0.02546127699315548,
0.061238836497068405,
0.0334310419857502,
0.002677851589396596,
-0.034960608929395676,
-0.04530293866991997,
-0.09355965256690979,
-0.01297706738114357,
-0.012007750570774078,
0.029137808829545975,
-0.06047852337360382,
-0.06801401078701019,
-0.030967416241765022,
0.17412593960762024,
-0.0804930180311203,
0.032533057034015656,
-0.08115825057029724,
-0.029167287051677704,
-0.05058138445019722,
0.01815914176404476,
-0.13374978303909302,
0.0643942803144455,
0.07263100147247314,
-0.005680982489138842,
0.06537371128797531,
0.09335765242576599,
0.012011679820716381,
0.02633342146873474,
-0.0622662715613842,
-0.052733294665813446,
-0.049073126167058945,
-0.08112359791994095,
-0.11226662993431091,
-0.03255300596356392,
-0.08805195242166519,
-0.034739330410957336,
-0.02890469878911972,
-0.18362994492053986,
0.002119175624102354,
0.009458948858082294,
0.0360289104282856,
0.03844762220978737,
-0.04081385210156441,
0.024542609229683876,
0.05398184433579445,
-0.04180343076586723,
-0.07859262824058533,
0.02816588059067726,
0.04250916838645935,
-0.07158907502889633,
-0.0450458899140358,
-0.06330002844333649,
-0.07376877963542938,
0.06147165223956108,
0.09823296964168549,
-0.1126355528831482,
-0.015475459396839142,
-0.019603027030825615,
-0.03862309083342552,
-0.048093050718307495,
-0.045055076479911804,
0.20367373526096344,
0.019244756549596786,
0.16265302896499634,
-0.1317521631717682,
-0.04798240214586258,
-0.025379925966262817,
-0.0053249322809278965,
0.04168649762868881,
0.16029198467731476,
-0.007936670444905758,
-0.11771494150161743,
0.034694839268922806,
-0.054094284772872925,
-0.06060997396707535,
0.14142915606498718,
-0.017817629501223564,
-0.05527767166495323,
-0.00801795907318592,
0.11675380170345306,
0.005488659255206585,
0.1490054577589035,
-0.04249338060617447,
0.0015166044468060136,
-0.0069260308519005775,
0.012950708158314228,
0.03407808393239975,
-0.13711172342300415,
0.027109939604997635,
0.03868066519498825,
-0.06202661618590355,
-0.021027186885476112,
-0.024833494797348976,
-0.039676323533058167,
0.03915124759078026,
0.022367579862475395,
0.03123677335679531,
-0.01442513708025217,
-0.02679711952805519,
-0.10030452907085419,
0.17150241136550903,
-0.07505921274423599,
-0.1870260089635849,
-0.1639523059129715,
0.07191967219114304,
-0.030878733843564987,
-0.01927821710705757,
0.039411261677742004,
-0.10944714397192001,
-0.03361335024237633,
-0.09038934856653214,
0.11656953394412994,
-0.1332118958234787,
0.004586163442581892,
-0.02492273971438408,
0.07795866578817368,
0.04943705350160599,
-0.1494244486093521,
0.02376154623925686,
-0.016398433595895767,
0.013478636741638184,
-0.0004850872210226953,
-0.046070586889982224,
0.0640861839056015,
0.10932064056396484,
-0.06522900611162186,
0.02008083648979664,
-0.004652749747037888,
0.17338697612285614,
-0.04169644042849541,
0.027922989800572395,
0.20341144502162933,
0.0075680878944695,
0.04287386313080788,
0.06165611371397972,
0.017156150192022324,
-0.08368680626153946,
0.0600496381521225,
0.04351721704006195,
-0.0314808189868927,
-0.24151216447353363,
-0.01595240831375122,
-0.07477526366710663,
0.07515808194875717,
0.11745679378509521,
0.06280364096164703,
-0.1585097461938858,
0.022911934182047844,
-0.0016390759265050292,
0.15011759102344513,
-0.035523075610399246,
0.060791999101638794,
-0.0057310499250888824,
0.017262909561395645,
0.004674152471125126,
-0.10010422021150589,
0.004828691482543945,
0.08066683262586594,
0.11295788735151291,
0.2136872112751007,
-0.08797106891870499,
0.15057314932346344,
0.011189762502908707,
0.11388202011585236,
0.05473138764500618,
0.07997261732816696,
-0.12754623591899872,
0.007105052471160889,
-0.002351256087422371,
-0.01329846028238535,
-0.07128432393074036,
0.05402383208274841,
-0.018058879300951958,
0.054297130554914474,
-0.05115886032581329,
0.02243085578083992,
0.015547000803053379,
0.214668869972229,
0.08787811547517776,
-0.15188726782798767,
-0.11454035341739655,
0.011734366416931152,
-0.08661261200904846,
-0.09798508882522583,
0.06640689074993134,
0.19011595845222473,
-0.07068026065826416,
0.021167583763599396,
-0.025147248059511185,
0.13245950639247894,
-0.12285678088665009,
-0.0191793292760849,
0.022768434137105942,
0.053165510296821594,
0.011957386508584023,
0.11231887340545654,
-0.26166465878486633,
0.07675941288471222,
0.015452966094017029,
0.0841602236032486,
-0.0075745964422822,
0.06551670283079147,
-0.04275435209274292,
0.002695600502192974,
0.07812957465648651,
0.009942607954144478,
-0.08307009935379028,
-0.1982623189687729,
-0.036052726209163666,
0.011244846507906914,
0.06963223218917847,
-0.011748774908483028,
0.09013988822698593,
-0.03295397758483887,
0.04552562162280083,
-0.04007826745510101,
-0.1337222456932068,
-0.06711306422948837,
-0.13292045891284943,
-0.035306863486766815,
-0.02337348461151123,
-0.06750086694955826,
-0.02922249212861061,
0.04511984437704086,
0.05723651126027107,
0.2082115262746811,
-0.15331299602985382,
-0.07707121968269348,
-0.08824850618839264,
0.06395865231752396,
0.11522584408521652,
-0.0985453724861145,
0.020860852673649788,
0.011021113954484463,
0.04914625734090805,
-0.04318608343601227,
-0.06040928140282631,
0.016039269044995308,
-0.05178651213645935,
-0.10682342946529388,
-0.03429779037833214,
0.10183706134557724,
-0.03120247833430767,
0.05251637473702431,
0.001779451733455062,
-0.06163515895605087,
-0.037154827266931534,
-0.115791454911232,
-0.05031720921397209,
-0.03511704131960869,
0.027659472078084946,
0.00023896693892311305,
-0.1002940684556961,
0.09340761601924896,
-0.017092067748308182,
-0.09364169836044312,
0.10128721594810486,
0.20834918320178986,
-0.0844993144273758,
0.03303459286689758,
0.07026734203100204,
-0.05122258886694908,
-0.18624477088451385,
-0.05376426875591278,
0.05579449236392975,
0.06986497342586517,
-0.011634827591478825,
-0.15669791400432587,
0.04033445566892624,
-0.009403401985764503,
0.008233398199081421,
-0.00164587062317878,
-0.26446476578712463,
-0.11895275115966797,
-0.011737463995814323,
0.069108746945858,
0.06596807390451431,
-0.0983894094824791,
-0.05370141938328743,
-0.0635996088385582,
-0.029398471117019653,
0.02925335243344307,
0.06789351999759674,
0.11317422240972519,
-0.045322220772504807,
0.018443873152136803,
0.04657094180583954,
-0.022831616923213005,
0.06370877474546432,
-0.04327073320746422,
0.10125745087862015,
-0.008182965219020844,
-0.011208211071789265,
0.032201774418354034,
-0.06279122829437256,
0.15401151776313782,
-0.18558438122272491,
0.10960161685943604,
-0.18223653733730316,
-0.042119357734918594,
-0.006178941577672958,
-0.022991502657532692,
-0.03411078080534935,
-0.048246681690216064,
-0.11362289637327194,
0.030743345618247986,
0.04262499138712883,
-0.025550438091158867,
0.04224352166056633,
-0.007778675761073828,
-0.06323415040969849,
0.1186150312423706,
0.057464491575956345,
0.042313531041145325,
-0.14501869678497314,
0.020145738497376442,
0.017255788668990135,
0.07921773940324783,
-0.20381681621074677,
0.024073513224720955,
0.1027270033955574,
0.022379353642463684,
0.09767596423625946,
0.006254789885133505,
-0.08946989476680756,
0.041133634746074677,
0.06638478487730026,
-0.05601233243942261,
-0.125308558344841,
-0.01626889780163765,
-0.01777837797999382,
-0.09295325726270676,
0.035502489656209946,
0.10324820876121521,
-0.06167345494031906,
-0.025736508890986443,
-0.0040110195986926556,
0.020156703889369965,
-0.08225040137767792,
0.18517227470874786,
0.024265529587864876,
0.08464377373456955,
-0.06161678954958916,
0.08235033601522446,
0.09626172482967377,
-0.06894146651029587,
0.024322841316461563,
0.15832297503948212,
-0.07832109928131104,
-0.030667226761579514,
0.0865660011768341,
0.12279070168733597,
-0.003862170036882162,
-0.047645408660173416,
-0.11219830065965652,
-0.0708395391702652,
0.017137648537755013,
-0.012678862549364567,
0.0787094384431839,
0.07640755921602249,
-0.03960300236940384,
-0.004307781811803579,
-0.1082875207066536,
0.09807782620191574,
0.08497379720211029,
0.053382258862257004,
-0.16756974160671234,
0.11588890105485916,
0.03200191259384155,
0.089556485414505,
0.005560068413615227,
0.03405274078249931,
-0.10842764377593994,
0.036433037370443344,
-0.029312118887901306,
0.051876913756132126,
0.02696610987186432,
0.05244242399930954,
-0.037929434329271317,
0.042721688747406006,
-0.027825355529785156,
0.04690397158265114,
-0.04680277407169342,
-0.03127282112836838,
-0.03862440213561058,
0.03432361036539078,
-0.05098375678062439,
-0.026377631351351738,
0.01514117419719696,
-0.07997985929250717,
0.10383229702711105,
-0.06425663083791733,
-0.015415998175740242,
0.0013448791578412056,
0.03752917796373367,
0.05751274153590202,
0.016517065465450287,
0.04687939211726189,
-0.021309401839971542,
-0.002670299494639039,
0.028801996260881424,
-0.0033692645374685526,
-0.008784998208284378,
0.0009265356347896159,
0.10697609186172485,
-0.13897529244422913,
-0.08212994784116745,
-0.10963990539312363,
-0.08208513259887695,
-0.06269218027591705,
0.08437500894069672,
0.07473472505807877,
0.099211186170578,
0.10251964628696442,
-0.040403492748737335,
0.02063026838004589,
-0.14371775090694427,
-0.03584900498390198,
0.05187438800930977,
-0.01687769591808319,
-0.13161341845989227,
-0.051825542002916336,
0.05496426671743393,
-0.023981790989637375,
0.10521509498357773,
0.0009049462387338281,
0.027213474735617638,
-0.017303025349974632,
-0.04419755935668945,
-0.05810093507170677,
0.006114371120929718,
0.15779654681682587,
-0.10358886420726776,
0.003944334574043751,
-0.009094302542507648,
-0.0007121398812159896,
0.024659128859639168,
0.19183452427387238,
0.1060413271188736,
0.1756087839603424,
0.0514429546892643,
0.06868351995944977,
-0.043722521513700485,
-0.015137679874897003,
-0.13764497637748718,
0.08634360134601593,
-0.026401551440358162,
0.03137835115194321,
-0.06211693584918976,
0.18542347848415375,
0.0995832309126854,
-0.13175034523010254,
0.10227590054273605,
0.005611996632069349,
-0.08447493612766266,
-0.051398247480392456,
-0.07343081384897232,
-0.055750951170921326,
-0.12425841391086578,
0.006628108210861683,
-0.0890888124704361,
0.011123945005238056,
0.08034885674715042,
0.021042095497250557,
-0.023352552205324173,
0.11241337656974792,
-0.024229397997260094,
-0.04468429088592529,
0.04039005562663078,
0.03223424777388573,
0.01650208793580532,
0.1219014897942543,
0.018618395552039146,
0.06153261289000511,
-0.07167056947946548,
0.08213706314563751,
0.032476022839546204,
-0.012669721618294716,
0.0053296517580747604,
0.01968567445874214,
-0.01962890289723873,
-0.041480597108602524,
-0.005320066586136818,
0.08129574358463287,
0.17281711101531982,
0.04342462494969368,
-0.04227697104215622,
-0.05287507548928261,
0.21397727727890015,
-0.056453343480825424,
-0.04955388605594635,
-0.11048649251461029,
0.15310271084308624,
0.03936344385147095,
0.021431295201182365,
0.021738028153777122,
-0.07848721742630005,
-0.027341710403561592,
0.22051101922988892,
0.06441689282655716,
-0.03807951509952545,
-0.026653824374079704,
0.012215442024171352,
-0.005642877891659737,
-0.04218343645334244,
0.15512323379516602,
-0.0017038281075656414,
0.20254088938236237,
0.0025234816130250692,
0.001975459046661854,
-0.033204030245542526,
-0.0477462001144886,
-0.021943753585219383,
0.18913383781909943,
-0.04043622314929962,
0.03143720701336861,
-0.10138443112373352,
-0.006186546292155981,
0.010406754910945892,
-0.1438586413860321,
0.12603303790092468,
-0.13621363043785095,
-0.08499466627836227,
0.02371940203011036,
0.07359959930181503,
-0.03860955685377121,
0.06414110213518143,
-0.020453037694096565,
0.061403557658195496,
0.029542753472924232,
-0.03279988840222359,
-0.0962589681148529,
-0.1353517323732376,
0.05093366652727127,
-0.015397278591990471,
0.13052205741405487,
0.007581683341413736,
0.09104008972644806,
0.09512700885534286,
0.003813359886407852,
-0.09041428565979004,
0.06832710653543472,
0.026219086721539497,
-0.016647854819893837,
0.05332161486148834,
0.14011503756046295,
-0.04557637870311737,
0.13157851994037628,
0.023065539076924324,
-0.020877137780189514,
-0.030413130298256874,
-0.007901265285909176,
0.0007159123779274523,
-0.1658044010400772,
0.013146664947271347,
-0.06910115480422974,
0.12426971644163132,
0.1927815079689026,
-0.04767422378063202,
-0.012876522727310658,
-0.04436406493186951,
0.07164863497018814,
-0.008219791576266289,
0.09229029715061188,
0.004382167477160692,
-0.16693277657032013,
0.013496101833879948,
0.015210204757750034,
0.014315379783511162,
-0.1998635083436966,
-0.06386358290910721,
-0.03536481782793999,
-0.021234216168522835,
-0.10583174973726273,
0.1602933555841446,
0.07157155871391296,
0.020054461434483528,
-0.03046766109764576,
-0.20995360612869263,
-0.03270156309008598,
0.04308011382818222,
-0.11407040804624557,
-0.11806727945804596
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the ruby function/method.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_code_documentation_generation_ruby_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/ruby/large_model.ipynb).
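The pipeline wrapper above can also be bypassed by calling the model directly; the beam-search settings below are illustrative assumptions, not the decoding parameters used for the reported results:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

model_name = "SEBIS/code_trans_t5_large_code_documentation_generation_ruby_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
inputs = tokenizer(tokenized_code, return_tensors="pt")
# num_beams and max_length are illustrative assumptions.
output_ids = model.generate(inputs["input_ids"], max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```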
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing ruby code.
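A comparable fine-tuning run can be set up on GPU with the Hugging Face `Seq2SeqTrainer`; the sketch below is a scaled-down assumption (toy dataset, small batch, few steps), not the original TPU training script:
```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelWithLMHead, DataCollatorForSeq2Seq,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

model_name = "SEBIS/code_trans_t5_large_code_documentation_generation_ruby_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

# Tiny illustrative dataset: a single (function, docstring) pair stands in
# for the ruby portion of the fine-tuning corpus.
raw = Dataset.from_dict({
    "code": ["def add ( a , b ) a + b end"],
    "doc": ["Add two numbers ."],
})

def preprocess(batch):
    model_inputs = tokenizer(batch["code"], truncation=True, max_length=512)
    labels = tokenizer(batch["doc"], truncation=True, max_length=64)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_dataset = raw.map(preprocess, batched=True, remove_columns=["code", "doc"])

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-ruby-finetune",
    max_steps=10,                   # assumption: toy value; the card's run used 5,000 steps
    per_device_train_batch_size=1,  # assumption: scaled down from batch size 256
    learning_rate=1e-4,             # assumption; the card does not state the fine-tuning LR
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```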
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_large_code_documentation_generation_ruby_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09936226159334183,
0.05641235038638115,
-0.0012303452240303159,
0.11936061084270477,
0.047170497477054596,
0.024878187105059624,
0.046666838228702545,
0.11062204092741013,
-0.05556945875287056,
0.06041008606553078,
0.053351227194070816,
-0.04571954160928726,
0.06503932178020477,
0.16854943335056305,
0.00802221056073904,
-0.14220724999904633,
-0.04534033313393593,
0.02903488092124462,
-0.07521350681781769,
0.10946819931268692,
0.08290393650531769,
-0.08898895978927612,
0.0774410143494606,
-0.053098034113645554,
-0.13558784127235413,
0.044824909418821335,
-0.025680378079414368,
-0.004750253167003393,
0.09829708188772202,
0.06543570756912231,
0.12051327526569366,
-0.0183763075619936,
0.06649023294448853,
-0.21021594107151031,
0.0026993160136044025,
0.0284065380692482,
0.06611230224370956,
0.04621221497654915,
0.07146686315536499,
0.08100123703479767,
0.09845238924026489,
-0.028735361993312836,
0.042877428233623505,
0.05653927102684975,
-0.06351069360971451,
-0.05565549060702324,
-0.09166644513607025,
0.09804876148700714,
0.06895830482244492,
0.09015268832445145,
-0.004725164268165827,
0.05249916762113571,
-0.06361610442399979,
0.09371735155582428,
0.14171835780143738,
-0.25222891569137573,
-0.02504236437380314,
0.11889320611953735,
0.0763305127620697,
0.054205913096666336,
-0.07046877592802048,
-0.033204518258571625,
0.10580172389745712,
0.04162762686610222,
0.044316958636045456,
-0.08965497463941574,
-0.01926514506340027,
-0.0034559250343590975,
-0.060395874083042145,
-0.0492364726960659,
0.15850576758384705,
0.03380798175930977,
-0.06230833753943443,
-0.10194295644760132,
-0.03881020471453667,
-0.21005946397781372,
0.03360280394554138,
0.009557933546602726,
0.00960632599890232,
-0.007089819759130478,
-0.005362380761653185,
-0.00711815943941474,
-0.08148020505905151,
-0.1223042756319046,
0.038759052753448486,
0.024361595511436462,
0.06691979616880417,
0.03900795057415962,
-0.04359001666307449,
0.08604471385478973,
0.04828552156686783,
-0.03499385714530945,
-0.012491796165704727,
0.00019605027046054602,
-0.10999728739261627,
0.011917022988200188,
-0.013044701889157295,
-0.060993123799562454,
-0.004089429974555969,
0.0844244435429573,
-0.08806219696998596,
0.08650526404380798,
0.08610757440328598,
0.025031138211488724,
0.007341524586081505,
0.2089005410671234,
0.05837723985314369,
-0.1547912061214447,
0.025628743693232536,
0.0178474560379982,
-0.011109299026429653,
0.01568843238055706,
-0.04828409478068352,
-0.054539378732442856,
0.042686887085437775,
0.06678321957588196,
-0.12113670259714127,
0.024708136916160583,
-0.056972015649080276,
-0.014452122151851654,
0.08092552423477173,
-0.11486660689115524,
0.034760184586048126,
0.012341764755547047,
-0.061733677983284,
-0.03710499405860901,
0.06705182790756226,
-0.12220194190740585,
-0.11438677459955215,
0.036946021020412445,
-0.03168391436338425,
-0.0347253754734993,
-0.12048088759183884,
-0.114047110080719,
-0.021802974864840508,
-0.03202614188194275,
0.0029047203715890646,
-0.10507475584745407,
-0.10171059519052505,
-0.023503603413701057,
0.03513488546013832,
-0.008596819825470448,
-0.03366803005337715,
-0.03878689184784889,
0.010074751451611519,
-0.009152999147772789,
-0.01308218389749527,
0.01892983727157116,
-0.019698889926075935,
0.09528177976608276,
0.08165764063596725,
0.03477208688855171,
-0.001698188134469092,
0.021610695868730545,
-0.08167248219251633,
0.06985446065664291,
-0.10159610211849213,
0.07346245646476746,
-0.00761814508587122,
0.05797778442502022,
-0.11263089627027512,
-0.0805211216211319,
0.005610621068626642,
0.04644623398780823,
0.0793319046497345,
0.04529058188199997,
-0.15865223109722137,
0.03512437269091606,
0.15840359032154083,
-0.1131955161690712,
-0.1302087903022766,
0.11285987496376038,
-0.01659328117966652,
0.01706082746386528,
0.06892293691635132,
0.1261148750782013,
0.14247672259807587,
-0.06853767484426498,
-0.042340345680713654,
0.06678453087806702,
0.045951396226882935,
-0.06514965742826462,
0.06668440252542496,
0.0279526486992836,
-0.03649776428937912,
0.02114962972700596,
0.06270574778318405,
0.03970666229724884,
-0.006544079631567001,
-0.03433234617114067,
-0.0398654080927372,
-0.10009093582630157,
-0.03213623911142349,
-0.010193102061748505,
0.03333338722586632,
-0.051892489194869995,
-0.053710728883743286,
-0.04095127433538437,
0.16731292009353638,
-0.08437727391719818,
0.029512479901313782,
-0.08498706668615341,
-0.03797515481710434,
-0.05256868526339531,
0.01844291388988495,
-0.13729162514209747,
0.06577662378549576,
0.07225583493709564,
-0.003493786556646228,
0.06365139782428741,
0.08197034895420074,
0.009896575473248959,
0.013492058962583542,
-0.0633823350071907,
-0.049256831407547,
-0.03324412927031517,
-0.08105295896530151,
-0.11283041536808014,
-0.03786129876971245,
-0.08849934488534927,
-0.030981779098510742,
-0.04695663973689079,
-0.18298451602458954,
-0.0011558285914361477,
-0.00223827944137156,
0.03310509771108627,
0.04108845070004463,
-0.03667394071817398,
0.02046641707420349,
0.05348319932818413,
-0.04327063634991646,
-0.07565683871507645,
0.02062235400080681,
0.046286098659038544,
-0.08124437183141708,
-0.04995938017964363,
-0.062015168368816376,
-0.08521202951669693,
0.07152944058179855,
0.10110602527856827,
-0.1330358237028122,
-0.024488838389515877,
-0.022534657269716263,
-0.03447730094194412,
-0.04205573722720146,
-0.03983701393008232,
0.20293095707893372,
0.017955714836716652,
0.15941840410232544,
-0.13371075689792633,
-0.055998794734478,
-0.027713323011994362,
0.005750682670623064,
0.04177074134349823,
0.15259015560150146,
0.014493304304778576,
-0.13105608522891998,
0.028761757537722588,
-0.06448118388652802,
-0.0542624294757843,
0.1373520791530609,
-0.020816536620259285,
-0.048798978328704834,
-0.0020677740685641766,
0.10859637707471848,
0.010195179842412472,
0.17482790350914001,
-0.020373322069644928,
0.0023225515615195036,
-0.0080244280397892,
0.004003453534096479,
0.03778442367911339,
-0.13001346588134766,
0.03215113282203674,
0.038801733404397964,
-0.04771902784705162,
-0.022997500374913216,
-0.028570547699928284,
-0.04052296280860901,
0.04208940640091896,
0.01668996922671795,
0.03983866050839424,
-0.020832741633057594,
-0.031248481944203377,
-0.10385989397764206,
0.17565588653087616,
-0.07307795435190201,
-0.18664826452732086,
-0.1558961719274521,
0.08999177068471909,
-0.01808512769639492,
-0.0231351125985384,
0.029035314917564392,
-0.10057391971349716,
-0.04247410595417023,
-0.09675228595733643,
0.11942175775766373,
-0.1278616040945053,
0.009094360284507275,
-0.022686876356601715,
0.06938688457012177,
0.04462626203894615,
-0.15652036666870117,
0.03068036213517189,
-0.021020837128162384,
0.020617665722966194,
0.00020593326189555228,
-0.0600280836224556,
0.07059620320796967,
0.10406826436519623,
-0.07300078123807907,
0.021510353311896324,
-0.014745093882083893,
0.18066193163394928,
-0.05204755440354347,
0.03795000538229942,
0.17753303050994873,
0.008966061286628246,
0.035116974264383316,
0.06207253411412239,
0.012495240196585655,
-0.08846742659807205,
0.06660827249288559,
0.033917978405952454,
-0.024139748886227608,
-0.2297814041376114,
-0.022302750498056412,
-0.0763893723487854,
0.07452549785375595,
0.11996205151081085,
0.0498974546790123,
-0.16245071589946747,
0.027295809239149094,
-0.0034677735529839993,
0.16597680747509003,
-0.03171613812446594,
0.06559979170560837,
-0.025627803057432175,
0.02312466688454151,
0.0037238472141325474,
-0.10490675270557404,
0.0028490640688687563,
0.07514496147632599,
0.10348735749721527,
0.21396061778068542,
-0.08895646780729294,
0.13878731429576874,
0.0017366142710670829,
0.1189340129494667,
0.059491608291864395,
0.10586266964673996,
-0.13347694277763367,
0.010283676907420158,
-0.0028917714953422546,
-0.012346711941063404,
-0.08163201063871384,
0.04608119651675224,
-0.03558846935629845,
0.06252233684062958,
-0.05387919768691063,
0.022102907299995422,
0.01625409536063671,
0.20570391416549683,
0.06790406256914139,
-0.15663935244083405,
-0.11689336597919464,
0.0012753086630254984,
-0.07882261276245117,
-0.09512222558259964,
0.0678698942065239,
0.18125340342521667,
-0.06384043395519257,
0.02127978764474392,
-0.025537529960274696,
0.13394439220428467,
-0.11693892627954483,
-0.02306411974132061,
0.021554013714194298,
0.05698710307478905,
0.0022382892202585936,
0.10683518648147583,
-0.2719099819660187,
0.08358103781938553,
0.01715879514813423,
0.08935663849115372,
-0.015406576916575432,
0.06040972098708153,
-0.04719291999936104,
0.003932262305170298,
0.07943087071180344,
0.010709601454436779,
-0.08176105469465256,
-0.19257201254367828,
-0.028964033350348473,
0.01713506691157818,
0.07124148309230804,
0.0006811736384406686,
0.09203432500362396,
-0.030980991199612617,
0.04816007241606712,
-0.03302575275301933,
-0.12675940990447998,
-0.0782308429479599,
-0.13665562868118286,
-0.040685977786779404,
-0.025786759331822395,
-0.06260242313146591,
-0.0291550625115633,
0.05221250280737877,
0.049210187047719955,
0.1997506469488144,
-0.16390080749988556,
-0.06331625580787659,
-0.08831710368394852,
0.06743840128183365,
0.1196301132440567,
-0.09270040690898895,
0.020021837204694748,
0.016477182507514954,
0.05408778414130211,
-0.04709239304065704,
-0.06932312250137329,
0.01985304430127144,
-0.05909200757741928,
-0.09192320704460144,
-0.0364355742931366,
0.09075719118118286,
-0.01902111805975437,
0.05346531793475151,
0.00813612062484026,
-0.07138296216726303,
-0.03767133504152298,
-0.1171361654996872,
-0.06357544660568237,
-0.0414666123688221,
0.020481325685977936,
0.0021490994840860367,
-0.11774635314941406,
0.05616786703467369,
-0.022966476157307625,
-0.09292061626911163,
0.09132636338472366,
0.18377752602100372,
-0.08299601078033447,
0.018540576100349426,
0.06041410565376282,
-0.055775970220565796,
-0.18852324783802032,
-0.0426984578371048,
0.053510475903749466,
0.06962116807699203,
-0.013897119089961052,
-0.15156559646129608,
0.047231417149305344,
-0.025711631402373314,
0.0162627175450325,
-0.014416852965950966,
-0.24703972041606903,
-0.11833139508962631,
-0.003083133604377508,
0.072543203830719,
0.048267245292663574,
-0.09263744205236435,
-0.04960327968001366,
-0.06671229004859924,
-0.025773942470550537,
0.043276309967041016,
0.08010008186101913,
0.10719089955091476,
-0.03955037519335747,
0.017834732308983803,
0.04685383290052414,
-0.026948831975460052,
0.04291065037250519,
-0.03957942873239517,
0.11354217678308487,
-0.00643755029886961,
-0.016034677624702454,
0.03595571592450142,
-0.055721987038850784,
0.15669134259223938,
-0.1863052397966385,
0.11935614049434662,
-0.17758919298648834,
-0.0378175713121891,
-0.010837112553417683,
-0.017759235575795174,
-0.03626527264714241,
-0.04723555967211723,
-0.12368188053369522,
0.045498497784137726,
0.05690281465649605,
-0.02848738431930542,
0.03342922776937485,
-0.002117230324074626,
-0.06113114580512047,
0.08930689096450806,
0.07233580946922302,
0.04885700345039368,
-0.12478531152009964,
0.02802908420562744,
0.018010420724749565,
0.0885966569185257,
-0.18434281647205353,
0.022756999358534813,
0.10390997678041458,
0.022341223433613777,
0.09686607122421265,
0.011626003310084343,
-0.08830700069665909,
0.02441599778831005,
0.06746163219213486,
-0.05993802845478058,
-0.10172602534294128,
-0.013331201858818531,
0.010588474571704865,
-0.08734387159347534,
0.03885508328676224,
0.08943109959363937,
-0.06663717329502106,
-0.01936577819287777,
-0.0059590269811451435,
0.014049417339265347,
-0.07823298871517181,
0.1777644157409668,
0.019122019410133362,
0.08333364874124527,
-0.056509099900722504,
0.08161812275648117,
0.09552659839391708,
-0.07287599891424179,
0.028285833075642586,
0.14568597078323364,
-0.0837244838476181,
-0.021935945376753807,
0.11557365953922272,
0.14487217366695404,
-0.012831238098442554,
-0.04636311158537865,
-0.10506986081600189,
-0.07508993148803711,
0.0129086347296834,
0.022258246317505836,
0.07233264297246933,
0.07275372743606567,
-0.03286485746502876,
-0.006441221572458744,
-0.11803890764713287,
0.09508288651704788,
0.0836087241768837,
0.05239211022853851,
-0.14908190071582794,
0.13371844589710236,
0.03074834495782852,
0.0689023956656456,
0.00312375882640481,
0.041877631098032,
-0.11325492709875107,
0.03515153378248215,
-0.003197177778929472,
0.049596384167671204,
0.02419508807361126,
0.04988199099898338,
-0.037527669221162796,
0.05021083354949951,
-0.02751854807138443,
0.043727047741413116,
-0.043387118726968765,
-0.02420561946928501,
-0.034821026027202606,
0.02999294176697731,
-0.047078635543584824,
-0.020952560007572174,
0.014953727833926678,
-0.08192052692174911,
0.09118626266717911,
-0.06434939056634903,
-0.012249735184013844,
0.0011282479390501976,
0.03447386622428894,
0.05395421385765076,
0.00168329244479537,
0.05344266816973686,
-0.018457463011145592,
0.002967460546642542,
0.02421703189611435,
0.00598698016256094,
-0.0166595671325922,
-0.0010167709551751614,
0.10732374340295792,
-0.1369914710521698,
-0.07807335257530212,
-0.10076719522476196,
-0.06475132703781128,
-0.06131112575531006,
0.08752622455358505,
0.0759330466389656,
0.09083130955696106,
0.09619265794754028,
-0.0410291850566864,
0.013427878729999065,
-0.15081290900707245,
-0.03549240902066231,
0.05532175675034523,
-0.015468357130885124,
-0.13446027040481567,
-0.049784671515226364,
0.06311620026826859,
-0.026605794206261635,
0.11284057796001434,
0.002403859281912446,
0.012427386827766895,
-0.017919141799211502,
-0.048548780381679535,
-0.07214102149009705,
0.00399074936285615,
0.1761600524187088,
-0.10379743576049805,
0.0036703646183013916,
-0.007530563045293093,
0.005722720175981522,
0.01770690642297268,
0.1868162602186203,
0.12130331993103027,
0.16617421805858612,
0.03206793963909149,
0.06623253971338272,
-0.04724007472395897,
-0.027879957109689713,
-0.09565605968236923,
0.07266177982091904,
-0.027547914534807205,
0.030490899458527565,
-0.04979807510972023,
0.18826065957546234,
0.08310974389314651,
-0.1314050555229187,
0.1067531481385231,
-0.0008435228955931962,
-0.0873878225684166,
-0.038988709449768066,
-0.07257742434740067,
-0.04952123016119003,
-0.11525988578796387,
0.006195286754518747,
-0.0998905599117279,
-0.0030590782407671213,
0.0590045265853405,
0.02233981527388096,
-0.02265014871954918,
0.12733852863311768,
-0.04438666254281998,
-0.04786524176597595,
0.04516816511750221,
0.03784382343292236,
0.005978990811854601,
0.09373003244400024,
0.02029610425233841,
0.058362096548080444,
-0.07728967815637589,
0.07588838040828705,
0.03007001057267189,
-0.016352079808712006,
0.013364131562411785,
0.0449262373149395,
-0.015552341938018799,
-0.03850822150707245,
-0.020845795050263405,
0.08185960352420807,
0.17126484215259552,
0.035474829375743866,
-0.030722081661224365,
-0.05844901129603386,
0.20615284144878387,
-0.058502957224845886,
-0.05493541061878204,
-0.11461261659860611,
0.1560683250427246,
0.04200168699026108,
0.017486318945884705,
0.02317034639418125,
-0.08100900053977966,
-0.016602128744125366,
0.24160124361515045,
0.06596499681472778,
-0.045530080795288086,
-0.025459738448262215,
0.006653675809502602,
-0.004094691481441259,
-0.04367629066109657,
0.1464923918247223,
0.00648842565715313,
0.2003660798072815,
0.0028091799467802048,
0.005227000918239355,
-0.04291681945323944,
-0.04882633313536644,
0.00578042957931757,
0.19458559155464172,
-0.024568790569901466,
0.024675482884049416,
-0.10121024399995804,
-0.005930433515459299,
0.0034088159445673227,
-0.16540324687957764,
0.12309350818395615,
-0.14082071185112,
-0.0747184157371521,
0.013729900121688843,
0.06995774060487747,
-0.04402446001768112,
0.0582454577088356,
-0.01973172090947628,
0.07194805890321732,
0.025990759953856468,
-0.023797396570444107,
-0.09295473247766495,
-0.14465630054473877,
0.05145733430981636,
-0.0243088211864233,
0.12112953513860703,
0.011388091370463371,
0.09177181869745255,
0.08972779661417007,
0.00518985278904438,
-0.08559349924325943,
0.06722203642129898,
0.023477422073483467,
-0.006110933609306812,
0.04840518534183502,
0.13507120311260223,
-0.04501114785671234,
0.1566072404384613,
0.012774970382452011,
-0.022751890122890472,
-0.02105412445962429,
-0.011013932526111603,
-0.004304706584662199,
-0.16239199042320251,
0.015396510250866413,
-0.06309995800256729,
0.13575811684131622,
0.19672317802906036,
-0.04549727961421013,
-0.004492092877626419,
-0.04843619465827942,
0.07581037282943726,
-0.00637218588963151,
0.08771383762359619,
0.00353122316300869,
-0.16538377106189728,
0.007671186700463295,
-0.006661264691501856,
0.010826697573065758,
-0.19548656046390533,
-0.05436699092388153,
-0.0429023802280426,
-0.03166569396853447,
-0.10403791069984436,
0.1494446098804474,
0.08411125838756561,
0.025907019153237343,
-0.035766761749982834,
-0.1695692092180252,
-0.024401184171438217,
0.03856028616428375,
-0.11310208588838577,
-0.11476600915193558
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model can be used to generate a git commit message for a set of commit changes, or be fine-tuned on other relevant tasks. It can be applied to unparsed and untokenized commit changes; however, performance is better when the changes are tokenized.
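A simple way to produce such tokenized input is to put whitespace around punctuation in the raw diff. The sketch below is an illustrative approximation, not the exact preprocessing used to build the training data:

```python
import re

def tokenize_diff(diff: str) -> str:
    # Space out every non-alphanumeric symbol, then collapse runs of whitespace.
    spaced = re.sub(r"([^\w\s])", r" \1 ", diff)
    return " ".join(spaced.split())

print(tokenize_diff("Binary files /dev/null and b/src/plugins/gateway/lib/joscar.jar differ"))
# Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ
```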
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Load the multi-task pretrained checkpoint and its SentencePiece tokenizer from the Hub.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_multitask", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# A whitespace-tokenized git diff for a newly added binary file.
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/commit%20generation/large_model.ipynb).
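The pipeline returns one dict per input, with the generated message under the `summary_text` key:

```python
result = pipeline([tokenized_code])
# result is a list with one dict per input, e.g. [{"summary_text": "..."}]
print(result[0]["summary_text"])
```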
## Training data
The supervised training task datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
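For reference, the inverse square root schedule commonly paired with AdaFactor holds the learning rate constant during warm-up and then decays it as `1/sqrt(step)`. A minimal sketch of that schedule (the warm-up length below is illustrative, not a value reported for this model):

```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # lr is held at 1/sqrt(warmup_steps) during warm-up, then decays as 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(1_000), inverse_sqrt_lr(10_000), inverse_sqrt_lr(220_000))
```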
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
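These scores can be checked against model outputs with any standard corpus BLEU implementation; a minimal sketch using `sacrebleu` (an assumption here — the card does not state which scorer produced the table):

```python
import sacrebleu

# Hypothetical model output and reference commit message, for illustration only.
hypotheses = ["add missing jar file for the gateway plugin"]
references = [["Add the missing joscar jar file to the gateway plugin"]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```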
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_large_commit_generation_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model can be used to generate a git commit message for a set of commit changes, or be fine-tuned on other relevant tasks. It can be applied to unparsed and untokenized commit changes; however, performance is better when the changes are tokenized.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
145
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11985477060079575,
-0.011115065775811672,
-0.0014715079450979829,
0.11705095320940018,
0.1163502112030983,
0.01670173928141594,
0.08602911233901978,
0.0769004225730896,
-0.018157577142119408,
0.021893378347158432,
0.05569656565785408,
0.00946712214499712,
0.06622704863548279,
0.20431731641292572,
0.024726953357458115,
-0.1496368944644928,
-0.00564967654645443,
0.026259364560246468,
-0.02505396492779255,
0.13128377497196198,
0.10331767052412033,
-0.07159788906574249,
0.06076867878437042,
-0.032771360129117966,
-0.21939381957054138,
0.03738460689783096,
-0.00727847358211875,
-0.06009337306022644,
0.10136934369802475,
0.05193085968494415,
0.12072965502738953,
-0.0006306574796326458,
0.018142294138669968,
-0.12061327695846558,
0.010334452614188194,
0.049448296427726746,
0.023568307980895042,
0.01632799580693245,
0.04779517278075218,
0.042857054620981216,
0.13018205761909485,
-0.005723918788135052,
0.01875867135822773,
0.044380392879247665,
-0.07488298416137695,
-0.02970030903816223,
-0.03371921554207802,
0.03217006102204323,
0.059784725308418274,
0.10233935713768005,
-0.02109501324594021,
0.10123945772647858,
-0.13724994659423828,
0.12230150401592255,
0.15020757913589478,
-0.21272660791873932,
-0.014298800379037857,
0.11570723354816437,
0.06600075215101242,
0.11801047623157501,
-0.047207269817590714,
-0.04931772127747536,
0.0900433287024498,
0.0591789186000824,
0.02704857662320137,
-0.06100393459200859,
-0.02483498863875866,
0.03551502525806427,
-0.12035225331783295,
-0.08643639087677002,
0.15688461065292358,
-0.012061916291713715,
-0.08865457773208618,
-0.07271707057952881,
-0.03524036705493927,
-0.17477048933506012,
0.04194653406739235,
0.04876845329999924,
-0.0034657439682632685,
-0.012111456133425236,
-0.024072637781500816,
-0.0005501559935510159,
-0.09233374893665314,
-0.1439792364835739,
-0.00492925476282835,
0.10536389797925949,
0.06530534476041794,
0.03878304362297058,
-0.06790422648191452,
0.11970948427915573,
-0.025707798078656197,
-0.037597980350255966,
-0.044850368052721024,
-0.019288772717118263,
-0.12116936594247818,
0.025747191160917282,
-0.059356171637773514,
-0.20035873353481293,
0.04111339524388313,
0.006945684086531401,
-0.05761422589421272,
0.04443202540278435,
0.027602355927228928,
0.022125886753201485,
0.05718175694346428,
0.2189038097858429,
0.0026173351798206568,
-0.06902241706848145,
0.06924937665462494,
0.04440448060631752,
-0.06045589596033096,
-0.017055068165063858,
-0.07753818482160568,
-0.07614555954933167,
0.09879963845014572,
0.09865546226501465,
-0.11590545624494553,
0.05457652732729912,
-0.0523243322968483,
-0.04412921890616417,
0.01285017654299736,
-0.14836430549621582,
-0.016095677390694618,
0.004671814385801554,
-0.061811842024326324,
-0.0521785132586956,
0.06269975751638412,
-0.14488868415355682,
-0.14310656487941742,
-0.04481175169348717,
-0.08138865232467651,
-0.05577640235424042,
-0.13396140933036804,
-0.15880095958709717,
-0.016713984310626984,
-0.07513917237520218,
-0.004349876195192337,
-0.09122171998023987,
-0.15750110149383545,
-0.017894789576530457,
0.004583894275128841,
0.009829608723521233,
0.007598447613418102,
-0.05072271451354027,
-0.005748755764216185,
-0.024642307311296463,
-0.031089674681425095,
-0.03721575811505318,
-0.06067788973450661,
0.11531736701726913,
0.07143193483352661,
0.03253406286239624,
-0.005240347236394882,
0.022031886503100395,
-0.07629311829805374,
0.055685821920633316,
-0.14459390938282013,
0.11274902522563934,
-0.07932581752538681,
0.08846117556095123,
-0.052668701857328415,
-0.09396757930517197,
0.0744190439581871,
0.044266559183597565,
0.0814799964427948,
0.06436360627412796,
-0.09768373519182205,
-0.04801300913095474,
0.17632922530174255,
-0.116243876516819,
-0.09941483289003372,
0.14614704251289368,
-0.0319291315972805,
0.033058181405067444,
0.11584728211164474,
0.14282479882240295,
0.15768319368362427,
-0.0614757277071476,
-0.008305675350129604,
0.06382336467504501,
0.024370215833187103,
-0.08199366927146912,
0.06490597873926163,
0.06317933648824692,
-0.08872426301240921,
0.04814592003822327,
-0.018402760848402977,
0.11208533495664597,
-0.026469897478818893,
-0.01204192079603672,
-0.05713936686515808,
-0.07434382289648056,
-0.040560532361269,
-0.0223185196518898,
0.05300160124897957,
-0.08096908032894135,
-0.07601776719093323,
0.026953143998980522,
0.1663811206817627,
-0.11925458163022995,
-0.0004968011053279042,
-0.08991415053606033,
0.043369147926568985,
-0.0836239829659462,
0.034450117498636246,
-0.11980358511209488,
0.016694871708750725,
0.08780740946531296,
-0.028651293367147446,
0.08600997924804688,
0.10390033572912216,
0.019015474244952202,
0.04235784336924553,
0.0020393473096191883,
-0.02421090193092823,
-0.09015004336833954,
-0.06015375256538391,
-0.08347325026988983,
-0.06371459364891052,
-0.07479868084192276,
-0.04305535927414894,
-0.02571343630552292,
-0.1333625614643097,
0.024294670671224594,
0.08157283067703247,
0.01997235231101513,
0.005967847537249327,
-0.03031507320702076,
0.028508827090263367,
0.035871002823114395,
-0.0634867399930954,
-0.06315211206674576,
-0.0054613263346254826,
0.023261867463588715,
-0.07429980486631393,
-0.023349713534116745,
-0.04994221404194832,
0.005103003699332476,
0.11078649759292603,
0.09080763161182404,
-0.07243869453668594,
-0.0050795599818229675,
-0.029899878427386284,
-0.029803477227687836,
0.009712453000247478,
-0.07245844602584839,
0.16227608919143677,
-0.0037875226698815823,
0.17257368564605713,
-0.15156111121177673,
-0.024662330746650696,
-0.01272437535226345,
0.013052253983914852,
0.04561493918299675,
0.12227097153663635,
-0.03912853077054024,
-0.12925109267234802,
0.07232603430747986,
-0.03751431405544281,
-0.07980146259069443,
0.22380368411540985,
-0.035458486527204514,
-0.09336989372968674,
0.010942219756543636,
0.10746811330318451,
-0.012216059491038322,
0.1413995772600174,
-0.18240860104560852,
-0.03915074095129967,
0.007080981507897377,
0.02329622581601143,
0.08003480732440948,
-0.1411086916923523,
-0.0015470149228349328,
0.013214756734669209,
-0.0834285169839859,
-0.0804603174328804,
-0.02424810267984867,
-0.011714817956089973,
0.03839889168739319,
0.01609080098569393,
0.018895478919148445,
0.023928117007017136,
-0.026364991441369057,
-0.09405297785997391,
0.2243622988462448,
-0.10184627026319504,
-0.26313504576683044,
-0.18251243233680725,
0.061385881155729294,
-0.07416917383670807,
0.015659885480999947,
0.0289420485496521,
-0.14501185715198517,
-0.04526260122656822,
-0.04582313075661659,
0.21423938870429993,
-0.08285389095544815,
0.04095472767949104,
-0.038356930017471313,
0.06257621943950653,
-0.009084584191441536,
-0.19665025174617767,
0.02809567004442215,
-0.009402156807482243,
-0.05615099519491196,
0.009778349660336971,
-0.1243957132101059,
0.07084887474775314,
0.1627863198518753,
-0.04766615107655525,
0.04273761436343193,
-0.008604699745774269,
0.24976617097854614,
-0.0694776251912117,
-0.03973376750946045,
0.13444465398788452,
0.01537375245243311,
0.010560774244368076,
0.01275477185845375,
-0.009407377801835537,
-0.08163178712129593,
0.06208283454179764,
-0.0016371257370337844,
-0.041501302272081375,
-0.27167508006095886,
0.005963816307485104,
-0.06716090440750122,
0.08225784450769424,
0.039581991732120514,
0.05535784736275673,
-0.055393628776073456,
0.046695124357938766,
0.021062951534986496,
0.13922975957393646,
0.0013681527925655246,
0.055323295295238495,
-0.007160300854593515,
-0.013363471254706383,
0.021934393793344498,
-0.06772401183843613,
0.007404431235045195,
0.1075705885887146,
0.12205573916435242,
0.2345573902130127,
-0.11601130664348602,
0.20948098599910736,
0.040537428110837936,
0.06852908432483673,
0.05127444490790367,
0.11632382869720459,
-0.1285642385482788,
0.019196264445781708,
0.011431570164859295,
-0.004558744840323925,
-0.11346340924501419,
0.03120955266058445,
-0.0235541183501482,
0.02391442097723484,
-0.09134482592344284,
-0.05019519105553627,
0.038086581975221634,
0.17226633429527283,
0.052464570850133896,
-0.2182697057723999,
-0.1272372603416443,
0.012077834457159042,
-0.10865184664726257,
-0.10290537029504776,
0.059172917157411575,
0.2121955007314682,
-0.06634723395109177,
-0.02289312519133091,
-0.009493767283856869,
0.1243288666009903,
-0.10321459174156189,
-0.019508814439177513,
-0.053757473826408386,
0.10525189340114594,
-0.02685089409351349,
0.1330195516347885,
-0.26987048983573914,
0.10404349863529205,
-0.0014894020278006792,
0.04340332746505737,
-0.0634443610906601,
0.055063601583242416,
-0.03646903857588768,
0.08633380383253098,
0.03763623908162117,
-0.0018761127721518278,
0.027115346863865852,
-0.17181652784347534,
-0.015189246274530888,
0.030414972454309464,
0.052582304924726486,
0.01001826487481594,
0.07078354805707932,
0.011545324698090553,
0.05137879028916359,
-0.01450373511761427,
-0.1174745038151741,
-0.0731842964887619,
-0.08916976302862167,
0.02194982022047043,
-0.026255488395690918,
-0.004606462083756924,
-0.07299937307834625,
-0.013164356350898743,
0.06641116738319397,
0.23728737235069275,
-0.07863390445709229,
-0.10276734083890915,
-0.09182073920965195,
0.07151051610708237,
0.1266661137342453,
-0.07666950672864914,
0.044794950634241104,
-0.018305256962776184,
0.059540681540966034,
-0.016989046707749367,
-0.06590255349874496,
0.06663219630718231,
-0.04676225036382675,
-0.07864126563072205,
-0.005726906005293131,
0.09854434430599213,
0.039489421993494034,
0.0160119216889143,
-0.0016672172350808978,
-0.10118385404348373,
-0.0592951700091362,
-0.11484086513519287,
-0.09983905404806137,
-0.04354734718799591,
-0.008561805821955204,
0.10066299885511398,
-0.07394696027040482,
-0.0197627991437912,
-0.03795069456100464,
-0.047404345124959946,
0.10551916062831879,
0.15537191927433014,
-0.025382867082953453,
0.021690472960472107,
0.13581210374832153,
-0.044802337884902954,
-0.17480053007602692,
0.03386656194925308,
0.061302509158849716,
0.10558757185935974,
-0.09884084016084671,
-0.19226643443107605,
0.01390132401138544,
0.02040352113544941,
0.028539344668388367,
0.046232499182224274,
-0.31591299176216125,
-0.11530061811208725,
0.06519521772861481,
0.1169985830783844,
0.11373695731163025,
-0.11757897585630417,
-0.042728111147880554,
-0.0641431212425232,
-0.07470312714576721,
0.1025625616312027,
-0.052914537489414215,
0.14782287180423737,
-0.045117396861314774,
0.05002855136990547,
0.039401255548000336,
-0.05344467610120773,
0.06832704693078995,
0.022112833335995674,
0.08808606117963791,
-0.0413430780172348,
0.03291916102170944,
0.12248031049966812,
-0.038351334631443024,
0.15629510581493378,
-0.1358652561903,
0.11597444117069244,
-0.16916824877262115,
-0.07513029873371124,
-0.08825188130140305,
0.016327999532222748,
-0.007661749608814716,
-0.06536982953548431,
-0.12572023272514343,
0.035341475158929825,
-0.01976938359439373,
-0.002293619094416499,
0.06654335558414459,
-0.027350137010216713,
-0.030275097116827965,
0.09574034810066223,
0.05892762541770935,
-0.03390327841043472,
-0.06331876665353775,
0.05336236581206322,
0.03977294638752937,
0.09469205886125565,
-0.21414674818515778,
0.022776616737246513,
0.08935026079416275,
0.014803469181060791,
0.13462889194488525,
0.044396672397851944,
-0.14369580149650574,
0.02360074780881405,
0.08613027632236481,
-0.09379918873310089,
-0.061520230025053024,
-0.03450245037674904,
-0.057040248066186905,
-0.057240575551986694,
0.09188835322856903,
0.1320192813873291,
-0.03567814454436302,
-0.01799633540213108,
-0.03279634192585945,
-0.009341130033135414,
-0.10905860364437103,
0.21606676280498505,
0.0628804937005043,
0.06257888674736023,
-0.078382708132267,
0.045156847685575485,
0.07336928695440292,
-0.062413159757852554,
0.009619980119168758,
0.17441672086715698,
-0.10682942718267441,
-0.0468166321516037,
0.02893567457795143,
0.11098853498697281,
-0.04255959019064903,
-0.02854827791452408,
-0.11711105704307556,
-0.0644071102142334,
0.03393612802028656,
0.14401592314243317,
0.079851433634758,
0.10566316545009613,
-0.02535366080701351,
0.03538006544113159,
-0.08255708962678909,
0.06834021210670471,
0.06345690041780472,
0.058548640459775925,
-0.13208240270614624,
0.1445145159959793,
0.03883615881204605,
0.12208037078380585,
-0.031003611162304878,
-0.0025710801128298044,
-0.13784848153591156,
0.06239624693989754,
-0.08942340314388275,
0.013534510508179665,
-0.012511046603322029,
0.0401567779481411,
-0.018625976517796516,
-0.023057859390974045,
-0.022783003747463226,
0.059444114565849304,
-0.0859270989894867,
0.007999463938176632,
-0.014323048293590546,
0.037571754306554794,
-0.043519143015146255,
-0.00863570161163807,
0.043207067996263504,
-0.10476865619421005,
0.16369475424289703,
-0.01049714908003807,
-0.02756039798259735,
0.06744162738323212,
-0.03716029226779938,
0.05687836930155754,
0.021248267963528633,
0.040569011121988297,
0.004431608598679304,
0.03809202089905739,
0.08063691854476929,
0.026299530640244484,
0.03685261309146881,
0.026881689205765724,
0.11102878302335739,
-0.12615104019641876,
-0.09483864158391953,
-0.02052391692996025,
-0.07550414651632309,
-0.055627524852752686,
0.09755217283964157,
0.06985760480165482,
0.09988270699977875,
0.06682199239730835,
-0.016230205073952675,
0.013673411682248116,
-0.1372271329164505,
-0.06136699020862579,
0.03983546793460846,
-0.05666058138012886,
-0.017101874575018883,
-0.04634633660316467,
0.05852024629712105,
-0.02967122197151184,
0.17338551580905914,
0.01647244207561016,
0.04134003072977066,
-0.023226527497172356,
0.01869179680943489,
0.05887378379702568,
0.03456825390458107,
0.19894982874393463,
-0.07388468831777573,
0.024996010586619377,
-0.03232615813612938,
-0.004840471316128969,
0.027078578248620033,
0.05291319265961647,
0.06985431909561157,
0.08180253207683563,
0.011700182221829891,
0.09029485285282135,
0.026580747216939926,
0.010293826460838318,
-0.08520738780498505,
-0.01759438030421734,
-0.02195803076028824,
0.06615673005580902,
-0.04863855242729187,
0.17531490325927734,
0.08473343402147293,
-0.09531485289335251,
0.0976838544011116,
0.05249670520424843,
-0.1399194747209549,
-0.02687808871269226,
-0.040832556784152985,
-0.02771683782339096,
-0.16791276633739471,
0.03937356546521187,
-0.12754954397678375,
0.003947156947106123,
0.06014697998762131,
0.07233556360006332,
-0.05984446778893471,
0.19928158819675446,
0.07877089828252792,
-0.08071417361497879,
0.06988593190908432,
0.006234101019799709,
0.013028139248490334,
0.06722193956375122,
-0.009151487611234188,
0.05623072013258934,
-0.012904820963740349,
0.06052740663290024,
0.009337549097836018,
-0.004314758814871311,
0.006593880243599415,
-0.008640606887638569,
-0.007063678931444883,
-0.0332309864461422,
0.006277482956647873,
0.02655465342104435,
0.1513119339942932,
0.010061833076179028,
-0.07654634863138199,
-0.016963861882686615,
0.1757768839597702,
-0.049162015318870544,
-0.07261483371257782,
-0.13149477541446686,
0.15058909356594086,
0.01868508942425251,
0.018644455820322037,
0.007387564983218908,
-0.09492865949869156,
-0.059602800756692886,
0.23715557157993317,
0.06870178878307343,
-0.03830863907933235,
-0.03563983365893364,
-0.0038956080097705126,
-0.009734125807881355,
-0.023752301931381226,
0.17980924248695374,
0.009426823817193508,
0.22106562554836273,
0.0242843609303236,
0.026491036638617516,
-0.04903227463364601,
-0.040658045560121536,
0.004350531380623579,
0.13748806715011597,
-0.03497064486145973,
-0.02031913958489895,
-0.10335560888051987,
0.007687409874051809,
-0.01184774748980999,
-0.1322983354330063,
0.05010738968849182,
-0.1190195381641388,
-0.09432008117437363,
-0.022178370505571365,
0.06749119609594345,
-0.05927789956331253,
0.045914579182863235,
-0.027288703247904778,
0.057344574481248856,
0.028874557465314865,
-0.028348667547106743,
-0.10945942252874374,
-0.16631701588630676,
0.10557672381401062,
-0.024810947477817535,
0.1212724894285202,
-0.013800824992358685,
0.13964173197746277,
0.09677223116159439,
0.007953757420182228,
-0.06495115160942078,
0.08704419434070587,
0.016100293025374413,
0.0049210661090910435,
0.018147164955735207,
0.14132894575595856,
-0.04686537757515907,
0.09954856336116791,
-0.03627326712012291,
-0.0560615248978138,
-0.032216694205999374,
-0.057125844061374664,
0.013801098801195621,
-0.18220952153205872,
-0.02099812775850296,
-0.11295295506715775,
0.08622518926858902,
0.170320063829422,
-0.04218002036213875,
1.4492109734476344e-8,
-0.10302779078483582,
0.08571024984121323,
-0.022661905735731125,
0.061525724828243256,
-0.02673972025513649,
-0.19940270483493805,
-0.02517022006213665,
0.05800978094339371,
0.022404054179787636,
-0.25261473655700684,
-0.0030980990268290043,
-0.023235876113176346,
-0.010281940922141075,
-0.08216492831707001,
0.16817253828048706,
0.07846610248088837,
0.04162623733282089,
-0.024346118792891502,
-0.08790063858032227,
-0.02090669982135296,
0.030754702165722847,
-0.1332768201828003,
-0.1297307312488556
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the git commit message generation task for the Java commit changes.
## Intended uses & limitations
The model can be used to generate a git commit message for a set of commit changes, or be fine-tuned on other relevant tasks. It can be applied to unparsed and untokenized commit changes; however, performance is better when the changes are tokenized.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Load the fine-tuned checkpoint and its SentencePiece tokenizer from the Hub.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_multitask_finetune", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# A whitespace-tokenized git diff for a newly added binary file.
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/commit%20generation/large_model.ipynb).
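Several diffs can be summarized in one call by passing them as a list (the second diff below is a made-up placeholder):

```python
diffs = [
    tokenized_code,
    "deleted file mode 100644 index 892fda21b . . 000000000",  # hypothetical second diff
]
for out in pipeline(diffs):
    print(out["summary_text"])
```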
## Training data
The supervised training task datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 3,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_large_commit_generation_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the git commit message generation task for the Java commit changes.
Intended uses & limitations
---------------------------
The model can be used to generate a git commit message for a set of commit changes, or be fine-tuned on other relevant tasks. It can be applied to unparsed and untokenized commit changes; however, performance is better when the changes are tokenized.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 3,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.07786737382411957,
0.04480680450797081,
-0.0018171236151829362,
0.10131056606769562,
0.06028950214385986,
0.023240966722369194,
0.07473913580179214,
0.09925524145364761,
-0.032782260328531265,
0.06684606522321701,
0.07682068645954132,
-0.027658842504024506,
0.07570402324199677,
0.1811317354440689,
0.03266799449920654,
-0.16812600195407867,
-0.006378499325364828,
0.018478963524103165,
-0.03678830713033676,
0.10778677463531494,
0.09855461865663528,
-0.08205738663673401,
0.06998623162508011,
-0.03536170721054077,
-0.11421070992946625,
0.05013706162571907,
-0.047402575612068176,
-0.044285062700510025,
0.08482992649078369,
0.06580918282270432,
0.10200051963329315,
-0.015329687856137753,
0.06109391152858734,
-0.19522017240524292,
0.00015964297926984727,
0.030386699363589287,
0.0482490099966526,
0.01890864595770836,
0.058470726013183594,
0.07452426850795746,
0.12083763629198074,
-0.042025141417980194,
0.033174797892570496,
0.04822558909654617,
-0.05855056270956993,
-0.024265728890895844,
-0.0636289194226265,
0.08044387400150299,
0.1294327676296234,
0.0886484831571579,
-0.015882210806012154,
0.01997748576104641,
-0.08763958513736725,
0.08572354912757874,
0.1450921893119812,
-0.21654263138771057,
-0.02289082668721676,
0.11195102334022522,
0.0932970643043518,
0.07716575264930725,
-0.07080825418233871,
-0.022165406495332718,
0.10705626010894775,
0.04142851382493973,
0.04053334891796112,
-0.07179757207632065,
-0.015669630840420723,
0.0008320216438733041,
-0.0714503601193428,
-0.06555026024580002,
0.10997938364744186,
0.03661764785647392,
-0.06376027315855026,
-0.12171627581119537,
-0.034106310456991196,
-0.21118684113025665,
0.05117199197411537,
0.025562727823853493,
0.009379295632243156,
-0.015412421897053719,
0.012657621875405312,
-0.016317280009388924,
-0.09838993102312088,
-0.104925736784935,
-0.003495900658890605,
0.07919859886169434,
0.08203912526369095,
0.028148028999567032,
-0.0015945606864988804,
0.07966306805610657,
-0.013559290207922459,
-0.046274054795503616,
-0.03880677744746208,
0.008463229052722454,
-0.1270616054534912,
0.025176113471388817,
-0.02828637696802616,
-0.08157192170619965,
0.0002266443334519863,
0.09157299995422363,
-0.06979187577962875,
0.07623754441738129,
0.12388146668672562,
0.009933825582265854,
0.01679432950913906,
0.23110255599021912,
0.016468951478600502,
-0.11797349900007248,
0.009345758706331253,
0.03628099709749222,
0.0037801978178322315,
-0.012978557497262955,
-0.07068705558776855,
-0.025109533220529556,
0.025173701345920563,
0.06738853454589844,
-0.13127879798412323,
0.01994767040014267,
-0.03865967318415642,
-0.02000666968524456,
0.07599630951881409,
-0.12430226802825928,
0.02273278869688511,
0.008640645071864128,
-0.06819234788417816,
-0.04351582005620003,
0.05006682872772217,
-0.10033804178237915,
-0.11020452529191971,
0.023215068504214287,
-0.05078309401869774,
-0.03787486255168915,
-0.11307986080646515,
-0.12793874740600586,
-0.012731852009892464,
-0.06856653839349747,
0.010994363576173782,
-0.11029762774705887,
-0.0957084447145462,
-0.013185951858758926,
0.025010941550135612,
-0.007920840755105019,
-0.015564149245619774,
-0.0421878956258297,
0.01576180011034012,
-0.004372386261820793,
-0.03042967990040779,
0.0054946052841842175,
-0.04233282804489136,
0.08808383345603943,
0.08196711540222168,
0.05197691172361374,
0.02461674064397812,
0.01843443512916565,
-0.061560697853565216,
0.06722094118595123,
-0.09672940522432327,
0.06612598896026611,
-0.028658801689743996,
0.06577596813440323,
-0.09296921640634537,
-0.08844292163848877,
0.07229617983102798,
0.04593050479888916,
0.060038913041353226,
0.028035033494234085,
-0.07897583395242691,
0.008036617189645767,
0.12325284630060196,
-0.0931350439786911,
-0.12201958149671555,
0.13000912964344025,
0.00695436866953969,
-0.024778561666607857,
0.07438065111637115,
0.12767955660820007,
0.1411585509777069,
-0.0951058566570282,
-0.05675438418984413,
0.08002863824367523,
0.06283435225486755,
-0.04893343895673752,
0.09781725704669952,
0.035411395132541656,
0.033996883779764175,
0.018335722386837006,
0.04014172405004501,
0.06855754554271698,
-0.013205048628151417,
-0.030476216226816177,
-0.02219034731388092,
-0.08790285885334015,
-0.043561432510614395,
-0.02845836617052555,
0.024556780233979225,
-0.0797867402434349,
-0.07476946711540222,
0.00057980976998806,
0.1776711642742157,
-0.10247931629419327,
0.029533138498663902,
-0.09855864942073822,
-0.05072448030114174,
-0.08625107258558273,
0.019464829936623573,
-0.08634496480226517,
0.004251857288181782,
0.06170320510864258,
-0.050594888627529144,
0.0752408429980278,
0.07159377634525299,
0.0010688947513699532,
0.042127303779125214,
-0.034642789512872696,
-0.048304688185453415,
-0.04834204912185669,
-0.058853402733802795,
-0.126161590218544,
-0.017468677833676338,
-0.09557362645864487,
-0.023795492947101593,
-0.05583043396472931,
-0.1401437222957611,
0.020715631544589996,
-0.01192296203225851,
0.024546155706048012,
-0.007461997680366039,
-0.029162567108869553,
0.03171851858496666,
0.03838680312037468,
-0.06230143830180168,
-0.09534985572099686,
0.0041386885568499565,
0.01787247322499752,
-0.12412664294242859,
-0.02641049213707447,
-0.10877994447946548,
-0.0532187856733799,
0.07835294306278229,
0.10649626702070236,
-0.07215519994497299,
0.002293340628966689,
-0.029949767515063286,
-0.05869586020708084,
-0.0304742269217968,
-0.0851161852478981,
0.1726749837398529,
0.01333247683942318,
0.1586805284023285,
-0.13655857741832733,
-0.0510839968919754,
-0.02497239224612713,
-0.01445898786187172,
0.005308019462972879,
0.1425614356994629,
-0.011707586236298084,
-0.10512588918209076,
0.049499306827783585,
-0.026345890015363693,
-0.05998001620173454,
0.15269915759563446,
0.0036029955372214317,
-0.09035184234380722,
0.019957268610596657,
0.09918717294931412,
-0.013078132644295692,
0.12889164686203003,
-0.08033907413482666,
-0.013672559522092342,
-0.005983267445117235,
0.03331689164042473,
0.05114619806408882,
-0.13410557806491852,
0.0218631811439991,
0.05567722022533417,
-0.08061446249485016,
-0.038452427834272385,
-0.03531625121831894,
-0.0481971800327301,
0.03759213909506798,
0.010501785203814507,
0.0009336246876046062,
-0.0030499252025038004,
-0.021998973563313484,
-0.09366139769554138,
0.20701926946640015,
-0.08709505200386047,
-0.24224038422107697,
-0.1661854386329651,
0.004731866531074047,
-0.0715092346072197,
0.005488757509738207,
0.04913533851504326,
-0.13785552978515625,
-0.06276063621044159,
-0.08963440358638763,
0.14766772091388702,
-0.12084957212209702,
0.025323878973722458,
0.0005784924724139273,
0.031118085607886314,
0.02448219060897827,
-0.17280112206935883,
0.02859276533126831,
-0.005039363168179989,
-0.024974552914500237,
0.008880679495632648,
-0.07301241159439087,
0.08405245840549469,
0.1347276121377945,
-0.06950677186250687,
0.023626388981938362,
-0.011734222061932087,
0.1891845166683197,
-0.053720101714134216,
0.023706844076514244,
0.20132169127464294,
0.027119973674416542,
0.0348002091050148,
0.035171348601579666,
0.009151552803814411,
-0.08771495521068573,
0.06849239766597748,
0.05311938002705574,
-0.0400216318666935,
-0.2570122182369232,
0.007192461751401424,
-0.05420101433992386,
0.056527744978666306,
0.11125947535037994,
0.05635237321257591,
-0.12215442210435867,
0.04009651020169258,
-0.01819315366446972,
0.13904699683189392,
-0.028579389676451683,
0.05662311241030693,
-0.014308525249361992,
0.007194742560386658,
0.020305348560214043,
-0.07837260514497757,
0.01401018351316452,
0.08659995347261429,
0.12546582520008087,
0.19324789941310883,
-0.06515125185251236,
0.19255338609218597,
0.023682447150349617,
0.06849030405282974,
0.017435170710086823,
0.10212884843349457,
-0.12337259948253632,
-0.006895482540130615,
0.0038482067175209522,
-0.004879769403487444,
-0.07115534693002701,
0.05512049421668053,
-0.0006280069355852902,
0.055077794939279556,
-0.06962036341428757,
0.03694531321525574,
0.03124060295522213,
0.17822720110416412,
0.06924448907375336,
-0.18563587963581085,
-0.11005265265703201,
0.0216927919536829,
-0.1127423495054245,
-0.10055668652057648,
0.06769876927137375,
0.1997622400522232,
-0.04422670230269432,
0.020592840388417244,
-0.00976656936109066,
0.12901267409324646,
-0.09267878532409668,
-0.019419977441430092,
0.030470483005046844,
0.0873585045337677,
0.0024122365284711123,
0.1324688345193863,
-0.27200090885162354,
0.06337808817625046,
0.011802420020103455,
0.08344952762126923,
-0.028472045436501503,
0.06702012568712234,
-0.0405861958861351,
0.012700721621513367,
0.0630982294678688,
-0.0006075292476452887,
-0.05870300158858299,
-0.21868743002414703,
-0.07236772775650024,
0.022145478054881096,
0.06500515341758728,
-0.021789586171507835,
0.09361859411001205,
-0.0027970916125923395,
0.061344895511865616,
-0.03126301243901253,
-0.10752584040164948,
-0.07047586888074875,
-0.13251502811908722,
-0.007273159921169281,
0.006984710693359375,
-0.03151567652821541,
-0.0396907776594162,
0.011184433475136757,
-0.025835499167442322,
0.2485828399658203,
-0.13538748025894165,
-0.11569575220346451,
-0.09012100845575333,
0.07234017550945282,
0.12668228149414062,
-0.09777156263589859,
0.01598253659904003,
0.014247375540435314,
0.055772557854652405,
-0.04760592803359032,
-0.052380260080099106,
0.02829388901591301,
-0.06794075667858124,
-0.08922839164733887,
-0.019232947379350662,
0.09444913268089294,
-0.007266168482601643,
0.04770655930042267,
0.0031418667640537024,
-0.09994171559810638,
-0.04423379525542259,
-0.13507793843746185,
-0.07542791217565536,
-0.024504676461219788,
0.027263566851615906,
0.02620823122560978,
-0.061456017196178436,
0.10106433182954788,
-0.032813530415296555,
-0.09569200873374939,
0.07644522935152054,
0.20492427051067352,
-0.04611287638545036,
0.008343813940882683,
0.11451558768749237,
-0.04637937247753143,
-0.15434086322784424,
-0.06124524399638176,
0.05290444940328598,
0.08518580347299576,
-0.04437673091888428,
-0.14662960171699524,
0.047210775315761566,
0.03795122355222702,
0.023773271590471268,
0.007376204244792461,
-0.30668705701828003,
-0.12553229928016663,
0.03739798069000244,
0.08160993456840515,
0.07175314426422119,
-0.1267075538635254,
-0.03737955912947655,
-0.06487105786800385,
-0.05889947712421417,
0.04765353724360466,
0.05219780281186104,
0.1411231905221939,
-0.035345349460840225,
0.029010852798819542,
0.027916422113776207,
-0.03340959921479225,
0.10709782689809799,
-0.0008101602434180677,
0.08761679381132126,
-0.021543774753808975,
0.038424279540777206,
0.04849087819457054,
-0.07012109458446503,
0.16166383028030396,
-0.1849905103445053,
0.08589854091405869,
-0.19626416265964508,
-0.05791476368904114,
-0.012622441165149212,
-0.014208515174686909,
-0.032718200236558914,
-0.05925586447119713,
-0.12316308915615082,
0.010506018996238708,
0.03805674985051155,
-0.018457215279340744,
0.08523474633693695,
-0.02395908161997795,
-0.060711029917001724,
0.061111170798540115,
0.06602205336093903,
-0.04432559385895729,
-0.13425973057746887,
0.014786502346396446,
0.028958268463611603,
0.08070537447929382,
-0.1974632441997528,
0.01748955249786377,
0.11279154568910599,
-0.004151343367993832,
0.11782027035951614,
0.013026711530983448,
-0.09251168370246887,
0.04135998338460922,
0.07483508437871933,
-0.04183080419898033,
-0.06799138337373734,
-0.014161018654704094,
-0.012496848590672016,
-0.08605004847049713,
0.050747405737638474,
0.1021643579006195,
-0.046505168080329895,
-0.021323367953300476,
-0.016437534242868423,
0.0153675377368927,
-0.07178875058889389,
0.21239176392555237,
0.021410085260868073,
0.08274994045495987,
-0.06522219628095627,
0.07819265127182007,
0.09025806933641434,
-0.09473585337400436,
0.017353180795907974,
0.16727103292942047,
-0.07307251542806625,
-0.02572115696966648,
0.03351111710071564,
0.05786781758069992,
-0.05620132014155388,
-0.05780810862779617,
-0.10129459202289581,
-0.07457408308982849,
0.01887141354382038,
0.004767970647662878,
0.0678764060139656,
0.07640635222196579,
-0.028965497389435768,
0.024892328307032585,
-0.09060865640640259,
0.08572397381067276,
0.06728581339120865,
0.057626333087682724,
-0.15181484818458557,
0.13819316029548645,
0.04668620973825455,
0.10327368974685669,
0.0000244545481109526,
0.040646690875291824,
-0.10446043312549591,
0.04819729924201965,
-0.035180989652872086,
0.03168490156531334,
-0.013253181241452694,
0.04338642954826355,
-0.023644866421818733,
0.020223217085003853,
-0.028862522915005684,
0.04769960045814514,
-0.03937940299510956,
-0.02992989867925644,
-0.034055616706609726,
0.03618679940700531,
-0.05028761178255081,
-0.022080041468143463,
0.009733683429658413,
-0.08914228528738022,
0.11964283883571625,
-0.06857338547706604,
-0.010619443841278553,
-0.003422842361032963,
-0.00021082937018945813,
0.0685756504535675,
0.02905864454805851,
0.05193428322672844,
-0.013804657384753227,
-0.01029543299227953,
0.04210903123021126,
0.016631504520773888,
-0.010157739743590355,
0.0005111870705150068,
0.03944120556116104,
-0.13771265745162964,
-0.09230576455593109,
-0.09381754696369171,
-0.05508354678750038,
-0.0655987560749054,
0.08166778087615967,
0.08621800690889359,
0.07402051985263824,
0.0891541913151741,
-0.02595934271812439,
0.0015373313799500465,
-0.1350497156381607,
-0.03330845758318901,
0.055495068430900574,
-0.023057881742715836,
-0.07967973500490189,
-0.044505804777145386,
0.057450708001852036,
-0.03794016316533089,
0.1393948793411255,
-0.009272926487028599,
0.040269769728183746,
-0.010118464007973671,
-0.022184185683727264,
0.016093842685222626,
0.0033075306564569473,
0.20351894199848175,
-0.09457434713840485,
0.023675262928009033,
-0.0026702885515987873,
-0.010799549520015717,
0.05603927746415138,
0.10493338853120804,
0.09709196537733078,
0.10963694006204605,
0.06099206581711769,
0.11621744930744171,
-0.04620302468538284,
-0.03252188488841057,
-0.16381973028182983,
0.03173685073852539,
-0.011547116562724113,
0.03805694729089737,
-0.02469419129192829,
0.09409695863723755,
0.14074255526065826,
-0.12714044749736786,
0.08210250735282898,
0.03076275996863842,
-0.10665468871593475,
-0.04754447937011719,
-0.05485488101840019,
-0.04535422846674919,
-0.1050431877374649,
0.025377826765179634,
-0.11452119797468185,
0.02355777472257614,
0.08273833245038986,
0.05343436449766159,
-0.0196905005723238,
0.15077969431877136,
0.0011385693214833736,
-0.05765112489461899,
0.012410963885486126,
0.02276557683944702,
0.04081758111715317,
0.115120530128479,
-0.008971404284238815,
0.07601442188024521,
-0.04821587726473808,
0.09324696660041809,
0.014875617809593678,
0.01757606491446495,
0.030003689229488373,
-0.0001863711659098044,
-0.010951432399451733,
-0.050480179488658905,
-0.00823125708848238,
0.08369869738817215,
0.17136774957180023,
0.03142130374908447,
-0.046169545501470566,
-0.04758346080780029,
0.15913188457489014,
-0.05454685166478157,
-0.06294015049934387,
-0.11264527589082718,
0.1595694124698639,
0.04659288749098778,
0.026322875171899796,
0.004325849935412407,
-0.08023292571306229,
-0.06823872029781342,
0.24547336995601654,
0.027286238968372345,
-0.04049044847488403,
-0.04638870805501938,
-0.016524821519851685,
-0.011803741566836834,
-0.03493082523345947,
0.15251150727272034,
0.02371242456138134,
0.1936722844839096,
0.00687476247549057,
0.009082828648388386,
-0.034879062324762344,
-0.028131891041994095,
-0.03415444493293762,
0.17970888316631317,
-0.03239964693784714,
0.04307204857468605,
-0.09560151398181915,
-0.015872959047555923,
0.039496179670095444,
-0.10478807240724564,
0.08486760407686234,
-0.08348671346902847,
-0.07595664262771606,
0.038829442113637924,
0.09440749138593674,
-0.03295949101448059,
0.05147331953048706,
-0.010594004765152931,
0.056047916412353516,
0.01821324974298477,
-0.03361015394330025,
-0.09404296427965164,
-0.12408413738012314,
0.059391945600509644,
0.0015789226163178682,
0.1580917090177536,
0.020832564681768417,
0.0975998118519783,
0.0943724662065506,
0.002829169388860464,
-0.07902361452579498,
0.10371119529008865,
0.028135817497968674,
0.0008505678852088749,
0.06832490116357803,
0.13262119889259338,
-0.0356898196041584,
0.12425664067268372,
-0.00015863659791648388,
-0.04563216492533684,
-0.03772257640957832,
-0.027482742443680763,
0.00543418200686574,
-0.14944589138031006,
0.003864838043227792,
-0.06851407885551453,
0.1260790228843689,
0.17480498552322388,
-0.047279275953769684,
-0.022953709587454796,
-0.0348997600376606,
0.06778212636709213,
-0.026846405118703842,
0.10349608212709427,
-0.004098741337656975,
-0.18478074669837952,
0.008029373362660408,
-0.017308814451098442,
0.021907726302742958,
-0.19262681901454926,
-0.041513592004776,
-0.03215799108147621,
-0.03183640539646149,
-0.09173000603914261,
0.14671865105628967,
0.06702528148889542,
0.02413662150502205,
-0.04078264907002449,
-0.1725277602672577,
-0.026875097304582596,
0.0410228930413723,
-0.14705133438110352,
-0.12706303596496582
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the git commit message generation task for the Java commit changes.
## Intended uses & limitations
The model can be used to generate a git commit message for a set of commit changes, or be fine-tuned on other relevant tasks. It can be applied to unparsed and untokenized commit changes; however, performance is better when the changes are tokenized.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Load the fine-tuned checkpoint and its SentencePiece tokenizer from the Hub.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_commit_generation_transfer_learning_finetune", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# A whitespace-tokenized git diff for a newly added binary file.
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/commit%20generation/large_model.ipynb).
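The pipeline also forwards the usual generation keyword arguments; the values below are illustrative, not tuned recommendations:

```python
pipeline([tokenized_code], max_length=32, num_beams=4, truncation=True)
```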
## Training data
The supervised training task datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results:
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_large_commit_generation_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized git commits: it works best with tokenized git commits.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the git commit message generation task for Java commit changes.
Intended uses & limitations
---------------------------
The model could be used to generate git commit messages for commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08047545701265335,
0.06004609167575836,
-0.0015562523622065783,
0.1060347780585289,
0.0656486302614212,
0.02405884675681591,
0.050306692719459534,
0.10577070713043213,
-0.036588698625564575,
0.06729111820459366,
0.06536613404750824,
-0.04593481123447418,
0.06871723383665085,
0.17772838473320007,
0.025283534079790115,
-0.18414297699928284,
-0.02022106572985649,
0.020804723724722862,
-0.05081852525472641,
0.1046384871006012,
0.09066815674304962,
-0.07839777320623398,
0.07655590772628784,
-0.042250730097293854,
-0.10916712135076523,
0.04953254014253616,
-0.036503277719020844,
-0.03872158005833626,
0.09335464984178543,
0.07692523300647736,
0.10585528612136841,
-0.026604777202010155,
0.05388832837343216,
-0.19594275951385498,
0.0005413639009930193,
0.040593408048152924,
0.04801202937960625,
0.028476262465119362,
0.049696989357471466,
0.06832796335220337,
0.14133746922016144,
-0.03697417676448822,
0.03166910260915756,
0.049962736666202545,
-0.06394162774085999,
-0.04792286455631256,
-0.056004494428634644,
0.05327460914850235,
0.10987932235002518,
0.10220079869031906,
-0.01634395308792591,
0.009565751999616623,
-0.07788997888565063,
0.08829759061336517,
0.13082532584667206,
-0.22655189037322998,
-0.01938238926231861,
0.10942395776510239,
0.091255784034729,
0.08210555464029312,
-0.0830494686961174,
-0.021783562377095222,
0.10744915157556534,
0.038344938308000565,
0.047799885272979736,
-0.07322147488594055,
-0.01796625554561615,
0.0018689888529479504,
-0.06675468385219574,
-0.05930062383413315,
0.12167404592037201,
0.03224295750260353,
-0.05588460713624954,
-0.11955512315034866,
-0.04769325628876686,
-0.20832203328609467,
0.04652862623333931,
0.027141379192471504,
0.0153597267344594,
-0.0034486246295273304,
-0.0024465268943458796,
-0.018377861008048058,
-0.0952921211719513,
-0.09847128391265869,
-0.004187732469290495,
0.05261785164475441,
0.07332753390073776,
0.029594384133815765,
-0.005421410780400038,
0.08294511586427689,
-0.019837085157632828,
-0.04718036204576492,
-0.03257358446717262,
0.012401500716805458,
-0.11821826547384262,
0.026949916034936905,
-0.024494774639606476,
-0.06138942018151283,
-0.0035784540232270956,
0.08438055962324142,
-0.08690842986106873,
0.07491236925125122,
0.11254875361919403,
0.009282910265028477,
0.010700905695557594,
0.23580478131771088,
0.022111374884843826,
-0.14154765009880066,
0.016315562650561333,
0.020385978743433952,
-0.0015498411376029253,
-0.007769126445055008,
-0.07613477110862732,
-0.03241961449384689,
0.01965225860476494,
0.06785266846418381,
-0.12919971346855164,
0.024021649733185768,
-0.03685346618294716,
-0.011695284396409988,
0.07268709689378738,
-0.12228172272443771,
0.028844643384218216,
0.004118212498724461,
-0.07343753427267075,
-0.03489302843809128,
0.06549371778964996,
-0.11369379609823227,
-0.11073554307222366,
0.02318570204079151,
-0.05017021670937538,
-0.046900808811187744,
-0.11965478956699371,
-0.1334391087293625,
-0.017357617616653442,
-0.05659111589193344,
0.010973010212182999,
-0.1104588732123375,
-0.09527319669723511,
-0.016382480040192604,
0.027995746582746506,
-0.007862760685384274,
-0.017281757667660713,
-0.049386899918317795,
0.006997367832809687,
0.0052905576303601265,
-0.02658878080546856,
0.012315974570810795,
-0.04590032622218132,
0.09067679196596146,
0.07507959753274918,
0.051657382398843765,
0.02198858931660652,
0.018486263230443,
-0.06960604339838028,
0.06597419828176498,
-0.10571765899658203,
0.07206439971923828,
-0.021079082041978836,
0.06501569598913193,
-0.09874564409255981,
-0.08896497637033463,
0.05664654076099396,
0.05185021460056305,
0.06140279024839401,
0.032027848064899445,
-0.08983483165502548,
0.00559708196669817,
0.12863989174365997,
-0.09982898086309433,
-0.11152993142604828,
0.1314070075750351,
0.00030153378611430526,
-0.004495091270655394,
0.08643045276403427,
0.13135387003421783,
0.14835533499717712,
-0.101930171251297,
-0.05740731582045555,
0.08146123588085175,
0.052587639540433884,
-0.050990406423807144,
0.07848402112722397,
0.03436293080449104,
0.02722945809364319,
0.021504757925868034,
0.045903123915195465,
0.07131990045309067,
-0.008771294727921486,
-0.027335157617926598,
-0.03225278481841087,
-0.08838017284870148,
-0.04442141205072403,
-0.016394201666116714,
0.020686564967036247,
-0.07362585514783859,
-0.08307883888483047,
0.0021534955594688654,
0.17604711651802063,
-0.108616903424263,
0.02771986462175846,
-0.08807981759309769,
-0.04963633045554161,
-0.07793235033750534,
0.025255141779780388,
-0.08778447657823563,
-0.003631309838965535,
0.05738477781414986,
-0.024681946262717247,
0.07098500430583954,
0.08136550337076187,
0.0076997894793748856,
0.03305228054523468,
-0.047130584716796875,
-0.051288753747940063,
-0.02965141274034977,
-0.06706348061561584,
-0.12630467116832733,
-0.02265772968530655,
-0.08664780110120773,
-0.0201089046895504,
-0.05096578598022461,
-0.15878574550151825,
0.01239804457873106,
-0.0018803320126608014,
0.026365894824266434,
0.0037972077261656523,
-0.023404570296406746,
0.02334398403763771,
0.035697415471076965,
-0.058616600930690765,
-0.10175630450248718,
0.0058049424551427364,
0.022923508659005165,
-0.11369118839502335,
-0.017647704109549522,
-0.1050729900598526,
-0.04468730464577675,
0.08343063294887543,
0.11018044501543045,
-0.08284439146518707,
-0.003751850686967373,
-0.027104414999485016,
-0.055702805519104004,
-0.03552321717143059,
-0.07701753079891205,
0.18633855879306793,
0.00485158571973443,
0.1625167578458786,
-0.13200373947620392,
-0.05187778174877167,
-0.02668747864663601,
-0.003312963293865323,
0.017206424847245216,
0.1490085870027542,
-0.02277601696550846,
-0.09861425310373306,
0.04860464110970497,
-0.04091699421405792,
-0.061814453452825546,
0.15960220992565155,
0.0012607952812686563,
-0.08847353607416153,
0.012780818156898022,
0.09230209141969681,
-0.005102577619254589,
0.14005091786384583,
-0.06652997434139252,
-0.008814495988190174,
-0.007305246312171221,
0.024907642975449562,
0.052835214883089066,
-0.12927378714084625,
0.025585150346159935,
0.05193352699279785,
-0.07971768081188202,
-0.029397007077932358,
-0.03729906678199768,
-0.050853513181209564,
0.03950824961066246,
0.013779868371784687,
0.0084962984547019,
-0.009565932676196098,
-0.022804392501711845,
-0.08719594031572342,
0.20652003586292267,
-0.08950041234493256,
-0.2424146682024002,
-0.17115181684494019,
0.01930982619524002,
-0.046738963574171066,
0.004258947446942329,
0.042281731963157654,
-0.1391066014766693,
-0.07031786441802979,
-0.08988109230995178,
0.14094075560569763,
-0.1293497532606125,
0.02150769904255867,
-0.03329288214445114,
0.0447981171309948,
0.02674650400876999,
-0.17215588688850403,
0.031746841967105865,
-0.009275399148464203,
-0.02499566785991192,
0.0008281403570435941,
-0.0712776631116867,
0.07535898685455322,
0.13668419420719147,
-0.06695316731929779,
0.022115197032690048,
-0.015110176056623459,
0.17701245844364166,
-0.0688503310084343,
0.03448233753442764,
0.19186203181743622,
0.028024068102240562,
0.0344652384519577,
0.0418238490819931,
0.004769126418977976,
-0.08901076018810272,
0.07301192730665207,
0.05321018025279045,
-0.03960569575428963,
-0.25195515155792236,
-0.002513974905014038,
-0.0555417500436306,
0.06655570864677429,
0.11847059428691864,
0.05560590326786041,
-0.12116800248622894,
0.029440730810165405,
-0.01869373768568039,
0.14871078729629517,
-0.02473275363445282,
0.0565323606133461,
0.001525899046100676,
0.0016911866841837764,
0.019230270758271217,
-0.0822833776473999,
0.013391530141234398,
0.08633088320493698,
0.11641260981559753,
0.2029838114976883,
-0.08046919107437134,
0.2053990364074707,
0.0059653823263943195,
0.08446695655584335,
0.0333385169506073,
0.08946920186281204,
-0.1298847645521164,
0.004442047793418169,
0.006627307273447514,
-0.009631381370127201,
-0.06213640421628952,
0.06033545732498169,
-0.01420716941356659,
0.050926871597766876,
-0.07007718831300735,
0.0321500338613987,
0.025081545114517212,
0.17309729754924774,
0.06787320971488953,
-0.19046235084533691,
-0.1216716542840004,
0.01885490119457245,
-0.10595705360174179,
-0.10404127091169357,
0.07141874730587006,
0.21913261711597443,
-0.04217121750116348,
0.010135081596672535,
-0.009100005030632019,
0.1318940967321396,
-0.11546824127435684,
-0.021895501762628555,
0.03292492404580116,
0.09573390334844589,
-0.0025140929501503706,
0.12776164710521698,
-0.2655044198036194,
0.05859300494194031,
0.014784778468310833,
0.08443652093410492,
-0.02129915915429592,
0.06400435417890549,
-0.037379294633865356,
0.01480699609965086,
0.06377877295017242,
0.0017989351181313396,
-0.06693786382675171,
-0.19854824244976044,
-0.06654708087444305,
0.02331211045384407,
0.05994429439306259,
-0.02086525410413742,
0.09274299442768097,
-0.01187931839376688,
0.04753106087446213,
-0.021574586629867554,
-0.1406095325946808,
-0.055704470723867416,
-0.14142027497291565,
-0.022902371361851692,
0.006625550799071789,
-0.014470112510025501,
-0.03777273744344711,
0.01966613158583641,
-0.0019913818687200546,
0.2603859603404999,
-0.13264700770378113,
-0.10973168164491653,
-0.0928337350487709,
0.07825052738189697,
0.1360217034816742,
-0.09803275763988495,
0.02679181471467018,
0.024061787873506546,
0.0623273141682148,
-0.04299532622098923,
-0.062302760779857635,
0.04019270837306976,
-0.060489505529403687,
-0.0778561681509018,
-0.021030887961387634,
0.09874570369720459,
-0.003853487316519022,
0.044154927134513855,
0.0014425216941162944,
-0.09197891503572464,
-0.0536048598587513,
-0.12894099950790405,
-0.0785565972328186,
-0.012851163744926453,
0.03540073335170746,
0.022378744557499886,
-0.07180678844451904,
0.09662023931741714,
-0.027266325429081917,
-0.08990296721458435,
0.06856786459684372,
0.17462848126888275,
-0.05646166950464249,
0.005478689447045326,
0.10389573872089386,
-0.05617355927824974,
-0.15510103106498718,
-0.0440278984606266,
0.048773836344480515,
0.08683164417743683,
-0.05469851568341255,
-0.15559323132038116,
0.04779012128710747,
0.03018277697265148,
0.027025533840060234,
0.036560580134391785,
-0.29784443974494934,
-0.12419026345014572,
0.023905320093035698,
0.07088767737150192,
0.07560110837221146,
-0.11676318943500519,
-0.04080939665436745,
-0.0694575160741806,
-0.0438140407204628,
0.04662735387682915,
0.046040017157793045,
0.13528241217136383,
-0.03227178752422333,
0.03424274921417236,
0.034388329833745956,
-0.02824949473142624,
0.0889432281255722,
-0.00820118933916092,
0.09018514305353165,
-0.021810293197631836,
0.03475872427225113,
0.048534635454416275,
-0.06792760640382767,
0.15798035264015198,
-0.18167315423488617,
0.09219561517238617,
-0.1849430799484253,
-0.057025983929634094,
-0.012761562131345272,
-0.005698336288332939,
-0.02894001454114914,
-0.05852843448519707,
-0.13014647364616394,
0.02115642838180065,
0.0399397574365139,
-0.017091063782572746,
0.06469433009624481,
-0.025170786306262016,
-0.062051910907030106,
0.06245064362883568,
0.05989541858434677,
-0.025656061246991158,
-0.13570964336395264,
0.029974764212965965,
0.029972070828080177,
0.0879271849989891,
-0.2128724604845047,
0.01737304963171482,
0.11526582390069962,
-0.0017089939210563898,
0.11617305129766464,
0.00999841932207346,
-0.08887498825788498,
0.03615344315767288,
0.06866676360368729,
-0.046442411839962006,
-0.07407926768064499,
-0.020656120032072067,
-0.030663955956697464,
-0.07902054488658905,
0.038762856274843216,
0.10796691477298737,
-0.053638383746147156,
-0.012583942152559757,
-0.012888606637716293,
0.014894632622599602,
-0.06762634962797165,
0.20017126202583313,
0.021597057580947876,
0.07452958822250366,
-0.06403560936450958,
0.07358909398317337,
0.09545780718326569,
-0.11481025815010071,
0.01995723694562912,
0.16579289734363556,
-0.07726161926984787,
-0.022494180127978325,
0.04272424429655075,
0.07652104645967484,
-0.05401346832513809,
-0.05634493753314018,
-0.09544967114925385,
-0.07201727479696274,
0.017179587855935097,
0.015006549656391144,
0.06268492341041565,
0.08304895460605621,
-0.03385011851787567,
0.02683086320757866,
-0.10633846372365952,
0.08944424241781235,
0.06875862926244736,
0.055168114602565765,
-0.15030591189861298,
0.147231787443161,
0.046477749943733215,
0.08115123957395554,
0.00030615818104706705,
0.03485905006527901,
-0.10787855833768845,
0.04271994158625603,
-0.02387204021215439,
0.03929763659834862,
-0.004175567999482155,
0.04548179730772972,
-0.031201256439089775,
0.021785348653793335,
-0.025830505415797234,
0.04811045154929161,
-0.03611236438155174,
-0.028225388377904892,
-0.03083551488816738,
0.030991502106189728,
-0.052913617342710495,
-0.02310342714190483,
0.005944365169852972,
-0.08884577453136444,
0.11692184954881668,
-0.06206660345196724,
-0.004058179911226034,
-0.0027100248262286186,
0.018176959827542305,
0.0705939456820488,
0.030733570456504822,
0.046786513179540634,
-0.011362616904079914,
-0.003136307466775179,
0.04163903743028641,
0.01028423011302948,
-0.01096324808895588,
-0.005885133054107428,
0.0460260771214962,
-0.1417534351348877,
-0.0850231796503067,
-0.0835753083229065,
-0.04883695766329765,
-0.0665021687746048,
0.08012715727090836,
0.08358275145292282,
0.07320206612348557,
0.08601851016283035,
-0.030774660408496857,
-0.00032787228701636195,
-0.14331825077533722,
-0.03384488448500633,
0.05564818158745766,
-0.02844042330980301,
-0.06351258605718613,
-0.04140211269259453,
0.06246002018451691,
-0.04154318571090698,
0.13490800559520721,
-0.003491275478154421,
0.05330418795347214,
-0.014054791070520878,
-0.031353406608104706,
0.0003994555736426264,
0.007983400486409664,
0.20296761393547058,
-0.08980680257081985,
0.02081693895161152,
-0.004577308427542448,
0.0010430836118757725,
0.05878939852118492,
0.12648747861385345,
0.08231952041387558,
0.101361945271492,
0.0758599117398262,
0.11235412955284119,
-0.05043654888868332,
-0.030611388385295868,
-0.15968453884124756,
0.050192300230264664,
-0.02620571479201317,
0.042864080518484116,
-0.031008344143629074,
0.10160335898399353,
0.13245126605033875,
-0.13066473603248596,
0.0910852700471878,
0.029992206022143364,
-0.10957282781600952,
-0.04291598126292229,
-0.06735949218273163,
-0.04729054868221283,
-0.11771760880947113,
0.016500692814588547,
-0.1165408119559288,
0.023546850308775902,
0.07434140145778656,
0.050361935049295425,
-0.025801319628953934,
0.148620143532753,
-0.000982183963060379,
-0.06555637717247009,
0.02993260882794857,
0.03280642628669739,
0.03672853112220764,
0.11242184787988663,
-0.0017345724627375603,
0.06894318759441376,
-0.058885786682367325,
0.08026179671287537,
0.01837589219212532,
0.021356996148824692,
0.034017521888017654,
0.01455641072243452,
-0.001484647043980658,
-0.054671160876750946,
0.0057484242133796215,
0.07624093443155289,
0.16792833805084229,
0.03314780443906784,
-0.050911951810121536,
-0.048233870416879654,
0.1831551045179367,
-0.06428234279155731,
-0.057027705013751984,
-0.11675145477056503,
0.16399219632148743,
0.03894468769431114,
0.021460233256220818,
0.001600760850124061,
-0.08101484179496765,
-0.05193709582090378,
0.2558845579624176,
0.03682403266429901,
-0.03999655321240425,
-0.0457892082631588,
-0.011869686655700207,
-0.014514343813061714,
-0.026609286665916443,
0.14797329902648926,
0.025272000581026077,
0.21388210356235504,
0.005221297964453697,
-0.006842756178230047,
-0.03352892026305199,
-0.029169032350182533,
-0.023161349818110466,
0.18889819085597992,
-0.04072794318199158,
0.03620162978768349,
-0.09317925572395325,
-0.018419932574033737,
0.026321351528167725,
-0.1255723088979721,
0.09652687609195709,
-0.10271857678890228,
-0.07506074011325836,
0.030772307887673378,
0.09453233331441879,
-0.036290090531110764,
0.05238492786884308,
-0.014692213386297226,
0.05333501100540161,
0.03390093147754669,
-0.029247688129544258,
-0.10274343192577362,
-0.13655006885528564,
0.04434322938323021,
-0.008967388421297073,
0.13887809216976166,
0.015502525493502617,
0.08404568582773209,
0.09424355626106262,
0.009205200709402561,
-0.0787724032998085,
0.09156114608049393,
0.024617571383714676,
-0.02210807427763939,
0.06483161449432373,
0.1300160437822342,
-0.03881341218948364,
0.13581828773021698,
0.0017654524417594075,
-0.04147535189986229,
-0.031986188143491745,
-0.021861042827367783,
0.0008937910897657275,
-0.15138022601604462,
-0.007036310620605946,
-0.06793098896741867,
0.1297629326581955,
0.183427631855011,
-0.04102346673607826,
-0.01683754287660122,
-0.04029109328985214,
0.06444409489631653,
-0.02642391435801983,
0.09514208137989044,
0.0024071424268186092,
-0.1724780797958374,
-0.0008941671112552285,
0.005496451631188393,
0.016524815931916237,
-0.19229255616664886,
-0.03875284641981125,
-0.03346191719174385,
-0.03445269912481308,
-0.09371540695428848,
0.1491537243127823,
0.06965819001197815,
0.02676430717110634,
-0.038247715681791306,
-0.1527201384305954,
-0.015071604400873184,
0.043342169374227524,
-0.14534889161586761,
-0.12623344361782074
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on a Lisp-inspired DSL programming language using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate Lisp-inspired DSL code given a human language task description.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/program%20synthesis/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
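For reference, one common formulation of such an inverse square root schedule is sketched below; the warmup length and peak rate are illustrative placeholders, since the exact constants used for CodeTrans are not stated here:
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, peak_lr: float = 1e-2) -> float:
    """Linear warmup, then 1/sqrt(step) decay (illustrative constants)."""
    step = max(step, 1)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps        # linear warmup
    return peak_lr * math.sqrt(warmup_steps / step) # inverse square root decay
```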
## Evaluation results
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_large_program_synthese_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on a Lisp-inspired DSL programming language using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate Lisp-inspired DSL code given a human language task description.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 220,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1344955563545227,
-0.0038741466123610735,
-0.00064493139507249,
0.14534616470336914,
0.10510635375976562,
0.01616276428103447,
0.09119503945112228,
0.052628688514232635,
-0.017900582402944565,
0.009413938969373703,
0.05880211293697357,
0.033931419253349304,
0.005976720713078976,
0.18332193791866302,
0.0038280896842479706,
-0.11732819676399231,
-0.0025417418219149113,
0.025882069021463394,
-0.0677703246474266,
0.1250450164079666,
0.09635298699140549,
-0.07340557873249054,
0.047611452639102936,
-0.06569612771272659,
-0.21108271181583405,
0.0504598394036293,
-0.014707605354487896,
-0.04343552887439728,
0.09666144847869873,
0.046113599091768265,
0.12799420952796936,
-0.008503404445946217,
0.050621118396520615,
-0.15739218890666962,
0.002088216133415699,
0.009830307215452194,
0.0364476703107357,
0.008901749737560749,
0.048790305852890015,
0.03023146651685238,
0.13719157874584198,
-0.0014530913904309273,
0.04689209163188934,
0.05033973976969719,
-0.05707499012351036,
-0.09561825543642044,
-0.026607247069478035,
0.018307171761989594,
0.05787745863199234,
0.1177259087562561,
-0.005766636226326227,
0.11228440701961517,
-0.16129356622695923,
0.13431592285633087,
0.12240727990865707,
-0.22897934913635254,
-0.012197032570838928,
0.10031045973300934,
0.07155100256204605,
0.08578372746706009,
-0.033704448491334915,
-0.047957319766283035,
0.09434060752391815,
0.04035908728837967,
0.017804866656661034,
-0.08493705838918686,
-0.07947709411382675,
0.041504696011543274,
-0.10532492399215698,
-0.052699413150548935,
0.23137842118740082,
0.005463764537125826,
-0.06717950850725174,
-0.07085245102643967,
-0.025922011584043503,
-0.12544448673725128,
0.02637552283704281,
0.026276592165231705,
0.013007252477109432,
-0.02349567413330078,
0.023044293746352196,
0.01889442466199398,
-0.08375421911478043,
-0.1517883986234665,
0.028914455324411392,
0.1272585391998291,
0.0585455484688282,
0.02075464278459549,
-0.08036951720714569,
0.11375928670167923,
0.08357605338096619,
-0.04615714028477669,
-0.031659599393606186,
-0.023224931210279465,
-0.11349689960479736,
0.05061100795865059,
-0.07524382323026657,
-0.17137424647808075,
0.005116866435855627,
0.03626313433051109,
0.024203995242714882,
0.0476834736764431,
0.03433466702699661,
0.018654951825737953,
0.008724028244614601,
0.1995924711227417,
0.03750183805823326,
-0.09018559753894806,
0.052013467997312546,
0.04707205295562744,
-0.016008766368031502,
-0.021990885958075523,
-0.08368498086929321,
-0.0758351981639862,
0.0651872456073761,
0.1011124774813652,
-0.13124790787696838,
0.05190663784742355,
-0.07563045620918274,
-0.03800360858440399,
-0.029243985190987587,
-0.17707030475139618,
0.01034877635538578,
0.017806967720389366,
-0.06128625199198723,
-0.03318464756011963,
0.10030665993690491,
-0.15823443233966827,
-0.1387576311826706,
-0.02753610350191593,
-0.07225559651851654,
-0.03428661450743675,
-0.16171306371688843,
-0.15014097094535828,
-0.011659813113510609,
-0.046549972146749496,
0.008923206478357315,
-0.07720038294792175,
-0.1840902417898178,
-0.023097731173038483,
0.030170796439051628,
0.01688397116959095,
0.006786707788705826,
-0.10449276119470596,
-0.027002044022083282,
-0.01973581127822399,
-0.03157463297247887,
-0.0070166862569749355,
-0.045090049505233765,
0.1185961663722992,
0.10628791898488998,
0.032748714089393616,
-0.03417409956455231,
0.05147377774119377,
-0.07531103491783142,
0.048909347504377365,
-0.08154815435409546,
0.12681223452091217,
-0.08942697197198868,
0.07028758525848389,
-0.011256640776991844,
-0.10540970414876938,
0.07071053236722946,
0.07035856693983078,
0.06898586452007294,
0.05567505210638046,
-0.1696779578924179,
-0.02974379062652588,
0.16384851932525635,
-0.1283850371837616,
-0.11386916786432266,
0.09961587190628052,
-0.05087437480688095,
0.05488331988453865,
0.10031051933765411,
0.14059452712535858,
0.1654701977968216,
-0.04037363454699516,
0.011913081631064415,
0.06238384172320366,
0.05842791497707367,
-0.12635745108127594,
0.06922490894794464,
0.07630378007888794,
-0.08088229596614838,
0.07653564214706421,
-0.06890398263931274,
0.11105851083993912,
-0.004947279114276171,
-0.038199763745069504,
-0.058430515229701996,
-0.07959014922380447,
-0.0032570641487836838,
0.01176836621016264,
0.05847663804888725,
-0.0853499323129654,
-0.07993745803833008,
0.10807561874389648,
0.17672103643417358,
-0.13605482876300812,
0.011575764045119286,
-0.09319602698087692,
0.03887256234884262,
-0.04340001940727234,
0.019909624010324478,
-0.16013844311237335,
0.009274237789213657,
0.05382435768842697,
0.016734207049012184,
0.07980064302682877,
0.12766525149345398,
0.012223777361214161,
0.03274460881948471,
0.004612778313457966,
-0.01461423747241497,
-0.11697782576084137,
-0.0591079480946064,
-0.06225462257862091,
-0.0696781575679779,
-0.0904492661356926,
-0.06599327176809311,
-0.00022454577265307307,
-0.1846732348203659,
0.006453930865973234,
0.0023003173992037773,
0.018182680010795593,
0.013626955449581146,
-0.015348657965660095,
0.012279277667403221,
0.08034472912549973,
-0.0644068643450737,
-0.037739936262369156,
0.023585934191942215,
0.012965922243893147,
-0.07226978987455368,
-0.06316316872835159,
-0.07181423902511597,
-0.000589253322687,
0.11915082484483719,
0.01621183566749096,
-0.060889214277267456,
0.024489879608154297,
-0.011936435475945473,
-0.029498139396309853,
0.03311551362276077,
-0.06837908923625946,
0.15422643721103668,
0.007189259864389896,
0.17857040464878082,
-0.15915140509605408,
-0.042413536459207535,
-0.050304461270570755,
0.01590145379304886,
0.043067608028650284,
0.14526475965976715,
-0.0005412733880802989,
-0.06420503556728363,
0.06190050020813942,
0.013150005601346493,
-0.09480743110179901,
0.194476917386055,
-0.06284245103597641,
-0.08849471062421799,
0.02272837609052658,
0.10960585623979568,
-0.017985178157687187,
0.14349432289600372,
-0.16522134840488434,
-0.03562954440712929,
0.011921246536076069,
0.0042987060733139515,
0.06481877714395523,
-0.14112111926078796,
-0.01084967516362667,
0.015563011169433594,
-0.08249036967754364,
-0.06842971593141556,
-0.011286349035799503,
-0.010189907625317574,
0.041000910103321075,
-0.0235540010035038,
-0.04488430544734001,
0.026116730645298958,
-0.04311838373541832,
-0.10069374740123749,
0.21217095851898193,
-0.12214317172765732,
-0.2235385626554489,
-0.1961812525987625,
0.07208361476659775,
-0.07910630106925964,
-0.0020404437091201544,
0.02648702822625637,
-0.11360882222652435,
-0.06633022427558899,
-0.06950851529836655,
0.16552749276161194,
-0.0734737366437912,
-0.017831668257713318,
0.0026716634165495634,
0.06485027074813843,
0.0026917767245322466,
-0.22426429390907288,
0.037843234837055206,
-0.019634252414107323,
-0.008328855037689209,
-0.00821665022522211,
-0.08324836194515228,
0.08808668702840805,
0.1686369627714157,
-0.054256074130535126,
0.029180273413658142,
0.006793922279030085,
0.1810765415430069,
-0.029180215671658516,
-0.07282309979200363,
0.1418084055185318,
-0.024668598547577858,
-0.004250252619385719,
0.018816106021404266,
0.0007264412706717849,
-0.1175692155957222,
0.059542808681726456,
0.007046101614832878,
-0.026367317885160446,
-0.27520644664764404,
-0.03914407268166542,
-0.08603635430335999,
0.029922133311629295,
0.01756947487592697,
0.04915846139192581,
-0.10322631150484085,
0.023841137066483498,
0.04793059453368187,
0.11885051429271698,
0.0015873841475695372,
0.03960699215531349,
0.1053239107131958,
-0.015394055284559727,
0.023053549230098724,
-0.10062597692012787,
-0.01415258925408125,
0.06707793474197388,
0.08054482936859131,
0.26121091842651367,
-0.10743572562932968,
0.19555051624774933,
0.04492590203881264,
0.04870879277586937,
0.027390001341700554,
0.13954244554042816,
-0.10309939086437225,
0.026762207970023155,
0.010648254305124283,
-0.013740862719714642,
-0.11535365879535675,
0.027730681002140045,
-0.06253552436828613,
0.07134105265140533,
-0.09618349373340607,
-0.023082565516233444,
0.00996993575245142,
0.12245533615350723,
0.036070168018341064,
-0.22177626192569733,
-0.08727476000785828,
0.027419282123446465,
-0.07960034161806107,
-0.08824994415044785,
0.07819261401891708,
0.21084365248680115,
-0.04006142169237137,
-0.019381709396839142,
-0.0031224284321069717,
0.1252630352973938,
-0.04687708988785744,
-0.03068004548549652,
-0.032613251358270645,
0.06681550294160843,
0.03124636597931385,
0.1396089345216751,
-0.29009300470352173,
0.12747451663017273,
-0.008465255610644817,
0.061131853610277176,
-0.03903057053685188,
0.05738275498151779,
-0.0415293425321579,
0.05628295987844467,
0.05026616156101227,
-0.01301293820142746,
0.046142831444740295,
-0.1617136150598526,
-0.013107202015817165,
0.033308226615190506,
0.05642350763082504,
0.044978074729442596,
0.0711316242814064,
-0.0107888737693429,
0.049921758472919464,
0.005504118278622627,
-0.12374719977378845,
-0.0695866048336029,
-0.11098082363605499,
-0.007546399254351854,
-0.025913268327713013,
-0.03987247124314308,
-0.0546913668513298,
-0.027500197291374207,
0.014557039365172386,
0.21825070679187775,
-0.07818335294723511,
-0.08815722167491913,
-0.0770372599363327,
0.043770864605903625,
0.12835197150707245,
-0.0737098976969719,
0.034228961914777756,
0.016195308417081833,
0.014353982172906399,
-0.005122197791934013,
-0.09340396523475647,
0.03953760489821434,
-0.02689131163060665,
-0.0670568123459816,
-0.013875599019229412,
0.08755239099264145,
0.015881629660725594,
0.020508253946900368,
-0.0013691327767446637,
-0.08596009761095047,
-0.03272080048918724,
-0.12637701630592346,
-0.1353793889284134,
-0.006491366308182478,
-0.012214338406920433,
0.06379377096891403,
-0.14425905048847198,
-0.004627096001058817,
-0.014498480595648289,
-0.027175672352313995,
0.1314319372177124,
0.17613361775875092,
-0.05957813188433647,
0.04436362907290459,
0.10444122552871704,
-0.06477075815200806,
-0.1939699351787567,
0.02095397561788559,
0.05331827327609062,
0.1060314029455185,
-0.04333058372139931,
-0.18065473437309265,
0.07741639018058777,
-0.0010319205466657877,
0.03746729716658592,
0.05352622643113136,
-0.3152914047241211,
-0.12140457332134247,
0.10477668792009354,
0.141643688082695,
0.09614518284797668,
-0.11096957325935364,
-0.03365869075059891,
-0.06928017735481262,
-0.11101458221673965,
0.1052512675523758,
0.008027090691030025,
0.12741395831108093,
-0.05072394758462906,
0.044564709067344666,
0.03969950973987579,
-0.03347405791282654,
0.07076352089643478,
0.03501708433032036,
0.13757432997226715,
-0.04866274818778038,
0.020326822996139526,
0.10936170816421509,
-0.03463198244571686,
0.17885828018188477,
-0.15399760007858276,
0.0988553836941719,
-0.204942986369133,
-0.0700780525803566,
-0.05811126530170441,
0.0130130834877491,
-0.028008630499243736,
-0.045394301414489746,
-0.0952472984790802,
0.018669746816158295,
-0.011234480887651443,
-0.0068772295489907265,
0.0494864359498024,
-0.022303514182567596,
-0.048164863139390945,
0.06457709521055222,
0.1289258599281311,
-0.009212319739162922,
-0.05201517418026924,
0.04483308643102646,
0.04718857258558273,
0.09580409526824951,
-0.16588890552520752,
0.016634782776236534,
0.11685440689325333,
0.02575136348605156,
0.11619638651609421,
0.05183645337820053,
-0.10189151018857956,
0.016360674053430557,
0.08909507840871811,
-0.08753499388694763,
-0.07489917427301407,
-0.02871672250330448,
-0.09529010206460953,
-0.059586796909570694,
0.06191982328891754,
0.11117129772901535,
-0.041083309799432755,
-0.0076499744318425655,
-0.01954258792102337,
-0.022347349673509598,
-0.10819147527217865,
0.1863277554512024,
0.0775558352470398,
0.06420660763978958,
-0.07857485860586166,
0.06238158419728279,
0.05865845829248428,
-0.06204187124967575,
0.02108360454440117,
0.16800574958324432,
-0.09635432809591293,
-0.045776546001434326,
0.05171218141913414,
0.1971704363822937,
-0.01594976894557476,
-0.047457970678806305,
-0.13002048432826996,
-0.07722341269254684,
0.03946378827095032,
0.16161562502384186,
0.10775987058877945,
0.10545481741428375,
-0.056007131934165955,
0.014009837061166763,
-0.08961047977209091,
0.09162484854459763,
0.079364113509655,
0.050248853862285614,
-0.14745016396045685,
0.13372986018657684,
0.02383008599281311,
0.099488265812397,
-0.03549705073237419,
-0.007204901427030563,
-0.12197960168123245,
0.07162711769342422,
-0.09691517055034637,
0.025728344917297363,
-0.0035140214022248983,
0.06385435909032822,
-0.013366120867431164,
-0.0036516457330435514,
-0.022855810821056366,
0.0631009191274643,
-0.09552627056837082,
-0.002741846488788724,
0.01354033313691616,
0.03920790180563927,
-0.07334227859973907,
-0.034423548728227615,
0.025869479402899742,
-0.09381266683340073,
0.1202787533402443,
-0.00957828015089035,
-0.043054938316345215,
0.08529787510633469,
-0.079782634973526,
0.04549449682235718,
0.035954490303993225,
0.0762384757399559,
0.0032089725136756897,
0.03172225132584572,
0.06277349591255188,
0.05419750511646271,
0.05901997536420822,
0.02546524815261364,
0.11588848382234573,
-0.13384953141212463,
-0.08947424590587616,
-0.058797504752874374,
-0.10460279881954193,
-0.051772356033325195,
0.09692336618900299,
0.08224727213382721,
0.1195538192987442,
0.11267847567796707,
-0.0204387828707695,
0.007578105200082064,
-0.12624236941337585,
-0.054615266621112823,
0.0291476808488369,
-0.02467242442071438,
-0.10410653799772263,
-0.06461784243583679,
0.04383783042430878,
-0.03614401817321777,
0.14901907742023468,
0.0023283823393285275,
0.004780435003340244,
-0.03630708158016205,
-0.029457110911607742,
0.0474872924387455,
0.021245669573545456,
0.23460717499256134,
-0.07124819606542587,
0.04926636442542076,
0.003200810868293047,
-0.004034486133605242,
0.017367102205753326,
0.10442609339952469,
0.1122622862458229,
0.15590770542621613,
-0.03403593227267265,
0.1061336100101471,
0.02297940105199814,
-0.008583007380366325,
-0.0662444531917572,
-0.0029764368664473295,
0.014195616357028484,
0.054310329258441925,
-0.05782774090766907,
0.22151410579681396,
0.05044484883546829,
-0.09683980792760849,
0.10181141644716263,
0.041973214596509933,
-0.13344581425189972,
-0.051010146737098694,
-0.019123485311865807,
-0.03392678126692772,
-0.1542373150587082,
0.03369889780879021,
-0.1340523511171341,
-0.019644558429718018,
0.04202120006084442,
0.05604133382439613,
-0.071226567029953,
0.15958115458488464,
0.0052572512067854404,
-0.04574083164334297,
0.04213686287403107,
-0.009807083755731583,
0.00732757244259119,
0.03029770590364933,
0.01789899915456772,
0.03395183011889458,
-0.021722370758652687,
0.030781058594584465,
0.02358127012848854,
-0.019084017723798752,
-0.00802325364202261,
-0.017829878255724907,
0.012949574738740921,
-0.024278054013848305,
0.031606387346982956,
0.07067480683326721,
0.18977917730808258,
0.033948205411434174,
-0.09685827046632767,
-0.029679251834750175,
0.15668927133083344,
-0.04938938468694687,
-0.10611312836408615,
-0.10697296261787415,
0.13810965418815613,
0.03017972968518734,
0.03033573180437088,
0.004999757278710604,
-0.07936430722475052,
-0.05675588175654411,
0.21174798905849457,
0.07170766592025757,
-0.04076005518436432,
-0.022132745012640953,
0.0010792268440127373,
-0.0011157565750181675,
-0.0631057620048523,
0.19074851274490356,
0.023710913956165314,
0.24517038464546204,
0.025824198499321938,
-0.03655141219496727,
-0.0799652636051178,
-0.029615582898259163,
-0.004897539969533682,
0.12377630174160004,
-0.04412347078323364,
-0.04890124872326851,
-0.08473104983568192,
0.001914899330586195,
-0.01887638494372368,
-0.049301281571388245,
0.08456532657146454,
-0.11489661037921906,
-0.09196746349334717,
-0.03301476687192917,
0.05470540374517441,
-0.072496697306633,
0.0121288588270545,
-0.0320686474442482,
0.045385003089904785,
0.07282648235559464,
-0.0360570102930069,
-0.12640011310577393,
-0.14048399031162262,
0.09046494215726852,
-0.047222550958395004,
0.12103095650672913,
-0.006151036825031042,
0.1614837348461151,
0.08126170188188553,
0.045233938843011856,
-0.055220093578100204,
0.11418740451335907,
0.03101506270468235,
0.02835095301270485,
0.055047161877155304,
0.09823916852474213,
-0.057134464383125305,
0.1448456197977066,
-0.038054291158914566,
-0.012234156019985676,
-0.00938849151134491,
-0.05413941666483879,
-0.033659692853689194,
-0.17701669037342072,
-0.025778481736779213,
-0.12413583695888519,
0.09999710321426392,
0.1807975471019745,
-0.034676071256399155,
-0.029054315760731697,
-0.08931860327720642,
0.09771756827831268,
-0.03458753228187561,
0.052815936505794525,
-0.018205082044005394,
-0.1866520792245865,
-0.0162450410425663,
0.056037094444036484,
0.014173022471368313,
-0.23877349495887756,
-0.020689263939857483,
-0.03564691171050072,
-0.02869398333132267,
-0.06769729405641556,
0.15575356781482697,
0.10150863230228424,
0.04204864427447319,
-0.03415742516517639,
-0.14515239000320435,
-0.024569325149059296,
0.06020106002688408,
-0.14120319485664368,
-0.12659528851509094
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on a Lisp-inspired DSL programming language using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the program synthesis task for Lisp-inspired DSL code.
## Intended uses & limitations
The model could be used to generate Lisp-inspired DSL code given a human language task description.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/program%20synthesis/large_model.ipynb).
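Alternatively, the pipeline can be replaced with a direct `generate` call, which exposes decoding parameters such as beam size — a minimal sketch; the decoding settings below are illustrative and not necessarily those used for the reported scores:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

name = "SEBIS/code_trans_t5_large_program_synthese_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSeq2SeqLM.from_pretrained(name)

text = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=128, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```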
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing Lisp-inspired DSL data.
## Evaluation results
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_large_program_synthese_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on a Lisp-inspired DSL programming language using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the program synthesis task for Lisp-inspired DSL code.
Intended uses & limitations
---------------------------
The model could be used to generate Lisp-inspired DSL code given a human language task description.
### How to use
Here is how to use this model to generate Lisp-inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing Lisp-inspired DSL data.
Evaluation results
------------------
For the program synthesis task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.0961504876613617,
0.06992530822753906,
-0.0021891917567700148,
0.11690188944339752,
0.05332275480031967,
0.012518389150500298,
0.09226809442043304,
0.08868654072284698,
-0.025609511882066727,
0.05005837604403496,
0.07951898127794266,
0.005204792134463787,
0.04008793830871582,
0.1802089512348175,
0.03870374709367752,
-0.1555328518152237,
0.00166411476675421,
0.021383510902523994,
-0.049448102712631226,
0.10815728455781937,
0.09957858920097351,
-0.08595598489046097,
0.06367794424295425,
-0.048116352409124374,
-0.12682923674583435,
0.057222846895456314,
-0.05129322409629822,
-0.02954622358083725,
0.07570234686136246,
0.0503293052315712,
0.0974331796169281,
-0.010700837709009647,
0.10237544775009155,
-0.1996440440416336,
-0.00706452177837491,
0.016405489295721054,
0.039607565850019455,
0.016290439292788506,
0.07854609191417694,
0.05614159628748894,
0.14728432893753052,
-0.022374749183654785,
0.03310983628034592,
0.04479432851076126,
-0.05217619240283966,
-0.06809482723474503,
-0.05598954111337662,
0.08453063666820526,
0.14534273743629456,
0.09018474072217941,
-0.013191784732043743,
0.01452322956174612,
-0.07855214178562164,
0.09785403311252594,
0.12218280881643295,
-0.1994311809539795,
-0.0264242272824049,
0.0639527216553688,
0.07242834568023682,
0.05963284894824028,
-0.03744347393512726,
-0.029400570318102837,
0.09867403656244278,
0.029882624745368958,
0.023296836763620377,
-0.08136092126369476,
-0.059673406183719635,
0.0006885682232677937,
-0.08024132251739502,
-0.05685386061668396,
0.1491089016199112,
0.03465067222714424,
-0.05980098620057106,
-0.10656120628118515,
-0.039162516593933105,
-0.1701274812221527,
0.046755094081163406,
0.009698515757918358,
0.02628214657306671,
-0.011151292361319065,
0.07143744081258774,
-0.017910225316882133,
-0.11528566479682922,
-0.11929493397474289,
0.028153741732239723,
0.07803623378276825,
0.07962053269147873,
0.022953355684876442,
-0.0017013351898640394,
0.09921036660671234,
0.051190704107284546,
-0.036519765853881836,
-0.020273836329579353,
0.009349985048174858,
-0.12297084927558899,
0.04957445338368416,
-0.04495782032608986,
-0.08547703176736832,
-0.030179863795638084,
0.07741358876228333,
0.010548679158091545,
0.052990976721048355,
0.1344825178384781,
0.01201658882200718,
-0.016545655205845833,
0.19950635731220245,
0.017321189865469933,
-0.08204062283039093,
-0.004879568703472614,
0.029592033475637436,
0.004222672898322344,
-0.01363467238843441,
-0.08349179476499557,
-0.03169484809041023,
-0.017260026186704636,
0.06528088450431824,
-0.12699101865291595,
0.013642577454447746,
-0.05217504873871803,
-0.02015564776957035,
0.03286602720618248,
-0.1469109207391739,
0.03345149755477905,
0.0006265518604777753,
-0.05188598111271858,
-0.06129543110728264,
0.08344625681638718,
-0.08806650340557098,
-0.12050224840641022,
0.006387976463884115,
-0.03400976583361626,
-0.017672596499323845,
-0.12605150043964386,
-0.11444694548845291,
-0.0005059599643573165,
-0.07719121873378754,
0.009832317940890789,
-0.10614901781082153,
-0.12988708913326263,
-0.0306718647480011,
0.04684878885746002,
-0.0022848285734653473,
-0.013431298546493053,
-0.06926344335079193,
0.009993412531912327,
-0.010401356965303421,
-0.01682262122631073,
0.02949289232492447,
-0.02083192765712738,
0.09222731739282608,
0.08649388700723648,
0.047409068793058395,
0.0017174166860058904,
0.031955234706401825,
-0.06077949330210686,
0.06120836362242699,
-0.037967342883348465,
0.08281619846820831,
-0.028862392529845238,
0.05430474504828453,
-0.07700039446353912,
-0.07632070034742355,
0.062431078404188156,
0.056752175092697144,
0.06200113520026207,
0.03275644779205322,
-0.13911834359169006,
0.01749000884592533,
0.13294120132923126,
-0.10511857271194458,
-0.10926999896764755,
0.09694193303585052,
-0.004624064080417156,
0.03522346168756485,
0.06712767481803894,
0.14702484011650085,
0.15241605043411255,
-0.08179599791765213,
-0.02909870818257332,
0.07018320262432098,
0.08069515973329544,
-0.07564034312963486,
0.0922052413225174,
0.0372064933180809,
0.03733270242810249,
0.036048732697963715,
-0.000350620859535411,
0.07159539312124252,
-0.005997819826006889,
-0.051597267389297485,
-0.02464943751692772,
-0.09587918221950531,
-0.04679005965590477,
-0.008506905287504196,
0.0386810228228569,
-0.05615204945206642,
-0.07225717604160309,
0.04634612426161766,
0.17681314051151276,
-0.10871564596891403,
0.046866193413734436,
-0.09621288627386093,
-0.03870126232504845,
-0.08741118013858795,
0.01239120215177536,
-0.11482904851436615,
0.03398752957582474,
0.04505569115281105,
-0.029392309486865997,
0.0785304605960846,
0.08918026089668274,
-0.0007439820910803974,
0.036532267928123474,
-0.0296004731208086,
-0.0484858974814415,
-0.06936071068048477,
-0.05646417662501335,
-0.13051360845565796,
-0.03700214996933937,
-0.10970654338598251,
-0.04181461036205292,
-0.051289673894643784,
-0.1591549515724182,
0.016195526346564293,
-0.03529326990246773,
0.04355991259217262,
0.005369282327592373,
-0.033503711223602295,
0.024033471941947937,
0.06291607767343521,
-0.055962007492780685,
-0.07980689406394958,
0.015235799364745617,
0.006997162941843271,
-0.1234603300690651,
-0.02518616057932377,
-0.10564558953046799,
-0.04733675718307495,
0.06866969913244247,
0.04624662548303604,
-0.06451365351676941,
0.027625778689980507,
-0.02754361554980278,
-0.05701684579253197,
-0.0033641764894127846,
-0.08017437905073166,
0.14404557645320892,
0.002056880621239543,
0.1472359448671341,
-0.13956409692764282,
-0.06450141221284866,
-0.03315811604261398,
-0.005500472616404295,
-0.002370077883824706,
0.1687607765197754,
0.029499530792236328,
-0.05763232335448265,
0.035036951303482056,
0.022006956860423088,
-0.046020906418561935,
0.12988227605819702,
-0.017334111034870148,
-0.11038175970315933,
0.0072260103188455105,
0.10339440405368805,
-0.018081821501255035,
0.11059349030256271,
-0.07164885103702545,
-0.018343335017561913,
0.0006866699550300837,
0.018656829372048378,
0.04014359042048454,
-0.14468497037887573,
0.01651882566511631,
0.056878119707107544,
-0.0746375024318695,
-0.03451993688941002,
-0.023175446316599846,
-0.057435017079114914,
0.04812397062778473,
-0.027609452605247498,
-0.0117121497169137,
-0.0004968151915818453,
-0.028774771839380264,
-0.09431727975606918,
0.17694121599197388,
-0.09151462465524673,
-0.1939113438129425,
-0.17170202732086182,
0.03308523818850517,
-0.05834735929965973,
0.015374950133264065,
0.05019921809434891,
-0.11761213839054108,
-0.056298427283763885,
-0.11078596860170364,
0.11149097234010696,
-0.10997192561626434,
0.004451960790902376,
0.008226660080254078,
0.04147138446569443,
0.0357796736061573,
-0.17751632630825043,
0.039092179387807846,
-0.0110219307243824,
0.004116474650800228,
-0.003979930654168129,
-0.038881152868270874,
0.08802895992994308,
0.12532277405261993,
-0.06785417348146439,
0.019244462251663208,
0.0043379999697208405,
0.175648495554924,
-0.03523094579577446,
0.022678963840007782,
0.2054009884595871,
-0.001801174134016037,
0.03044752962887287,
0.050045453011989594,
0.014147588983178139,
-0.12053085118532181,
0.057963982224464417,
0.04504602029919624,
-0.03403964638710022,
-0.26263290643692017,
-0.009732140228152275,
-0.06257429718971252,
0.027524705976247787,
0.0938335508108139,
0.06371160596609116,
-0.15848378837108612,
0.03701364994049072,
-0.001668523414991796,
0.13908076286315918,
-0.04111481085419655,
0.04109703004360199,
0.01807786524295807,
-0.011629092507064342,
0.006700287107378244,
-0.09021958708763123,
-0.007152814883738756,
0.06747426092624664,
0.09629587829113007,
0.20701256394386292,
-0.0652594268321991,
0.1973099261522293,
0.027245428413152695,
0.08092613518238068,
-0.007335915230214596,
0.12345916777849197,
-0.09672673791646957,
0.01268158107995987,
0.010636729188263416,
-0.015858972445130348,
-0.07325176894664764,
0.08446168154478073,
-0.01097784098237753,
0.04794535040855408,
-0.07198753952980042,
0.04422420263290405,
0.0076351650059223175,
0.16467095911502838,
0.05451882258057594,
-0.18240608274936676,
-0.06972122192382812,
0.03275367617607117,
-0.09572577476501465,
-0.10692165791988373,
0.06487895548343658,
0.20753724873065948,
-0.012737150304019451,
-0.0025376135017722845,
0.004227776080369949,
0.129522442817688,
-0.0657089576125145,
-0.03559141978621483,
0.027571382001042366,
0.031198222190141678,
0.04222387447953224,
0.14016085863113403,
-0.24983416497707367,
0.08522935956716537,
0.009750676341354847,
0.08747045695781708,
-0.032292358577251434,
0.07398548722267151,
-0.06207578256726265,
-0.006354896351695061,
0.10428237915039062,
0.011407961137592793,
-0.06428492814302444,
-0.20168626308441162,
-0.0793011337518692,
0.01524015236645937,
0.07783230394124985,
-0.03952701389789581,
0.0830984115600586,
-0.000018143749912269413,
0.056402675807476044,
-0.021194299682974815,
-0.10167426615953445,
-0.08006817102432251,
-0.149948850274086,
0.005262790247797966,
0.016475999727845192,
-0.035236068069934845,
-0.04639502987265587,
0.005191748961806297,
-0.036323972046375275,
0.24110953509807587,
-0.14512230455875397,
-0.1042976900935173,
-0.08915980905294418,
0.04518571123480797,
0.13962991535663605,
-0.08718308806419373,
0.012186702340841293,
0.04404165968298912,
0.02493329532444477,
-0.030757511034607887,
-0.0664159432053566,
0.02870861440896988,
-0.049172479659318924,
-0.09349388629198074,
-0.03508048132061958,
0.0932815670967102,
-0.003471995238214731,
0.03656324744224548,
-0.003078545443713665,
-0.08708277344703674,
-0.03446326404809952,
-0.12460814416408539,
-0.06678371131420135,
0.019154474139213562,
0.029354581609368324,
-0.004772873129695654,
-0.11394844949245453,
0.12002851068973541,
-0.016236111521720886,
-0.10662426799535751,
0.07052292674779892,
0.2310163378715515,
-0.06078612431883812,
0.05305422469973564,
0.09260690957307816,
-0.08159849792718887,
-0.16105736792087555,
-0.07200663536787033,
0.06052342802286148,
0.08562569320201874,
-0.01391243003308773,
-0.15461739897727966,
0.06819164752960205,
0.0031886680517345667,
0.024979347363114357,
0.02323935553431511,
-0.29570719599723816,
-0.14271490275859833,
0.06788389384746552,
0.08739794790744781,
0.034425195306539536,
-0.11644945293664932,
-0.03635237738490105,
-0.06052038073539734,
-0.059876952320337296,
0.04950374737381935,
0.09132970124483109,
0.12473567575216293,
-0.03196687623858452,
0.02382597140967846,
0.03285454213619232,
-0.020283034071326256,
0.11368200182914734,
0.01863674819469452,
0.10655427724123001,
-0.033770058304071426,
0.01714191772043705,
0.05515454709529877,
-0.058978550136089325,
0.15839286148548126,
-0.14650674164295197,
0.07570834457874298,
-0.23080292344093323,
-0.0637468472123146,
-0.010406668297946453,
-0.012563352473080158,
-0.0408257357776165,
-0.06356403231620789,
-0.09057573974132538,
0.00017933480557985604,
0.051513057202100754,
-0.01982719451189041,
0.042552608996629715,
-0.0316731259226799,
-0.08026187121868134,
0.08560136705636978,
0.09402170777320862,
0.0007536811172030866,
-0.07484658807516098,
0.003292429493740201,
0.023636318743228912,
0.08070406317710876,
-0.16519735753536224,
0.015563715249300003,
0.13238079845905304,
-0.0005431885947473347,
0.09949233382940292,
0.013406005688011646,
-0.07767964154481888,
0.028543440625071526,
0.07738368213176727,
-0.0544036440551281,
-0.09658225625753403,
-0.021931922063231468,
-0.03371372073888779,
-0.0757729709148407,
0.03053223341703415,
0.09969650954008102,
-0.06238628551363945,
-0.021731320768594742,
-0.017952483147382736,
0.006313478574156761,
-0.08068092167377472,
0.18833708763122559,
0.027750495821237564,
0.06516518443822861,
-0.05983780324459076,
0.0916006788611412,
0.08554209768772125,
-0.12129504233598709,
0.025132495909929276,
0.1400148719549179,
-0.06907869875431061,
-0.039390336722135544,
0.050992850214242935,
0.08347863703966141,
-0.030953308567404747,
-0.06846674531698227,
-0.09268903732299805,
-0.06710071861743927,
0.033648259937763214,
0.010887007229030132,
0.07704274356365204,
0.0886848047375679,
-0.031032457947731018,
0.020178893581032753,
-0.08720492571592331,
0.09426211565732956,
0.09153087437152863,
0.04847697168588638,
-0.16902777552604675,
0.1542583405971527,
0.016982777044177055,
0.1032615527510643,
-0.00478527182713151,
0.04512469097971916,
-0.07102593779563904,
0.05198383331298828,
-0.04091566801071167,
0.028122369199991226,
-0.0040620011277496815,
0.04624903202056885,
-0.005633345805108547,
0.022720228880643845,
-0.017495548352599144,
0.054015904664993286,
-0.06695660948753357,
-0.02753186970949173,
-0.03347793594002724,
0.037491559982299805,
-0.056414347141981125,
-0.04639779031276703,
0.003379508852958679,
-0.07859403640031815,
0.10413355380296707,
-0.04193674400448799,
-0.026909496635198593,
-0.006233559921383858,
-0.032331082969903946,
0.0964806005358696,
0.03125851973891258,
0.06362684071063995,
-0.031088339164853096,
-0.007905052974820137,
0.029586154967546463,
0.035245489329099655,
-0.012187868356704712,
-0.017913153395056725,
0.05070545896887779,
-0.1474592536687851,
-0.08177205920219421,
-0.08991766721010208,
-0.07375022768974304,
-0.06833495199680328,
0.0805295929312706,
0.07675004005432129,
0.09275446832180023,
0.08428844809532166,
-0.014475126750767231,
0.002558619249612093,
-0.1278824806213379,
-0.020912425592541695,
0.060170095413923264,
0.006271407473832369,
-0.10513271391391754,
-0.06868547946214676,
0.04913724958896637,
-0.03499835357069969,
0.13037817180156708,
-0.02624472603201866,
0.0014296298613771796,
-0.029235024005174637,
-0.06352433562278748,
0.018750296905636787,
0.014526007696986198,
0.21598395705223083,
-0.10710710287094116,
0.03350142762064934,
-0.008663213811814785,
-0.03022695891559124,
0.04573682323098183,
0.162491112947464,
0.06502785533666611,
0.1375969797372818,
0.0683770552277565,
0.1142793819308281,
-0.04908536374568939,
-0.028107495978474617,
-0.1632188856601715,
0.031629305332899094,
0.006799258757382631,
0.049562323838472366,
-0.05015195906162262,
0.13694794476032257,
0.09551965445280075,
-0.11488956212997437,
0.06852418184280396,
0.031560927629470825,
-0.09648958593606949,
-0.051869455724954605,
-0.09111669659614563,
-0.04442320764064789,
-0.08477213978767395,
0.01818174310028553,
-0.10204896330833435,
0.0255388543009758,
0.029633820056915283,
0.04156463220715523,
-0.020672274753451347,
0.14207413792610168,
-0.0396416038274765,
-0.03760150820016861,
0.009568469598889351,
0.010036439634859562,
0.03017456829547882,
0.08911357074975967,
-0.0031211988534778357,
0.0657520443201065,
-0.03948241099715233,
0.059589434415102005,
0.02810152992606163,
-0.0033652782440185547,
0.011831632815301418,
0.00295851263217628,
0.00728042284026742,
-0.04836609587073326,
0.003996671177446842,
0.09688948839902878,
0.17550727725028992,
0.03564513102173805,
-0.06913094222545624,
-0.04966888576745987,
0.15373580157756805,
-0.06991056352853775,
-0.060601718723773956,
-0.09077884256839752,
0.11584462970495224,
0.04649516940116882,
0.03549119830131531,
0.007721399422734976,
-0.07986419647932053,
-0.05678785964846611,
0.2257682979106903,
0.03049519844353199,
-0.029711585491895676,
-0.03627276420593262,
0.008140485733747482,
-0.0012569921091198921,
-0.06769774109125137,
0.14045970141887665,
0.006684973370283842,
0.20013122260570526,
-0.009100443683564663,
-0.005892507266253233,
-0.0457429401576519,
-0.020310178399086,
-0.036790113896131516,
0.19841229915618896,
-0.04895210638642311,
0.016198083758354187,
-0.08329986035823822,
-0.014427871443331242,
0.024472087621688843,
-0.10105733573436737,
0.12408192455768585,
-0.06140494346618652,
-0.06939766556024551,
0.041021376848220825,
0.09221020340919495,
-0.04096153751015663,
0.02982689067721367,
-0.03795154020190239,
0.05278818681836128,
0.06674759835004807,
-0.024991163983941078,
-0.11093762516975403,
-0.08983217924833298,
0.0492035411298275,
-0.03373130038380623,
0.1534052938222885,
0.02154652401804924,
0.1059422641992569,
0.06806426495313644,
0.01309992466121912,
-0.09392476081848145,
0.11090195924043655,
0.03369844704866409,
0.010283363051712513,
0.08052904158830643,
0.14125321805477142,
-0.04706738889217377,
0.13912007212638855,
-0.012367123737931252,
-0.018046988174319267,
-0.013195611536502838,
-0.00526332575827837,
-0.016242559999227524,
-0.13977286219596863,
-0.0049850777722895145,
-0.09019232541322708,
0.1323300004005432,
0.15655355155467987,
-0.04585109278559685,
-0.02648838981986046,
-0.039219457656145096,
0.07746534794569016,
-0.03132133558392525,
0.07876521348953247,
0.014894225634634495,
-0.16609716415405273,
0.011701003648340702,
0.0264875590801239,
0.03727170452475548,
-0.16078345477581024,
-0.05878850445151329,
-0.040934737771749496,
-0.05151486024260521,
-0.08877526968717575,
0.1505342423915863,
0.08374908566474915,
0.01926947385072708,
-0.03824950009584427,
-0.20421095192432404,
-0.019981972873210907,
0.05143062025308609,
-0.14301013946533203,
-0.11977430433034897
] |
null | null |
transformers
|
# CodeTrans model for program synthesis
Pretrained model on programming language lisp inspired DSL using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the program synthesis task for the lisp inspired DSL code.
## Intended uses & limitations
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
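# Build a summarization pipeline around the fine-tuned checkpoint and its SentencePiece tokenizer.
# Note: recent transformers releases deprecate AutoModelWithLMHead; AutoModelForSeq2SeqLM is the newer equivalent.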
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_program_synthese_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
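# The input is a plain-English description of the task; the pipeline returns the generated DSL program.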
tokenized_code = "you are given an array of numbers a and a number b , compute the difference of elements in a and b"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/transfer%20learning%20fine-tuning/large_model.ipynb).
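The pipeline returns one dictionary per input. A minimal sketch of extracting the generated program, assuming the standard `SummarizationPipeline` output format (a list of dicts keyed by `summary_text`):
```python
result = pipeline([tokenized_code])
# Each entry looks like {"summary_text": "..."}; print the generated DSL program.
print(result[0]["summary_text"])
```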
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
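For illustration, the inverse square root schedule mentioned above can be sketched as follows. This is a hypothetical reconstruction in the style of the T5 pre-training recipe; the 10,000-step warmup is an assumption, as the card does not state the exact constants:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Assumed form: hold the rate at 1/sqrt(warmup_steps) during warmup,
    # then decay it proportionally to 1/sqrt(step).
    return max(step, warmup_steps) ** -0.5
```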
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 3,500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp inspired DSL data.
## Evaluation results
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | LISP |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 89.43 |
| CodeTrans-ST-Base | 89.65 |
| CodeTrans-TF-Small | 90.30 |
| CodeTrans-TF-Base | 90.24 |
| CodeTrans-TF-Large | 90.21 |
| CodeTrans-MT-Small | 82.88 |
| CodeTrans-MT-Base | 86.99 |
| CodeTrans-MT-Large | 90.27 |
| CodeTrans-MT-TF-Small | **90.31** |
| CodeTrans-MT-TF-Base | 90.30 |
| CodeTrans-MT-TF-Large | 90.17 |
| State of the art | 85.80 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "you are given an array of numbers a and a number b , compute the difference of elements in a and b"}]}
|
summarization
|
SEBIS/code_trans_t5_large_program_synthese_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for program synthesis
=====================================
Pretrained model on programming language lisp inspired DSL using the t5 large model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the program synthesis task for the lisp inspired DSL code.
Intended uses & limitations
---------------------------
The model could be used to generate lisp inspired DSL code given a human language description of the task.
### How to use
Here is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 3,500 steps in total, using sequence length 512 (batch size 256), using only the dataset containing lisp inspired DSL data.
Evaluation results
------------------
For the program synthesis task, the different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
63,
87,
113
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate lisp inspired DSL code using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 3,500 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing lisp inspired DSL data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10272372514009476,
0.07141919434070587,
-0.001538818352855742,
0.11314376443624496,
0.06247517839074135,
0.012302834540605545,
0.09640771895647049,
0.07943756878376007,
-0.047222789376974106,
0.04023699834942818,
0.07541350275278091,
0.024813037365674973,
0.05066752806305885,
0.19609536230564117,
0.0384012795984745,
-0.15987630188465118,
0.00037281704135239124,
0.03379175812005997,
-0.05753476172685623,
0.11482920497655869,
0.09210021793842316,
-0.0904088169336319,
0.06940672546625137,
-0.05255081504583359,
-0.12887312471866608,
0.05805603787302971,
-0.0586283877491951,
-0.030044175684452057,
0.06784592568874359,
0.05970417708158493,
0.09900747239589691,
-0.010089847259223461,
0.10377534478902817,
-0.19327066838741302,
-0.008077229373157024,
0.01807083934545517,
0.02969379909336567,
0.024182815104722977,
0.07809744775295258,
0.058565184473991394,
0.14989341795444489,
-0.027192438021302223,
0.028784310445189476,
0.049770206212997437,
-0.06359625607728958,
-0.08055286854505539,
-0.058172378689050674,
0.07752159982919693,
0.14020884037017822,
0.09826734662055969,
-0.014058046042919159,
0.00959740299731493,
-0.07424788922071457,
0.09698092192411423,
0.1409451961517334,
-0.20251967012882233,
-0.02246798202395439,
0.0594569556415081,
0.0631955936551094,
0.0645155981183052,
-0.036749836057424545,
-0.03776520863175392,
0.0954979658126831,
0.02566712722182274,
0.05387743562459946,
-0.08143329620361328,
-0.04865521565079689,
0.000058640813222154975,
-0.08945766091346741,
-0.05435550957918167,
0.15892428159713745,
0.03201288729906082,
-0.054729264229536057,
-0.11440648883581161,
-0.037121038883924484,
-0.1865835189819336,
0.03751782327890396,
0.016798630356788635,
0.025299934670329094,
-0.00994657352566719,
0.05765062943100929,
-0.012843858450651169,
-0.10999757051467896,
-0.11464615166187286,
0.042680852115154266,
0.06844274699687958,
0.07318351417779922,
0.019223444163799286,
-0.00595758855342865,
0.10325810313224792,
0.07480129599571228,
-0.030268503352999687,
-0.01683826744556427,
0.0020934075582772493,
-0.1099470853805542,
0.04909788817167282,
-0.05047976225614548,
-0.08503729104995728,
-0.05208629369735718,
0.0809493362903595,
0.021104630082845688,
0.052301693707704544,
0.14985372126102448,
0.008871277794241905,
-0.01535516045987606,
0.20111900568008423,
0.0006076967692933977,
-0.08426643162965775,
-0.003366802353411913,
0.03889639675617218,
-0.005815739277750254,
-0.010226035490632057,
-0.08422842621803284,
-0.03205007314682007,
-0.011555378325283527,
0.07413094490766525,
-0.12623174488544464,
0.028468696400523186,
-0.046850841492414474,
-0.02768096514046192,
0.054989248514175415,
-0.14927954971790314,
0.020356258377432823,
-0.0024562927428632975,
-0.05830049514770508,
-0.06172853708267212,
0.08629602938890457,
-0.09738117456436157,
-0.11821158975362778,
0.006806692574173212,
-0.03262212872505188,
-0.027775898575782776,
-0.13625892996788025,
-0.12479732185602188,
-0.012711351737380028,
-0.07646951079368591,
0.005798663944005966,
-0.10117306560277939,
-0.1454804539680481,
-0.026713859289884567,
0.055702026933431625,
0.0035641093272715807,
-0.0038681072182953358,
-0.06856781244277954,
0.009038017131388187,
-0.012113349512219429,
-0.026577338576316833,
0.0261565949767828,
-0.020308135077357292,
0.09990968555212021,
0.0880892351269722,
0.04911412298679352,
-0.010876316577196121,
0.04155832529067993,
-0.06725519150495529,
0.05882657691836357,
-0.036797791719436646,
0.09086116403341293,
-0.020597917959094048,
0.062335748225450516,
-0.07734899967908859,
-0.08112265169620514,
0.05847472324967384,
0.05219760164618492,
0.07065646350383759,
0.03991616517305374,
-0.14390306174755096,
0.009092562831938267,
0.11787149310112,
-0.09486474096775055,
-0.10993416607379913,
0.08107949793338776,
-0.009219683706760406,
0.04037202522158623,
0.0723029151558876,
0.1298181414604187,
0.1529240608215332,
-0.06642349809408188,
-0.037606120109558105,
0.06726840883493423,
0.07609997689723969,
-0.1024308130145073,
0.07781775295734406,
0.03731884807348251,
0.02905052714049816,
0.040539149194955826,
-0.014729611575603485,
0.08244072645902634,
-0.010764194652438164,
-0.05119071528315544,
-0.012972189113497734,
-0.09769685566425323,
-0.053063102066516876,
-0.007776335347443819,
0.03580638766288757,
-0.03725837543606758,
-0.07353442907333374,
0.038981515914201736,
0.16017232835292816,
-0.11827098578214645,
0.042927347123622894,
-0.08751381933689117,
-0.03185928240418434,
-0.09269760549068451,
0.012474311515688896,
-0.12397710978984833,
0.02170243300497532,
0.03888300806283951,
-0.02877577394247055,
0.07854349166154861,
0.07703761756420135,
0.004117170348763466,
0.03628377616405487,
-0.02801642194390297,
-0.0531790517270565,
-0.060253892093896866,
-0.06527863442897797,
-0.1341738998889923,
-0.020607711747288704,
-0.12904739379882812,
-0.03957832604646683,
-0.054845504462718964,
-0.1498548686504364,
0.01801689714193344,
-0.04720228537917137,
0.03191563859581947,
0.009703104384243488,
-0.03479493036866188,
0.03435538709163666,
0.06306028366088867,
-0.04747145250439644,
-0.08023139834403992,
0.02179887518286705,
-0.000051258557505207136,
-0.12408600747585297,
-0.02992173656821251,
-0.12222881615161896,
-0.050593435764312744,
0.0703921765089035,
0.049103666096925735,
-0.062056202441453934,
-0.002772626467049122,
-0.03383908048272133,
-0.04606911912560463,
0.019671326503157616,
-0.0747494176030159,
0.14014577865600586,
0.0033937266562134027,
0.15210336446762085,
-0.13657954335212708,
-0.06441676616668701,
-0.0457756333053112,
0.012047862634062767,
-0.00548446225002408,
0.15727761387825012,
0.05119933560490608,
-0.03426321968436241,
0.03617378696799278,
0.02880524843931198,
-0.021221725270152092,
0.1314241886138916,
-0.014781772159039974,
-0.10565082728862762,
0.014643009752035141,
0.09342607855796814,
-0.015406004153192043,
0.10523111373186111,
-0.041361067444086075,
-0.024472391232848167,
0.004621912259608507,
0.004894922021776438,
0.0449773408472538,
-0.14266160130500793,
0.011776357889175415,
0.055533744394779205,
-0.06873627752065659,
-0.026265015825629234,
-0.021177954971790314,
-0.06091824918985367,
0.04147486016154289,
-0.02184930071234703,
-0.008443886414170265,
0.002908776979893446,
-0.03008570522069931,
-0.08418799191713333,
0.1797533929347992,
-0.09070397913455963,
-0.19606219232082367,
-0.1872534155845642,
0.0191268939524889,
-0.05164428427815437,
0.014940188266336918,
0.041223566979169846,
-0.12217947840690613,
-0.06772823631763458,
-0.11096495389938354,
0.12850512564182281,
-0.10527949035167694,
0.0029630844946950674,
-0.002235012361779809,
0.032144203782081604,
0.029474731534719467,
-0.18601632118225098,
0.044946905225515366,
-0.02138950489461422,
-0.0025834764819592237,
-0.0038943372201174498,
-0.04286335036158562,
0.098432257771492,
0.11795076727867126,
-0.06067948788404465,
0.02020132727921009,
-0.0029761306941509247,
0.19548581540584564,
-0.03415786102414131,
0.020822087302803993,
0.1910039335489273,
0.006964345462620258,
0.03555857390165329,
0.05794821307063103,
0.01652914099395275,
-0.13319170475006104,
0.05667782574892044,
0.045350950211286545,
-0.030290869995951653,
-0.25214898586273193,
-0.009818822145462036,
-0.06763820350170135,
0.024132708087563515,
0.10215309262275696,
0.055644441395998,
-0.1424238383769989,
0.03904053196310997,
-0.012433338910341263,
0.15096993744373322,
-0.03533752262592316,
0.045983683317899704,
0.0023226761259138584,
-0.003613770939409733,
0.005754425190389156,
-0.08596577495336533,
-0.004420801531523466,
0.057522568851709366,
0.08908407390117645,
0.23234820365905762,
-0.07029592990875244,
0.2155587077140808,
0.019208166748285294,
0.08861558884382248,
-0.002630190923810005,
0.1441235989332199,
-0.09962508827447891,
0.023798968642950058,
0.010360737331211567,
-0.022992489859461784,
-0.08677226305007935,
0.0780348926782608,
-0.024459874257445335,
0.04165169224143028,
-0.07192422449588776,
0.03195686638355255,
-0.004910857882350683,
0.16837072372436523,
0.04592817649245262,
-0.20367646217346191,
-0.0622752383351326,
0.02896931953728199,
-0.08032061904668808,
-0.10160666704177856,
0.06072157993912697,
0.17858931422233582,
-0.007079017348587513,
-0.0061970241367816925,
-0.008243116550147533,
0.1323794722557068,
-0.07621026039123535,
-0.04107660427689552,
0.030977893620729446,
0.05608058348298073,
0.03652532398700714,
0.12584558129310608,
-0.2494712620973587,
0.10206952691078186,
0.01838812604546547,
0.08481871336698532,
-0.03471292182803154,
0.07430896908044815,
-0.06600603461265564,
-0.009703487157821655,
0.10269888490438461,
0.006076894700527191,
-0.05401553586125374,
-0.2164505422115326,
-0.07963699847459793,
0.014936354011297226,
0.08454561233520508,
-0.041665319353342056,
0.08914075046777725,
0.0032690810039639473,
0.05088970065116882,
-0.01777132786810398,
-0.08688808232545853,
-0.08658820390701294,
-0.1717495173215866,
0.004937062505632639,
0.013577165082097054,
-0.015003754757344723,
-0.04208267852663994,
0.015311198309063911,
-0.03727317228913307,
0.23087966442108154,
-0.14626920223236084,
-0.11064271628856659,
-0.09081734716892242,
0.05155977979302406,
0.14664863049983978,
-0.08921434730291367,
0.0165251512080431,
0.04273422434926033,
0.0192066989839077,
-0.029464617371559143,
-0.0883941799402237,
0.035257723182439804,
-0.04954078420996666,
-0.07924302667379379,
-0.035726360976696014,
0.08534412086009979,
0.013697050511837006,
0.03266412392258644,
-0.012846680358052254,
-0.08989418298006058,
-0.03170182183384895,
-0.11515682935714722,
-0.07737346738576889,
0.03529900684952736,
0.02837635576725006,
0.0012353401398286223,
-0.12841956317424774,
0.10771406441926956,
-0.024798084050416946,
-0.08647510409355164,
0.058139950037002563,
0.2012973129749298,
-0.05817485228180885,
0.045199114829301834,
0.11282873898744583,
-0.07367537915706635,
-0.1596643626689911,
-0.06556810438632965,
0.05691481754183769,
0.07686609774827957,
-0.024929555132985115,
-0.17493148148059845,
0.060909539461135864,
0.0024635132867842913,
0.035114120692014694,
0.030028140172362328,
-0.29778361320495605,
-0.13906565308570862,
0.06584013253450394,
0.09572893381118774,
0.01761160045862198,
-0.12166202068328857,
-0.036650650203228,
-0.0643015205860138,
-0.07000312954187393,
0.054408665746450424,
0.11748075485229492,
0.12645713984966278,
-0.03893827274441719,
0.020985497161746025,
0.031362053006887436,
-0.02372484654188156,
0.09593196213245392,
0.02769569493830204,
0.12633249163627625,
-0.03769640997052193,
0.031256772577762604,
0.061743468046188354,
-0.059586234390735626,
0.1620243489742279,
-0.15549229085445404,
0.07190724462270737,
-0.23003531992435455,
-0.06724579632282257,
-0.016106806695461273,
-0.01844054087996483,
-0.03560696169734001,
-0.06990677118301392,
-0.09853111952543259,
0.006292718928307295,
0.057336077094078064,
-0.014936613850295544,
0.03238663077354431,
-0.02604888752102852,
-0.07661145180463791,
0.07648934423923492,
0.09397342056035995,
-0.01022360473871231,
-0.0745643898844719,
0.0004979912773706019,
0.02951783686876297,
0.08718830347061157,
-0.1504940688610077,
-0.002238507615402341,
0.1288689374923706,
-0.004553495906293392,
0.10606678575277328,
0.013457894325256348,
-0.07766639441251755,
0.021409381181001663,
0.07907172292470932,
-0.05193736031651497,
-0.09590313583612442,
-0.022463548928499222,
-0.025092823430895805,
-0.08319858461618423,
0.03633050620555878,
0.08433052152395248,
-0.061910249292850494,
-0.022603847086429596,
-0.020900703966617584,
0.0061662220396101475,
-0.07523448765277863,
0.18688279390335083,
0.028681574389338493,
0.06262621283531189,
-0.06804856657981873,
0.08733122050762177,
0.09311288595199585,
-0.09874569624662399,
0.022291043773293495,
0.13466762006282806,
-0.07564152032136917,
-0.03476576879620552,
0.06514932215213776,
0.07893891632556915,
-0.04252172261476517,
-0.05420631915330887,
-0.07532766461372375,
-0.06421767920255661,
0.04072987288236618,
0.017835620790719986,
0.06675256043672562,
0.086812824010849,
-0.031916916370391846,
0.020420178771018982,
-0.09416165947914124,
0.07639561593532562,
0.08256583660840988,
0.05133085325360298,
-0.15357941389083862,
0.16521593928337097,
0.01786195859313011,
0.08895813673734665,
-0.002601681277155876,
0.04463215544819832,
-0.06384442001581192,
0.048329420387744904,
-0.022970518097281456,
0.004903456661850214,
-0.016933057457208633,
0.04251040518283844,
-0.016420897096395493,
0.035163406282663345,
-0.011549623683094978,
0.05220133438706398,
-0.068614661693573,
-0.029394984245300293,
-0.02408459037542343,
0.024697845801711082,
-0.06648644059896469,
-0.0469033345580101,
-0.008590339682996273,
-0.08014417439699173,
0.0991627499461174,
-0.04199588671326637,
-0.02514869160950184,
-0.0021478389389812946,
-0.02642267569899559,
0.08288773894309998,
0.022621463984251022,
0.060579318553209305,
-0.02774837799370289,
0.013526002876460552,
0.020984333008527756,
0.028041360899806023,
-0.01834775134921074,
-0.023219600319862366,
0.04537566006183624,
-0.15216955542564392,
-0.07261741161346436,
-0.07692376524209976,
-0.05261429026722908,
-0.06605264544487,
0.08853273093700409,
0.07554808259010315,
0.07733721286058426,
0.08112575113773346,
-0.007305394392460585,
-0.004930883180350065,
-0.132187157869339,
-0.025217745453119278,
0.061697933822870255,
0.001479066675528884,
-0.10692650079727173,
-0.08250695466995239,
0.05295547842979431,
-0.026202505454421043,
0.13320501148700714,
-0.02162914350628853,
-0.00816280860453844,
-0.03789573162794113,
-0.058269597589969635,
0.023829275742173195,
0.007681184448301792,
0.21849393844604492,
-0.09524170309305191,
0.027821432799100876,
0.005056978203356266,
-0.009613347239792347,
0.044695671647787094,
0.16312886774539948,
0.08595190942287445,
0.12569652497768402,
0.06005334481596947,
0.1046605259180069,
-0.04428988695144653,
-0.0311588067561388,
-0.14627932012081146,
0.014633850194513798,
0.019018584862351418,
0.05411659553647041,
-0.05231663957238197,
0.14495763182640076,
0.08678936958312988,
-0.1098707765340805,
0.07678147405385971,
0.04942801967263222,
-0.09807949513196945,
-0.04462166130542755,
-0.10186505317687988,
-0.03400968015193939,
-0.07933076471090317,
0.015416582114994526,
-0.1013488620519638,
0.01586775854229927,
0.027294879779219627,
0.039555978029966354,
-0.01570100151002407,
0.15561923384666443,
-0.02907346747815609,
-0.03951151669025421,
0.018359273672103882,
0.007438522297888994,
0.018350908532738686,
0.06747014820575714,
-0.002224565716460347,
0.060549210757017136,
-0.04802162945270538,
0.051747582852840424,
0.028810296207666397,
-0.016928190365433693,
0.020865151658654213,
0.017685094848275185,
0.004021379165351391,
-0.050185125321149826,
0.015087293460965157,
0.09323100745677948,
0.19220946729183197,
0.028698313981294632,
-0.06685610115528107,
-0.05457666516304016,
0.13967840373516083,
-0.06511051207780838,
-0.04400299862027168,
-0.0794575959444046,
0.12078624218702316,
0.05010956898331642,
0.03053447976708412,
0.007474855054169893,
-0.08075397461652756,
-0.053744394332170486,
0.23307564854621887,
0.047179318964481354,
-0.03292996436357498,
-0.03441012650728226,
-0.004721099976450205,
-0.00022099945636000484,
-0.07813891023397446,
0.15170057117938995,
0.019272996112704277,
0.20493105053901672,
-0.00531962513923645,
-0.006005248986184597,
-0.04364415258169174,
-0.021524954587221146,
-0.026629677042365074,
0.19144776463508606,
-0.04212174937129021,
0.007011755369603634,
-0.07990404218435287,
-0.0177980437874794,
0.01724827103316784,
-0.10132639110088348,
0.11784626543521881,
-0.056716084480285645,
-0.05830469727516174,
0.025019381195306778,
0.0709463357925415,
-0.028141386806964874,
0.037825170904397964,
-0.037106383591890335,
0.05693414807319641,
0.08541519194841385,
-0.02009624056518078,
-0.12194187939167023,
-0.10251825302839279,
0.05889612436294556,
-0.03161017969250679,
0.1483258605003357,
0.02495664916932583,
0.09126519411802292,
0.06561145931482315,
0.01093953475356102,
-0.09515996277332306,
0.11053111404180527,
0.038200534880161285,
0.02306579239666462,
0.07776553928852081,
0.12995915114879608,
-0.040470149368047714,
0.14609394967556,
-0.033470455557107925,
-0.020900459960103035,
-0.0014380927896127105,
0.001147770555689931,
-0.012174413539469242,
-0.1335858553647995,
-0.008229076862335205,
-0.08579085022211075,
0.132979154586792,
0.16203583776950836,
-0.042443618178367615,
-0.026052214205265045,
-0.04493311792612076,
0.07200077921152115,
-0.025635013356804848,
0.06375835835933685,
0.01107027381658554,
-0.15584951639175415,
0.004403964150696993,
0.03443635255098343,
0.03368491679430008,
-0.16048766672611237,
-0.04461688920855522,
-0.04873783513903618,
-0.060389939695596695,
-0.08779417723417282,
0.14835761487483978,
0.08091619610786438,
0.0201518964022398,
-0.037270959466695786,
-0.15915219485759735,
-0.02417212538421154,
0.04943538457155228,
-0.15840421617031097,
-0.10950317978858948
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
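# Build a summarization pipeline from the multi-task CodeTrans checkpoint.
# device=0 selects the first GPU; pass device=-1 to run on CPU instead.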
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask", skip_special_tokens=True),
device=0
)
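# The input is a tokenized C# function; numeric literals are abstracted as CODE_INTEGER placeholders.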
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/csharp/large_model.ipynb).
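As noted above, the model also accepts unparsed, untokenized C#, typically at some cost in quality. A minimal sketch with a hypothetical raw function body:
```python
raw_code = "public static int Add(int a, int b) { return a + b; }"
# Raw code works as-is, though tokenized input tends to score better.
print(pipeline([raw_code])[0]["summary_text"])
```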
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
50,
62,
146
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1287207007408142,
-0.0000357325843651779,
0.0001143340123235248,
0.12556146085262299,
0.11329541355371475,
0.019848283380270004,
0.10129964351654053,
0.05009119212627411,
-0.049380671232938766,
0.026828870177268982,
0.029640814289450645,
0.01632949709892273,
0.055699922144412994,
0.19256456196308136,
0.023923233151435852,
-0.2297435700893402,
0.03888353332877159,
0.01759367808699608,
-0.11704479902982712,
0.10769627243280411,
0.10130022466182709,
-0.0749170333147049,
0.04166371002793312,
0.001163478009402752,
-0.19751383364200592,
0.047053262591362,
-0.011997358873486519,
-0.09012730419635773,
0.12289503216743469,
0.07705075293779373,
0.13457824289798737,
0.017939791083335876,
0.06232535094022751,
-0.0845378041267395,
0.012424247339367867,
0.052669115364551544,
0.023679256439208984,
0.041913896799087524,
0.06575021147727966,
0.041038550436496735,
0.17817656695842743,
0.021678950637578964,
0.06607101857662201,
0.062196094542741776,
-0.069624163210392,
-0.13329578936100006,
-0.033422183245420456,
0.04390822350978851,
0.07762584090232849,
0.08275293558835983,
… (remaining values of the 768-dimensional embedding vector omitted)
] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the csharp code snippets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
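The tokenizer used to build the training corpus is not included in this card; as a rough sketch under that assumption, pre-tokenizing a csharp snippet can be approximated by space-separating identifiers, literals, and punctuation (`rough_tokenize` is an illustrative helper, not the project's actual preprocessor):

```python
import re

def rough_tokenize(code: str) -> str:
    # Crude approximation: pull out identifiers, numbers, and single
    # punctuation characters, then rejoin them with single spaces.
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", code)
    return " ".join(tokens)

print(rough_tokenize("int Add(int a, int b) { return a + b; }"))
# int Add ( int a , int b ) { return a + b ; }
```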
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/csharp/large_model.ipynb).
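The pipeline returns one dictionary per input; as a small illustration (the generated text itself will vary), the summary string can be read out like this:

```python
result = pipeline([tokenized_code])
# Each input yields a dict with a "summary_text" key.
print(result[0]["summary_text"])
```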
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
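The schedule's hyperparameters are not given here; the following is a minimal sketch, assuming the standard T5-style pairing of AdaFactor with inverse square root decay (`warmup_steps` and `base_lr` are illustrative values, not the training configuration):

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, base_lr: float = 0.01) -> float:
    # Constant at base_lr / sqrt(warmup_steps) during warmup,
    # then decays proportionally to 1 / sqrt(step).
    return base_lr / max(step, warmup_steps) ** 0.5
```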
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256) and only the dataset containing csharp code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_csharp_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the csharp code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256) and only the dataset containing csharp code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
62,
88,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ … (768-dimensional embedding vector omitted) ] |
null | null |
transformers
|
# CodeTrans model for source code summarization csharp
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.
## Intended uses & limitations
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_csharp_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/csharp/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256) and only the dataset containing csharp code.
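The fine-tuning code itself is not part of this card; a minimal sketch of one supervised step for this kind of code-to-summary objective, assuming a single (code, description) pair and the standard encoder-decoder interface (the pair below is a made-up example), might look like this:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_large_source_code_summarization_csharp_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

code = "public int Add ( int a , int b ) { return a + b ; }"
summary = "Adds two integers and returns the result."

inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(summary, return_tensors="pt", truncation=True).input_ids

# The model computes the cross-entropy loss against the labels internally.
loss = model(**inputs, labels=labels).loss
loss.backward()
```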
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
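The exact BLEU configuration behind these numbers is not specified in the card; as a sketch of how such corpus-level scores can be computed, one option is sacrebleu (the hypothesis and reference strings below are placeholders):

```python
import sacrebleu

hypotheses = ["adds two integers and returns the result"]
references = [["adds two integers and returns the sum"]]  # one reference stream

# corpus_bleu takes a list of hypotheses and a list of reference streams.
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```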
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static DateTime ParseUnixDateTime ( double unixTime ) { var dt = new DateTime ( CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , CODE_INTEGER , System . DateTimeKind . Utc ) ; dt = dt . AddSeconds ( unixTimeStamp ) . ToLocalTime ( ) ; return dt ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_csharp_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization csharp
====================================================
Pretrained model on programming language csharp using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized csharp code functions: it works best with tokenized csharp functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the csharp code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the csharp function or be fine-tuned on other csharp code tasks. It can be used on unparsed and untokenized csharp code. However, if the csharp code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256) and only the dataset containing csharp code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
62,
87,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate csharp function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing csharp code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
… (values of the 768-dimensional embedding vector omitted)
0.09045404940843582,
-0.2202952653169632,
-0.056572724133729935,
-0.013715107925236225,
-0.007662896532565355,
-0.042051125317811966,
-0.05588661879301071,
-0.10276168584823608,
0.011197302490472794,
0.05269867926836014,
-0.022827476263046265,
0.0463956817984581,
-0.03148059919476509,
-0.06208881363272667,
0.05659429356455803,
0.0975809320807457,
-0.02886125259101391,
-0.11004311591386795,
0.0176063384860754,
0.03075745329260826,
0.08823135495185852,
-0.1846318542957306,
0.024048134684562683,
0.13120295107364655,
0.00499147642403841,
0.10582171380519867,
0.02448662742972374,
-0.06892464309930801,
0.04270146042108536,
0.07175453007221222,
-0.04812261462211609,
-0.08471308648586273,
-0.013113956898450851,
-0.011398057453334332,
-0.08742771297693253,
0.03011207841336727,
0.09032201766967773,
-0.055378299206495285,
-0.01932368613779545,
-0.014313219115138054,
-0.014219011180102825,
-0.07114071398973465,
0.18517214059829712,
0.03066769428551197,
0.08296725898981094,
-0.04706616327166557,
0.08051661401987076,
0.09614953398704529,
-0.11932511627674103,
0.008502730168402195,
0.1442103236913681,
-0.0818319320678711,
-0.02780872955918312,
0.058597058057785034,
0.08519812673330307,
-0.06047729030251503,
-0.06322037428617477,
-0.0963035598397255,
-0.07364747673273087,
0.01850590482354164,
0.05203091353178024,
0.06531353294849396,
0.08936120569705963,
-0.034854236990213394,
0.021741800010204315,
-0.09842721372842789,
0.09184891730546951,
0.06912748515605927,
0.04833698645234108,
-0.13786298036575317,
0.17643071711063385,
0.020889176055788994,
0.09358566254377365,
-0.004380557220429182,
0.03877270966768265,
-0.08293716609477997,
0.04229617863893509,
-0.036036137491464615,
0.03996674343943596,
-0.01544432993978262,
0.039338063448667526,
-0.01941581629216671,
0.031700119376182556,
-0.02407699264585972,
0.05360469967126846,
-0.045320622622966766,
-0.022809738293290138,
-0.025819988921284676,
0.0400664359331131,
-0.052608735859394073,
-0.02311309427022934,
0.012897779233753681,
-0.08127617835998535,
0.10193365812301636,
-0.05409283936023712,
-0.007475121412426233,
0.003695643739774823,
-0.01808284968137741,
0.07350174337625504,
0.012447952292859554,
0.056903887540102005,
-0.011683182790875435,
0.006171071901917458,
0.04181675240397453,
0.02144506387412548,
-0.01580776274204254,
-0.012988322414457798,
0.03911762312054634,
-0.1351836770772934,
-0.09733446687459946,
-0.09484320133924484,
-0.06132432818412781,
-0.08071146160364151,
0.08248931914567947,
0.0909818485379219,
0.06822168827056885,
0.0844806358218193,
-0.030061308294534683,
0.003677947912365198,
-0.13274934887886047,
-0.03472688049077988,
0.05406087636947632,
-0.008516160771250725,
-0.0838174819946289,
-0.06268006563186646,
0.05666240304708481,
-0.04779455438256264,
0.11924536526203156,
-0.02822905033826828,
0.04875744506716728,
-0.012938380241394043,
-0.0604904443025589,
-0.002856409875676036,
0.01817592978477478,
0.2324344366788864,
-0.09493488073348999,
0.01591353490948677,
0.0022787614725530148,
-0.003586288308724761,
0.03647693991661072,
0.14455391466617584,
0.09209775179624557,
0.12991347908973694,
0.044515546411275864,
0.10854706168174744,
-0.05140738934278488,
-0.035372935235500336,
-0.1600007265806198,
0.034202948212623596,
0.0018132281256839633,
0.037108518183231354,
-0.027475154027342796,
0.10403349995613098,
0.1252143532037735,
-0.12790153920650482,
0.083920419216156,
0.009895427152514458,
-0.10359951108694077,
-0.0347927026450634,
-0.04460740461945534,
-0.04467641934752464,
-0.08830318599939346,
0.019338315352797508,
-0.11145172268152237,
0.013787450268864632,
0.071019247174263,
0.0390852689743042,
-0.0261150561273098,
0.15935282409191132,
-0.031733106821775436,
-0.05499167740345001,
0.021004054695367813,
0.026444215327501297,
0.04726330190896988,
0.08470468968153,
0.0035319658927619457,
0.07106863707304001,
-0.06436782330274582,
0.06778106093406677,
0.008020770736038685,
0.019899539649486542,
0.00824013538658619,
0.0025897722225636244,
-0.0005635519046336412,
-0.039815209805965424,
-0.0029020793735980988,
0.07390601933002472,
0.1561938375234604,
0.03210975229740143,
-0.04900715500116348,
-0.049957260489463806,
0.1659933626651764,
-0.056967560201883316,
-0.06848625838756561,
-0.12160711735486984,
0.14082610607147217,
0.05539189651608467,
0.02974536456167698,
0.011683262884616852,
-0.0841836929321289,
-0.04020525515079498,
0.22112229466438293,
0.01681697927415371,
-0.0370684489607811,
-0.03235755115747452,
-0.004221796058118343,
-0.005079361144453287,
-0.0450332835316658,
0.13963280618190765,
0.02033155970275402,
0.20281818509101868,
-0.0035836573224514723,
-0.019420089200139046,
-0.047323692589998245,
-0.02299119159579277,
-0.036133263260126114,
0.18868957459926605,
-0.04123977944254875,
0.02809079736471176,
-0.08392450958490372,
-0.01953664980828762,
0.046078771352767944,
-0.12890388071537018,
0.11862163990736008,
-0.09193678945302963,
-0.06006503850221634,
0.032016821205616,
0.07356879860162735,
-0.034834835678339005,
0.03353835269808769,
-0.025773806497454643,
0.04863832890987396,
0.06270304322242737,
-0.020873917266726494,
-0.09472205489873886,
-0.09794841706752777,
0.04929962009191513,
-0.04916505888104439,
0.15968355536460876,
0.017737312242388725,
0.09123937040567398,
0.08679474890232086,
0.027874518185853958,
-0.07253925502300262,
0.10990885645151138,
0.042804766446352005,
0.010449445806443691,
0.07606026530265808,
0.1278935819864273,
-0.03644206374883652,
0.1493004411458969,
-0.007934905588626862,
-0.04719366878271103,
-0.017761750146746635,
-0.024075044319033623,
-0.005389390978962183,
-0.13595741987228394,
-0.00005659161979565397,
-0.06348586082458496,
0.13865545392036438,
0.1775386482477188,
-0.045974645763635635,
-0.029159091413021088,
-0.029754478484392166,
0.0702461451292038,
-0.014030013233423233,
0.09606628865003586,
-0.0016688420437276363,
-0.15585540235042572,
0.015926942229270935,
-0.021099623292684555,
0.01894219033420086,
-0.17056187987327576,
-0.0509452298283577,
-0.03914497047662735,
-0.04774850606918335,
-0.07116729766130447,
0.14097613096237183,
0.07377558201551437,
0.02893316000699997,
-0.040159620344638824,
-0.16166967153549194,
-0.025391699746251106,
0.04819164425134659,
-0.1468266248703003,
-0.12040539085865021
] |
null | null |
transformers
|
# CodeTrans model for source code summarization Python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the Python function or be fine-tuned on other Python code tasks. It can be used on unparsed and untokenized Python code. However, if the Python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate Python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/python/large_model.ipynb).
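As noted above, the model performs best on tokenized input. Below is a minimal sketch of one way to pre-tokenize raw Python source with the standard-library `tokenize` module; it illustrates the space-separated format the model expects, and is an assumption about the preprocessing, not the exact pipeline used to build the CodeTrans corpus.
```python
# Minimal sketch: turn raw Python source into space-separated tokens.
# This is an illustration, not the exact CodeTrans preprocessing.
import io
import tokenize

def space_tokenize(source: str) -> str:
    """Return the source as space-separated tokens."""
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Drop layout tokens and comments; keep names, operators, literals.
        if tok.type in (tokenize.NEWLINE, tokenize.NL, tokenize.INDENT,
                        tokenize.DEDENT, tokenize.ENDMARKER, tokenize.COMMENT):
            continue
        tokens.append(tok.string)
    return " ".join(tokens)

raw = "def add(a, b):\n    return a + b\n"
print(space_tokenize(raw))  # -> "def add ( a , b ) : return a + b"
```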
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Training
The model was trained on a single TPU Pod V3-8 for 80,000 steps, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. (In total, the model was trained for 260,000 steps.)
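For illustration, here is a minimal sketch of the inverse square root schedule mentioned above; the warmup length and peak rate are assumptions, not the values used to train CodeTrans.
```python
# Minimal sketch of an inverse square root learning rate schedule.
# warmup_steps and peak_lr are illustrative assumptions.
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, peak_lr: float = 1e-2) -> float:
    # Hold the peak rate during warmup, then decay proportionally to 1/sqrt(step).
    return peak_lr * min(1.0, (warmup_steps / max(step, 1)) ** 0.5)

for step in (1_000, 10_000, 40_000, 80_000):
    print(step, inverse_sqrt_lr(step))
```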
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| State of the art | -- | 18.40 | 20.50 |
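For readers who want to reproduce this kind of scoring, below is a minimal sketch using the `sacrebleu` package; the exact BLEU variant, tokenization, and smoothing used in the CodeTrans evaluation may differ.
```python
# Minimal sketch of BLEU scoring with sacrebleu (pip install sacrebleu).
# The example strings are illustrative, not taken from the evaluation set.
import sacrebleu

predictions = ["opens a file and writes the filtered lines"]
# One reference stream, aligned with the predictions list.
references = [["open a file and write the filtered lines to another file"]]

score = sacrebleu.corpus_bleu(predictions, references)
print(f"BLEU: {score.score:.2f}")
```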
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_python_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization Python
====================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the Python function or be fine-tuned on other Python code tasks. It can be used on unparsed and untokenized Python code. However, if the Python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate Python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Training
The model was trained on a single TPU Pod V3-8 for 80,000 steps, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training. (In total, the model was trained for 260,000 steps.)
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate Python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Training\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate Python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Training\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
152
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate Python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Training\n\n\nThe model was trained on a single TPU Pod V3-8 for 80,000 steps, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. (We have trained in total 260,000 steps.)\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10039039701223373,
0.026729099452495575,
-0.0008481373661197722,
0.12902206182479858,
0.1250653862953186,
0.027499262243509293,
0.1014251634478569,
0.06594407558441162,
-0.04693801328539848,
-0.0012377677485346794,
0.03539039194583893,
0.057450126856565475,
0.03858531266450882,
0.18445995450019836,
0.03599809110164642,
-0.24226875603199005,
0.033477600663900375,
0.02586882747709751,
-0.0769757628440857,
0.1119801253080368,
0.0848008468747139,
-0.06664353609085083,
0.03597252443432808,
0.007907581515610218,
-0.19828911125659943,
0.05205409601330757,
-0.0226595401763916,
-0.07874374091625214,
0.12836851179599762,
0.07245204597711563,
0.1538621485233307,
-0.008954949676990509,
0.05964348837733269,
-0.10543505847454071,
0.0055421642027795315,
0.06499754637479782,
0.021015791222453117,
0.023018965497612953,
0.0634230524301529,
0.0387258417904377,
0.22302982211112976,
0.02045314386487007,
0.093630351126194,
0.047094911336898804,
-0.05766768380999565,
-0.17927704751491547,
-0.034323398023843765,
0.05654775723814964,
0.07752280682325363,
0.09309718012809753,
0.0028885314241051674,
0.11180106550455093,
-0.12650322914123535,
0.11454198509454727,
0.08543749153614044,
-0.2590179145336151,
-0.012920529581606388,
0.09616818279027939,
0.059326473623514175,
0.05990592762827873,
-0.022329194471240044,
-0.054726507514715195,
0.05864384025335312,
0.06482188403606415,
0.07115665078163147,
-0.05541433393955231,
-0.028116142377257347,
0.003145599039271474,
-0.12446168810129166,
-0.08113652467727661,
0.15539608895778656,
-0.004348098300397396,
-0.06657004356384277,
-0.1267441362142563,
-0.05564747005701065,
-0.07265106588602066,
0.028772393241524696,
0.022600863128900528,
0.0033979120198637247,
-0.006460970733314753,
-0.00019872968550771475,
0.011710080318152905,
-0.08484875410795212,
-0.11588778346776962,
0.003401713212952018,
0.10032253712415695,
0.08445682376623154,
0.04094254970550537,
-0.06697912514209747,
0.12715739011764526,
0.027272755280137062,
-0.027809256687760353,
-0.029768921434879303,
-0.04636799171566963,
-0.1349003165960312,
0.03478627651929855,
-0.05759594962000847,
-0.16892893612384796,
-0.037618499249219894,
0.04639441892504692,
0.016024764627218246,
0.04349860921502113,
0.05230264738202095,
0.00021145975915715098,
0.024129832163453102,
0.21279805898666382,
-0.03435645252466202,
-0.044602375477552414,
0.017669403925538063,
0.03681257739663124,
-0.08925540745258331,
-0.03621949255466461,
-0.0655372366309166,
-0.06515401601791382,
0.03305094689130783,
0.06191914901137352,
-0.1045088842511177,
0.08553951978683472,
-0.022428492084145546,
-0.05527249351143837,
-0.009911968372762203,
-0.15789037942886353,
-0.011900849640369415,
-0.010893072932958603,
-0.0405479334294796,
0.032353680580854416,
0.09338891506195068,
-0.12749908864498138,
-0.13763327896595,
-0.0017782211070880294,
-0.0855788066983223,
-0.04163160175085068,
-0.13990789651870728,
-0.15558339655399323,
-0.021886102855205536,
-0.024922171607613564,
-0.014836869202554226,
-0.07512152194976807,
-0.14563190937042236,
-0.013599407859146595,
0.06318379193544388,
0.013896658085286617,
0.0006931077223271132,
-0.05471789464354515,
0.0076598599553108215,
-0.011634517461061478,
-0.03614135459065437,
-0.028725339099764824,
-0.03314754739403725,
0.08531083166599274,
0.0638396292924881,
0.02594578079879284,
0.005928695201873779,
0.05370611697435379,
-0.029406847432255745,
0.05853600054979324,
-0.15963730216026306,
0.12879084050655365,
-0.06532330811023712,
0.07230380177497864,
-0.04908135160803795,
-0.10729262232780457,
0.05404457077383995,
0.050088316202163696,
0.02517879009246826,
0.0482318252325058,
-0.09561273455619812,
-0.05025620386004448,
0.218440979719162,
-0.13404075801372528,
-0.08221928030252457,
0.0979904755949974,
-0.07617546617984772,
0.0646451860666275,
0.10103864967823029,
0.10054893791675568,
0.18313874304294586,
-0.07315244525671005,
0.012562877498567104,
0.0707038938999176,
0.037805844098329544,
-0.08039060235023499,
0.09706035256385803,
0.032653890550136566,
-0.08506324142217636,
0.053388193249702454,
-0.05784071981906891,
0.1033409908413887,
-0.017227906733751297,
-0.0308693777769804,
-0.031007956713438034,
-0.048673275858163834,
0.027971556410193443,
-0.0296801645308733,
0.0677013248205185,
-0.03112528845667839,
-0.10413536429405212,
0.11481918394565582,
0.1866128295660019,
-0.16081492602825165,
-0.008790253661572933,
-0.10288339853286743,
0.01696472242474556,
-0.07371851056814194,
0.01783939264714718,
-0.17220567166805267,
-0.017245430499315262,
0.057590994983911514,
-0.057711243629455566,
0.05684247985482216,
0.12315431237220764,
0.025182252749800682,
0.07710637152194977,
0.009984110482037067,
-0.009877383708953857,
-0.1289844512939453,
-0.046405259519815445,
-0.048943307250738144,
-0.03739544376730919,
-0.13450485467910767,
-0.05620522052049637,
0.03920428082346916,
-0.1720593273639679,
0.029354754835367203,
-0.02766355127096176,
0.02238466776907444,
0.011984116397798061,
-0.01618078351020813,
-0.016702931374311447,
0.056157954037189484,
-0.05464621260762215,
-0.05732622370123863,
0.020897457376122475,
0.025057828053832054,
-0.06442992389202118,
-0.023012962192296982,
-0.11901067197322845,
0.03080955147743225,
0.09202801436185837,
0.07554415613412857,
-0.060167498886585236,
0.032153841108083725,
-0.028295084834098816,
-0.028447862714529037,
0.021806243807077408,
-0.06674410402774811,
0.1881837546825409,
0.022585710510611534,
0.20906776189804077,
-0.13710083067417145,
-0.02453501895070076,
-0.00601339153945446,
0.03136110305786133,
0.034662965685129166,
0.08162454515695572,
0.02184559404850006,
-0.070283442735672,
0.043600134551525116,
-0.016062172129750252,
-0.034590281546115875,
0.18627698719501495,
-0.03186631575226784,
-0.0908007025718689,
0.004777838010340929,
0.08421429246664047,
-0.008284566923975945,
0.08372770994901657,
-0.09411841630935669,
-0.013666083104908466,
0.019637031480669975,
0.029584500938653946,
0.07658559083938599,
-0.15414629876613617,
0.01780177652835846,
0.06393957883119583,
-0.05186799168586731,
-0.11274033784866333,
-0.032496094703674316,
-0.011342363432049751,
0.036550190299749374,
0.008448868989944458,
-0.028520390391349792,
0.010851090773940086,
-0.019524341449141502,
-0.09159484505653381,
0.21917708218097687,
-0.10666561126708984,
-0.25032174587249756,
-0.23581185936927795,
0.035114213824272156,
-0.033746637403964996,
-0.0005677674780599773,
0.007412466686218977,
-0.08157552778720856,
-0.06142168119549751,
-0.05037500709295273,
0.18196921050548553,
-0.10194611549377441,
-0.02477128803730011,
0.024058794602751732,
0.05313766375184059,
0.010643531568348408,
-0.22533047199249268,
0.022893602028489113,
0.0036023352295160294,
-0.06410177052021027,
-0.015169047750532627,
-0.1181935966014862,
0.053437504917383194,
0.16361719369888306,
-0.05529562756419182,
0.030534299090504646,
-0.02088000811636448,
0.19234956800937653,
-0.033992014825344086,
-0.06409575790166855,
0.15860861539840698,
-0.0008420596132054925,
0.0325852707028389,
0.023042170330882072,
0.0006198084447532892,
-0.07703640311956406,
0.03317680209875107,
0.0060323067009449005,
-0.04134497418999672,
-0.2647648751735687,
-0.01588069647550583,
-0.07988016307353973,
0.03785202279686928,
0.08218694478273392,
0.0379655621945858,
-0.01685251295566559,
0.04074731096625328,
0.019515806809067726,
0.13880643248558044,
0.022702718153595924,
0.05228167027235031,
0.07194360345602036,
0.003871264634653926,
0.030102195218205452,
-0.09982959181070328,
-0.038134023547172546,
0.05809810385107994,
0.07831843942403793,
0.30509504675865173,
-0.06669990718364716,
0.1769685298204422,
0.04428838938474655,
0.04748772457242012,
0.04600220546126366,
0.19992651045322418,
-0.09568030387163162,
0.027685312554240227,
-0.003832872724160552,
-0.026217296719551086,
-0.11602427065372467,
0.02303726226091385,
-0.022041672840714455,
0.01271835807710886,
-0.1327635645866394,
-0.05328798666596413,
-0.019857747480273247,
0.16723473370075226,
0.01833139918744564,
-0.26421159505844116,
-0.0926765725016594,
-0.0145496791228652,
-0.1272631734609604,
-0.0923507809638977,
0.03341027349233627,
0.19991794228553772,
-0.10641630738973618,
-0.00281779607757926,
-0.048498619347810745,
0.12472424656152725,
-0.062405962496995926,
-0.03909829631447792,
-0.0334552526473999,
0.051570191979408264,
0.021307872608304024,
0.11654053628444672,
-0.20611122250556946,
0.16047649085521698,
-0.023454591631889343,
0.07316989451646805,
-0.05584573745727539,
0.09466762840747833,
-0.024785736575722694,
0.07065588980913162,
0.04296760633587837,
-0.012316782958805561,
0.000009134811080002692,
-0.17943811416625977,
-0.026767397299408913,
0.033419519662857056,
0.03574807196855545,
0.0037788066547363997,
0.1067437008023262,
-0.021934902295470238,
0.03599994629621506,
0.010363581590354443,
-0.04325507581233978,
-0.04962848871946335,
-0.1697041243314743,
-0.000054248663218459114,
-0.08365673571825027,
-0.015284470282495022,
-0.05889562517404556,
-0.04534710943698883,
0.03842420130968094,
0.15702597796916962,
-0.06490274518728256,
-0.09775149077177048,
-0.09947772324085236,
0.014719394966959953,
0.14847950637340546,
-0.07426148653030396,
0.06852704286575317,
0.004074390046298504,
0.06670726835727692,
-0.012956605292856693,
-0.133467435836792,
0.028948305174708366,
-0.03615196794271469,
-0.05206075683236122,
-0.008566619828343391,
0.05301954969763756,
0.020490484312176704,
0.018399519845843315,
0.0034084608778357506,
-0.06056344136595726,
-0.04072635993361473,
-0.11081255227327347,
-0.1409972757101059,
0.044678594917058945,
0.034963201731443405,
0.049219660460948944,
-0.11401528865098953,
-0.01970088668167591,
-0.02569728158414364,
0.01782561093568802,
0.10695772618055344,
0.14164602756500244,
-0.0615333691239357,
0.045560698956251144,
0.12092144042253494,
-0.054969072341918945,
-0.14773550629615784,
-0.03335569426417351,
0.07914295047521591,
0.10109352320432663,
-0.014703057706356049,
-0.23005598783493042,
0.03908609226346016,
0.05538816750049591,
0.03283190354704857,
-0.017677925527095795,
-0.3682023286819458,
-0.12056951969861984,
0.07039033621549606,
0.11500748246908188,
0.030894557014107704,
-0.08137834072113037,
-0.002342143328860402,
-0.06647784262895584,
-0.15704534947872162,
0.1323704570531845,
0.012044159695506096,
0.1337025910615921,
-0.020352782681584358,
0.054634444415569305,
0.029680080711841583,
-0.04421759024262428,
0.07210034877061844,
0.027070002630352974,
0.10716081410646439,
-0.027308188378810883,
0.046609342098236084,
0.13330093026161194,
-0.05511714890599251,
0.1573166698217392,
-0.12362223863601685,
0.08739439398050308,
-0.216727152466774,
-0.10432401299476624,
-0.07158644497394562,
0.008820541203022003,
-0.02148747257888317,
-0.05453278869390488,
-0.07512535154819489,
0.023940449580550194,
0.03563756123185158,
-0.014252102002501488,
0.0434599407017231,
-0.004351432900875807,
0.0385846309363842,
0.050683461129665375,
0.07213280349969864,
0.01233753003180027,
-0.13456185162067413,
-0.007987412624061108,
0.05133024603128433,
0.09519462287425995,
-0.1979897916316986,
0.01670815981924534,
0.1095086932182312,
0.0019200071692466736,
0.13220317661762238,
0.06610190123319626,
-0.11169014126062393,
0.012308333069086075,
0.07976099103689194,
-0.06392818689346313,
-0.14659340679645538,
0.0013090685242787004,
-0.04409914091229439,
-0.06511546671390533,
0.06391524523496628,
0.09930013120174408,
-0.059653084725141525,
-0.03692096844315529,
-0.027093980461359024,
-0.05200119689106941,
-0.09840968996286392,
0.21794533729553223,
0.035377681255340576,
0.07022672891616821,
-0.07084163278341293,
0.039716936647892,
0.0767940804362297,
-0.09774670749902725,
0.005842828657478094,
0.1378636658191681,
-0.1058160811662674,
-0.032688554376363754,
0.028642499819397926,
0.13497935235500336,
-0.05596315860748291,
-0.03832431510090828,
-0.09186386317014694,
-0.04840739816427231,
0.04119826480746269,
0.12976577877998352,
0.07445866614580154,
0.12739410996437073,
-0.0806974470615387,
0.0042336019687354565,
-0.10072348266839981,
0.046311184763908386,
0.042231399565935135,
0.03610502555966377,
-0.15725070238113403,
0.2086825668811798,
0.0652337372303009,
0.09499344229698181,
-0.032930128276348114,
-0.03425012156367302,
-0.09176544845104218,
0.03746616467833519,
-0.0902416929602623,
0.026662254706025124,
-0.04421576112508774,
0.048019103705883026,
-0.003563822479918599,
0.02241835743188858,
-0.009989229030907154,
0.06812518835067749,
-0.10088001936674118,
-0.01235322654247284,
-0.012233230285346508,
0.03917593136429787,
-0.038365744054317474,
0.01091025024652481,
0.037990469485521317,
-0.1096598207950592,
0.10938761383295059,
-0.0028250939212739468,
-0.04338100552558899,
0.08652710914611816,
0.01990438625216484,
0.03892321512103081,
-0.009979717433452606,
0.0605856291949749,
0.014305728487670422,
0.03558181971311569,
0.08499052375555038,
0.01755080185830593,
0.041981231421232224,
0.008451757952570915,
0.050918444991111755,
-0.14376325905323029,
-0.10412916541099548,
-0.023084763437509537,
-0.10272856056690216,
-0.046558644622564316,
0.09733806550502777,
0.06714440137147903,
0.09292767941951752,
0.12117190659046173,
-0.018742330372333527,
0.00854696985334158,
-0.14995694160461426,
-0.0726204589009285,
0.02684897370636463,
-0.028956854715943336,
0.008858172222971916,
-0.0916854739189148,
0.030056091025471687,
-0.006521829403936863,
0.1506253033876419,
-0.001508355955593288,
0.04729756712913513,
-0.047844067215919495,
-0.04600029066205025,
0.046433743089437485,
0.010263982228934765,
0.2397991269826889,
-0.033200498670339584,
0.025058217346668243,
0.022283395752310753,
-0.007911646738648415,
-0.010977701283991337,
0.12802159786224365,
0.1029823049902916,
0.11400853097438812,
-0.014251341111958027,
0.07644710689783096,
0.05257086828351021,
0.007339439354836941,
-0.14347578585147858,
-0.07406573742628098,
0.015156487002968788,
0.0852416381239891,
-0.07543177157640457,
0.13715475797653198,
0.08304458856582642,
-0.08623186498880386,
0.13798224925994873,
0.05116650089621544,
-0.1066754013299942,
-0.06983242928981781,
-0.048950597643852234,
-0.025254033505916595,
-0.15076644718647003,
0.01948326826095581,
-0.11081180721521378,
-0.016837529838085175,
0.11109226942062378,
0.05234749987721443,
-0.059653252363204956,
0.14809384942054749,
0.0068874661810696125,
-0.06891002506017685,
0.028920866549015045,
-0.00502518517896533,
0.03015810064971447,
0.019744226709008217,
-0.008191308006644249,
0.039864324033260345,
-0.004565212409943342,
0.06604740023612976,
-0.01457685511559248,
-0.030719073489308357,
0.018653780221939087,
-0.02787281759083271,
0.003855639835819602,
-0.02195822075009346,
0.014246874488890171,
0.03491482883691788,
0.12789413332939148,
0.019229993224143982,
-0.10686139762401581,
-0.05194300785660744,
0.16925127804279327,
-0.04646851122379303,
-0.10085401684045792,
-0.16610375046730042,
0.24402855336666107,
0.04865215718746185,
0.02699458971619606,
0.032326940447092056,
-0.059320978820323944,
-0.08094162493944168,
0.19491955637931824,
0.0858725756406784,
0.024441760033369064,
-0.030545314773917198,
0.013935292139649391,
-0.011364668607711792,
-0.04659528657793999,
0.2325005829334259,
0.024918332695961,
0.26855719089508057,
0.010400408878922462,
0.008746229112148285,
-0.04283691942691803,
-0.012068807147443295,
-0.020917804911732674,
0.11570532619953156,
-0.05467389151453972,
-0.029561756178736687,
-0.05381260812282562,
0.0258290097117424,
0.02169971354305744,
-0.08305791020393372,
0.06904539465904236,
-0.07558473199605942,
-0.11015132814645767,
-0.034600693732500076,
0.0337611623108387,
-0.044133979827165604,
0.06490810960531235,
-0.03091280907392502,
0.021760571748018265,
0.09089627116918564,
-0.007776711601763964,
-0.10224408656358719,
-0.14062902331352234,
0.12769150733947754,
-0.022964952513575554,
0.1458432972431183,
0.008000260218977928,
0.0799497589468956,
0.10151848942041397,
0.04498749226331711,
-0.060476627200841904,
0.08647575974464417,
0.020894307643175125,
0.04659738391637802,
0.04099155217409134,
0.11439603567123413,
-0.06493434309959412,
0.09958329051733017,
-0.061322733759880066,
-0.08214814215898514,
0.011865707114338875,
-0.09718723595142365,
0.014988647773861885,
-0.12625901401042938,
-0.019264571368694305,
-0.09524548798799515,
0.09902344644069672,
0.21614715456962585,
-0.03303314372897148,
-0.009908312000334263,
-0.08695191144943237,
0.07837149500846863,
0.0020020438823848963,
0.06088189408183098,
-0.05818837881088257,
-0.20195096731185913,
-0.029398195445537567,
-0.05603533238172531,
0.011473014019429684,
-0.23976439237594604,
-0.03100600838661194,
-0.02066764421761036,
-0.04817761108279228,
-0.060142651200294495,
0.14568227529525757,
0.09211186319589615,
0.05157162994146347,
-0.02436929941177368,
-0.13942904770374298,
-0.06201106682419777,
0.07101012021303177,
-0.13405433297157288,
-0.10269121080636978
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for python code snippets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/python/large_model.ipynb).
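Generation settings can also be passed through the pipeline call as keyword arguments; the values below are illustrative, not tuned recommendations.
```python
# Illustrative generation settings forwarded to the underlying generate() call.
pipeline([tokenized_code], max_length=32, num_beams=4, early_stopping=True)
```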
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
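Below is a minimal sketch of how a comparable fine-tuning run could be set up with the Transformers `Seq2SeqTrainer`; the dataset column names and hyperparameters are assumptions for illustration, and the original training used TPU pods rather than this exact script.
```python
# Minimal sketch of fine-tuning a CodeTrans checkpoint with Seq2SeqTrainer.
# Column names ("code", "summary") and hyperparameters are hypothetical.
from transformers import (AutoTokenizer, T5ForConditionalGeneration,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments)

model_name = "SEBIS/code_trans_t5_large_source_code_summarization_python_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def preprocess(batch):
    # "code" and "summary" are assumed column names for a code/summary dataset.
    inputs = tokenizer(batch["code"], max_length=512, truncation=True)
    labels = tokenizer(batch["summary"], max_length=512, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-python-finetune",
    per_device_train_batch_size=8,  # the card's global batch of 256 was reached on a TPU pod
    max_steps=100,                  # matches the 100 fine-tuning steps described above
    learning_rate=1e-4,             # illustrative; the original run used AdaFactor
)

# trainer = Seq2SeqTrainer(model=model, args=args,
#                          train_dataset=python_dataset.map(preprocess, batched=True))
# trainer.train()  # python_dataset is a placeholder for the python-only dataset
```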
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_python_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for python code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08027796447277069,
0.07091106474399567,
-0.0016581208910793066,
0.10376740247011185,
0.05355704203248024,
0.021505070850253105,
0.05541248619556427,
0.09923932701349258,
-0.040377598255872726,
0.06404246389865875,
0.07399428635835648,
-0.024454232305288315,
0.06408105045557022,
0.18088006973266602,
0.03213256224989891,
-0.1876005232334137,
-0.018347900360822678,
0.0268156249076128,
-0.037639468908309937,
0.10998609662055969,
0.09357334673404694,
-0.07807065546512604,
0.0644303485751152,
-0.036041587591171265,
-0.11661338806152344,
0.05574705824255943,
-0.04315726086497307,
-0.046653661876916885,
0.09171193838119507,
0.04568900540471077,
0.12150100618600845,
-0.03976554051041603,
0.07510367035865784,
-0.2123117595911026,
-0.0034559450577944517,
0.021807771176099777,
0.05355382710695267,
0.022708097472786903,
0.06630729883909225,
0.05742573365569115,
0.13661740720272064,
-0.03841621056199074,
0.05689539760351181,
0.04908793792128563,
-0.06288456171751022,
-0.09213779866695404,
-0.052543576806783676,
0.06880706548690796,
0.10834921896457672,
0.0904906764626503,
-0.01689007878303528,
0.028187677264213562,
-0.07848697155714035,
0.0923115462064743,
0.12136485427618027,
-0.2253274768590927,
-0.0236851554363966,
0.10118971019983292,
0.09390264004468918,
0.05455433949828148,
-0.07716941833496094,
-0.038465164601802826,
0.10158079117536545,
0.041653111577034,
0.047596946358680725,
-0.08269160240888596,
-0.04853622987866402,
-0.008926684968173504,
-0.06335369497537613,
-0.05636649206280708,
0.13664603233337402,
0.03804820403456688,
-0.06388214975595474,
-0.09826025366783142,
-0.04842717945575714,
-0.19233928620815277,
0.039225827902555466,
0.032483067363500595,
0.014524913392961025,
-0.012116270139813423,
0.015089115127921104,
0.004151415545493364,
-0.09249050915241241,
-0.09060371667146683,
0.0017970645567402244,
0.08209746330976486,
0.07795894891023636,
0.02484603226184845,
-0.011878118850290775,
0.08261873573064804,
-0.012283151969313622,
-0.051775701344013214,
-0.02561418153345585,
0.01491021178662777,
-0.12484254688024521,
0.018881583586335182,
-0.017575757578015327,
-0.06972460448741913,
-0.013328813947737217,
0.11670082807540894,
-0.060157131403684616,
0.07956983149051666,
0.1205659806728363,
0.0004089072172064334,
0.0059938509948551655,
0.23441846668720245,
0.04315376281738281,
-0.1402360051870346,
-0.00046941329492256045,
0.02170882374048233,
0.0035232827067375183,
-0.005176561418920755,
-0.060507528483867645,
-0.03304460272192955,
0.018063684925436974,
0.06282421946525574,
-0.12556390464305878,
0.01946856826543808,
-0.03962457552552223,
-0.010917828418314457,
0.014570222236216068,
-0.13190844655036926,
0.03808074817061424,
0.006814506370574236,
-0.05762314423918724,
-0.0308131854981184,
0.06260731816291809,
-0.11592449992895126,
-0.11660365015268326,
0.046448055654764175,
-0.051651593297719955,
-0.025432096794247627,
-0.12066063284873962,
-0.13378959894180298,
-0.016015879809856415,
-0.045979708433151245,
0.021984608843922615,
-0.10593599826097488,
-0.09908251464366913,
-0.007190029136836529,
0.03681480884552002,
-0.004666693974286318,
-0.014708686619997025,
-0.04970918223261833,
0.010916879400610924,
-0.0027648776303976774,
-0.018169913440942764,
0.005315275862812996,
-0.042760975658893585,
0.09229802340269089,
0.10347496718168259,
0.051401976495981216,
0.018338819965720177,
0.026928018778562546,
-0.07054530829191208,
0.06972889602184296,
-0.05936938896775246,
0.058778002858161926,
-0.023671044036746025,
0.06760332733392715,
-0.09395130723714828,
-0.08725767582654953,
0.04984394833445549,
0.04874401167035103,
0.05513248220086098,
0.019578788429498672,
-0.10440332442522049,
0.013400879688560963,
0.15582841634750366,
-0.09155961126089096,
-0.12557531893253326,
0.11867119371891022,
-0.00647890567779541,
0.0033199856989085674,
0.07510753720998764,
0.12316180765628815,
0.1359090507030487,
-0.10914997011423111,
-0.0478876456618309,
0.08705092966556549,
0.07464197278022766,
-0.031199516728520393,
0.10009222477674484,
0.013725348748266697,
0.02919013798236847,
0.021370356902480125,
0.04718347638845444,
0.07270452380180359,
-0.005104390904307365,
-0.03521778807044029,
-0.026732468977570534,
-0.07866835594177246,
-0.011691558174788952,
-0.017594892531633377,
0.023899128660559654,
-0.07456392794847488,
-0.08595431596040726,
0.023918291553854942,
0.18116667866706848,
-0.10101793706417084,
0.022716976702213287,
-0.1003018319606781,
-0.037617504596710205,
-0.08986768126487732,
0.010439664125442505,
-0.10091584920883179,
0.010636492632329464,
0.04668291658163071,
-0.053723156452178955,
0.06817423552274704,
0.0677509605884552,
0.0029551817569881678,
0.0371478833258152,
-0.034909918904304504,
-0.04434190317988396,
-0.05903082340955734,
-0.05524515360593796,
-0.12700243294239044,
-0.00497553450986743,
-0.09474005550146103,
-0.02683829702436924,
-0.0465814433991909,
-0.17134614288806915,
0.0166244525462389,
-0.027715865522623062,
0.022527502849698067,
0.004209957551211119,
-0.023722467944025993,
0.025909138843417168,
0.05288045480847359,
-0.05994122847914696,
-0.0888490304350853,
0.003818048629909754,
0.016301311552524567,
-0.10864630341529846,
-0.06483879685401917,
-0.12912115454673767,
-0.052883148193359375,
0.07927794009447098,
0.08543608337640762,
-0.08691243082284927,
0.02770431712269783,
-0.02029176615178585,
-0.050459615886211395,
-0.045729462057352066,
-0.07553435862064362,
0.17052531242370605,
0.012363261543214321,
0.17428606748580933,
-0.12848292291164398,
-0.05099701136350632,
-0.03633734956383705,
-0.019485201686620712,
0.02126064896583557,
0.1457587331533432,
0.0010525175603106618,
-0.08119301497936249,
0.04669266194105148,
-0.015834569931030273,
-0.0671839788556099,
0.149527445435524,
-0.005906817503273487,
-0.09946013987064362,
0.01403060182929039,
0.09902964532375336,
-0.010565747506916523,
0.14043933153152466,
-0.08347351849079132,
-0.00698004849255085,
-0.001940865651704371,
0.023135721683502197,
0.03819835186004639,
-0.1287803202867508,
0.027152111753821373,
0.06747446209192276,
-0.0584472194314003,
-0.06806980818510056,
-0.040142204612493515,
-0.036983225494623184,
0.030828334391117096,
0.0053376248106360435,
-0.001979782711714506,
-0.01624825783073902,
-0.021188952028751373,
-0.09217135608196259,
0.21032878756523132,
-0.08725229650735855,
-0.21521882712841034,
-0.171031653881073,
-0.0005977169494144619,
-0.0417587012052536,
-0.007579641882330179,
0.050678152590990067,
-0.11581899225711823,
-0.06653931736946106,
-0.08663249015808105,
0.15557268261909485,
-0.1405661255121231,
0.007082299795001745,
-0.008653245866298676,
0.03032696060836315,
0.03431371599435806,
-0.17773504555225372,
0.0330859050154686,
0.0007489559939131141,
-0.007477063685655594,
0.00752295833081007,
-0.06820246577262878,
0.08174385130405426,
0.12983819842338562,
-0.08314371854066849,
0.012461054138839245,
-0.007935053668916225,
0.18242692947387695,
-0.05327410623431206,
0.011613952927291393,
0.18696628510951996,
0.013244545087218285,
0.039075471460819244,
0.04143884405493736,
0.01557639054954052,
-0.09484006464481354,
0.05969666317105293,
0.0499521866440773,
-0.032550130039453506,
-0.24141301214694977,
-0.005334379617124796,
-0.06990920752286911,
0.05471576377749443,
0.13204294443130493,
0.050088778138160706,
-0.13302233815193176,
0.03771763667464256,
-0.009306377731263638,
0.1445835679769516,
-0.03461989387869835,
0.07090131938457489,
0.021289940923452377,
0.014828013256192207,
0.009394918568432331,
-0.092383474111557,
-0.004365681670606136,
0.07029610127210617,
0.11425181478261948,
0.20389620959758759,
-0.04935872182250023,
0.18295101821422577,
0.011157643049955368,
0.07207157462835312,
0.025866396725177765,
0.10968580096960068,
-0.12149090319871902,
0.0016784699400886893,
0.008581150323152542,
-0.005678183864802122,
-0.06525766104459763,
0.05027378350496292,
-0.019138451665639877,
0.06182410940527916,
-0.06263178586959839,
0.018449271097779274,
0.014697813428938389,
0.16205739974975586,
0.07673727720975876,
-0.1950179785490036,
-0.11753039807081223,
0.018875692039728165,
-0.12044641375541687,
-0.1161557137966156,
0.06226525828242302,
0.1786324828863144,
-0.05894731730222702,
0.025527963414788246,
-0.019900968298316002,
0.13703125715255737,
-0.07030811905860901,
-0.027405336499214172,
0.012450500391423702,
0.06686347723007202,
0.00578118022531271,
0.12706215679645538,
-0.24699857831001282,
0.07420821487903595,
0.01207030937075615,
0.09816475957632065,
-0.013653269037604332,
0.07517097890377045,
-0.03841768205165863,
0.005275545176118612,
0.06801841408014297,
-0.0018430951749905944,
-0.06544483453035355,
-0.21822451055049896,
-0.05465655401349068,
0.023636840283870697,
0.06050598621368408,
-0.008906502276659012,
0.1073601245880127,
-0.022111760452389717,
0.06626813113689423,
-0.02898714505136013,
-0.1345108151435852,
-0.04591630399227142,
-0.13722804188728333,
-0.024017110466957092,
-0.01666732132434845,
-0.017360396683216095,
-0.030156835913658142,
0.019815457984805107,
-0.0028138908091932535,
0.21418361365795135,
-0.16060803830623627,
-0.10790520161390305,
-0.09357291460037231,
0.08122482895851135,
0.1324990838766098,
-0.09802622348070145,
0.029122324660420418,
0.01635023020207882,
0.05443476885557175,
-0.04013710096478462,
-0.062499020248651505,
0.020830031484365463,
-0.05628720670938492,
-0.07078497856855392,
-0.022221725434064865,
0.0942525640130043,
-0.01792384870350361,
0.05050741881132126,
0.004206730052828789,
-0.07749398052692413,
-0.06091497465968132,
-0.12550364434719086,
-0.09159703552722931,
-0.022255010902881622,
0.026057669892907143,
0.008014864288270473,
-0.0868108794093132,
0.09091541916131973,
-0.02660503052175045,
-0.07910474389791489,
0.0902218446135521,
0.19351013004779816,
-0.06322859227657318,
0.007006991188973188,
0.08913171291351318,
-0.05408496409654617,
-0.15191450715065002,
-0.06594394147396088,
0.05313006788492203,
0.08721239119768143,
-0.020559683442115784,
-0.15638089179992676,
0.07604856044054031,
0.04003625735640526,
0.03481825441122055,
0.020946765318512917,
-0.30050021409988403,
-0.1238311156630516,
0.0576271153986454,
0.0696321427822113,
0.044411640614271164,
-0.12022814899682999,
-0.03557148203253746,
-0.06045396625995636,
-0.07338003814220428,
0.033645689487457275,
0.05677526071667671,
0.13443106412887573,
-0.03364836797118187,
0.051030233502388,
0.033144887536764145,
-0.023869646713137627,
0.10558462888002396,
-0.003532898146659136,
0.08911661803722382,
-0.017010342329740524,
0.023251283913850784,
0.05446748435497284,
-0.06303893774747849,
0.17234797775745392,
-0.1749199777841568,
0.08020477741956711,
-0.23409989476203918,
-0.05510955676436424,
-0.007191800512373447,
-0.008416767232120037,
-0.03679168224334717,
-0.06061292439699173,
-0.10745439678430557,
-0.004078750032931566,
0.05796213820576668,
-0.025235909968614578,
0.07409537583589554,
-0.021255487576127052,
-0.04463747516274452,
0.04026402533054352,
0.07707880437374115,
-0.01672324724495411,
-0.16242137551307678,
0.021535083651542664,
0.03280748799443245,
0.08148599416017532,
-0.1956765055656433,
0.013876196928322315,
0.12021607160568237,
0.018233541399240494,
0.11086821556091309,
0.01634969189763069,
-0.07593566179275513,
0.04585789889097214,
0.0711502730846405,
-0.030618952587246895,
-0.0975593775510788,
-0.0017953483620658517,
-0.02246673032641411,
-0.10339139401912689,
0.029857976362109184,
0.0930291935801506,
-0.05320432409644127,
-0.02409679815173149,
-0.013967085629701614,
0.0046400208957493305,
-0.07243592292070389,
0.19796279072761536,
0.02529532089829445,
0.0852154865860939,
-0.06101449206471443,
0.07088971883058548,
0.09928195178508759,
-0.09565763175487518,
0.009661821648478508,
0.16755107045173645,
-0.07483458518981934,
-0.019910447299480438,
0.05419449508190155,
0.08648903667926788,
-0.06334146857261658,
-0.05237076058983803,
-0.09279465675354004,
-0.06799820810556412,
0.010418381541967392,
0.02227003686130047,
0.06957116723060608,
0.07427108287811279,
-0.042261626571416855,
0.01909385435283184,
-0.10371509939432144,
0.09653357416391373,
0.07836729288101196,
0.05354081094264984,
-0.14765699207782745,
0.16076098382472992,
0.03473426774144173,
0.08929914981126785,
0.0021981289610266685,
0.029513318091630936,
-0.09221027791500092,
0.046772412955760956,
-0.036604367196559906,
0.03738585487008095,
-0.010924341157078743,
0.05004864186048508,
-0.017862802371382713,
0.02703690528869629,
-0.02845711261034012,
0.0445052795112133,
-0.04648973420262337,
-0.033047474920749664,
-0.025691036134958267,
0.026524851098656654,
-0.051704924553632736,
-0.012766622006893158,
0.01220620796084404,
-0.08660230040550232,
0.09650794416666031,
-0.059584736824035645,
-0.004663402214646339,
0.012386418879032135,
-0.007963132113218307,
0.06104275956749916,
0.02549969032406807,
0.04959952458739281,
-0.010192186571657658,
-0.01058298721909523,
0.04382503777742386,
0.008637462742626667,
-0.00010772049427032471,
-0.0027107507921755314,
0.044588085263967514,
-0.14667177200317383,
-0.10289861261844635,
-0.09766805917024612,
-0.07133569568395615,
-0.06746722757816315,
0.07908904552459717,
0.06905290484428406,
0.07101491093635559,
0.1055888757109642,
-0.03153947368264198,
0.011539899744093418,
-0.1325821876525879,
-0.04044737294316292,
0.04464735463261604,
-0.011973117478191853,
-0.07985622435808182,
-0.05108089745044708,
0.052097201347351074,
-0.042381711304187775,
0.12130643427371979,
-0.01072694268077612,
0.04149359092116356,
-0.010643010027706623,
-0.05686714127659798,
0.005439449567347765,
-0.0026440133806318045,
0.22948935627937317,
-0.08608534187078476,
0.015694787725806236,
-0.0071912724524736404,
-0.00690863560885191,
0.048310935497283936,
0.13816846907138824,
0.08664969354867935,
0.13283807039260864,
0.03632509708404541,
0.10063283890485764,
-0.05820764973759651,
-0.033161845058202744,
-0.17969705164432526,
0.0283673033118248,
0.001272843568585813,
0.040148913860321045,
-0.023293599486351013,
0.1214074194431305,
0.14331592619419098,
-0.12061470746994019,
0.09165789932012558,
0.026544945314526558,
-0.101189024746418,
-0.05035251006484032,
-0.04494907334446907,
-0.0470585897564888,
-0.09853938966989517,
0.02236059308052063,
-0.1091507226228714,
0.025206224992871284,
0.09020894020795822,
0.04214083403348923,
-0.02251187339425087,
0.15341779589653015,
-0.012079929001629353,
-0.07056616991758347,
0.008267484605312347,
0.026269735768437386,
0.04042711853981018,
0.1055363118648529,
0.011429816484451294,
0.06251640617847443,
-0.06081930547952652,
0.086126908659935,
0.016800444573163986,
0.005060297902673483,
0.021657027304172516,
-0.012408229522407055,
-0.006765077821910381,
-0.042503472417593,
0.0029047823045402765,
0.08045846968889236,
0.16925811767578125,
0.03946463763713837,
-0.053819525986909866,
-0.055613983422517776,
0.18056413531303406,
-0.05632007122039795,
-0.07840082049369812,
-0.1239084005355835,
0.17085114121437073,
0.056275591254234314,
0.02814527601003647,
0.007704302668571472,
-0.08576769381761551,
-0.04827987775206566,
0.22206003963947296,
0.017821885645389557,
-0.007967127487063408,
-0.03937520086765289,
-0.01209363155066967,
-0.008116660639643669,
-0.03859813138842583,
0.14205215871334076,
0.01772981323301792,
0.21537290513515472,
-0.00025618929066695273,
-0.00030610314570367336,
-0.0370461568236351,
-0.03153655678033829,
-0.05099361017346382,
0.18442167341709137,
-0.03386460244655609,
0.028445305302739143,
-0.09492450207471848,
0.0005245872307568789,
0.04714144766330719,
-0.10854468494653702,
0.10484902560710907,
-0.08850599080324173,
-0.07458822429180145,
0.030113032087683678,
0.07229446619749069,
-0.026489080861210823,
0.04199130833148956,
-0.016608048230409622,
0.04287169873714447,
0.030963655561208725,
-0.02564043365418911,
-0.10097435116767883,
-0.1315382868051529,
0.06285586208105087,
-0.01940135285258293,
0.15933512151241302,
0.024496424943208694,
0.07903190702199936,
0.09395018965005875,
0.010178790427744389,
-0.06115098297595978,
0.10987704247236252,
0.0381496399641037,
0.020021751523017883,
0.07947186380624771,
0.129474937915802,
-0.03962066024541855,
0.1416381448507309,
0.00015608753892593086,
-0.028481576591730118,
-0.034168586134910583,
-0.021009283140301704,
0.002099961508065462,
-0.14364393055438995,
-0.0071601420640945435,
-0.06613275408744812,
0.12594637274742126,
0.2010553777217865,
-0.05372657999396324,
-0.022823071107268333,
-0.04094264283776283,
0.06856974214315414,
-0.008983531035482883,
0.08309970051050186,
-0.007925174199044704,
-0.1690853387117386,
0.008958005346357822,
-0.03020533360540867,
0.010128824971616268,
-0.18482749164104462,
-0.04464269429445267,
-0.03534797579050064,
-0.031251147389411926,
-0.0875798910856247,
0.14684854447841644,
0.07521407306194305,
0.018950214609503746,
-0.04453122615814209,
-0.1916903406381607,
-0.02819770947098732,
0.050017330795526505,
-0.14346164464950562,
-0.12040025740861893
] |
null | null |
transformers
|
# CodeTrans model for source code summarization python
Pretrained model on programming language python using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for python code snippets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_python_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = '''with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == " ; Include this text " : line = line + " Include below " out_file . write ( line ) '''
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/python/large_model.ipynb).
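The model expects tokenized input, as noted above. Below is a minimal sketch (not part of the original CodeTrans release) that uses the standard-library `tokenize` module to turn raw Python source into the space-separated token form; `pipeline` is the object created in the snippet above:
```python
import io
import tokenize

def tokenize_python(source: str) -> str:
    """Split Python source into space-separated lexical tokens."""
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.string.strip():  # drop NEWLINE/INDENT/ENDMARKER whitespace tokens
            tokens.append(tok.string)
    return " ".join(tokens)

raw_code = "def add(a, b):\n    return a + b\n"
# tokenized form: "def add ( a , b ) : return a + b"
pipeline([tokenize_python(raw_code)])
```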
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
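For readers who want to approximate this setup in PyTorch, the snippet below is a hedged sketch using the `Adafactor` implementation shipped with `transformers`; the original training used T5's own stack, so treat every hyperparameter here as an illustrative assumption rather than the published configuration:
```python
from transformers.optimization import Adafactor

# With relative_step=True, Adafactor derives its own learning rate,
# which approximates the inverse square root schedule described above.
optimizer = Adafactor(
    pipeline.model.parameters(),
    lr=None,               # learning rate is computed internally
    relative_step=True,
    warmup_init=True,      # start small and ramp up during warmup
    scale_parameter=True,
)
```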
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "'with open ( CODE_STRING , CODE_STRING ) as in_file : buf = in_file . readlines ( ) with open ( CODE_STRING , CODE_STRING ) as out_file : for line in buf : if line == \" ; Include this text \" : line = line + \" Include below \" out_file . write ( line ) '"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_python_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization python
====================================================
Pretrained model on programming language python using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the source code summarization task for python code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.07580780237913132,
0.07301829755306244,
-0.00149509496986866,
0.10911087691783905,
0.06102689355611801,
0.01890529878437519,
0.03455503284931183,
0.10433420538902283,
-0.051695071160793304,
0.06163419410586357,
0.06384504586458206,
-0.037589333951473236,
0.060599442571401596,
0.17400656640529633,
0.02628050185739994,
-0.20297038555145264,
-0.02528456225991249,
0.02816866897046566,
-0.05027467757463455,
0.10825125128030777,
0.08888652920722961,
-0.07025005668401718,
0.07046197354793549,
-0.04149521887302399,
-0.10222478210926056,
0.058683842420578,
-0.03665480762720108,
-0.04407693073153496,
0.09304409474134445,
0.05500590801239014,
0.12345973402261734,
-0.04789692535996437,
0.06621108949184418,
-0.2164877951145172,
-0.0031490460969507694,
0.030731135979294777,
0.05113350600004196,
0.025826826691627502,
0.05992906913161278,
0.04971781373023987,
0.15739788115024567,
-0.03335835784673691,
0.05749988183379173,
0.0534319132566452,
-0.06657592207193375,
-0.09581258147954941,
-0.046478670090436935,
0.04750942438840866,
0.09648100286722183,
0.10188265889883041,
-0.01488514244556427,
0.023280078545212746,
-0.07829318195581436,
0.0914180725812912,
0.12074841558933258,
-0.229562908411026,
-0.021209150552749634,
0.10390133410692215,
0.09296085685491562,
0.05956890806555748,
-0.0820954442024231,
-0.03810952231287956,
0.10068479925394058,
0.043430786579847336,
0.05865459144115448,
-0.08398231118917465,
-0.0526314452290535,
-0.009872366674244404,
-0.06443477421998978,
-0.053380176424980164,
0.1353713870048523,
0.029452189803123474,
-0.057474974542856216,
-0.09962153434753418,
-0.05971285328269005,
-0.19435174763202667,
0.03525708615779877,
0.03025275096297264,
0.018533824011683464,
0.0019949523266404867,
0.004334462806582451,
-0.002140642376616597,
-0.08978370577096939,
-0.08827174454927444,
0.00016353999671991915,
0.06406742334365845,
0.07262858003377914,
0.02727137692272663,
-0.01633754000067711,
0.08224283158779144,
-0.015598023310303688,
-0.0508892685174942,
-0.024267729371786118,
0.01505062822252512,
-0.11216223984956741,
0.023626022040843964,
-0.015934573486447334,
-0.06917551159858704,
-0.015484738163650036,
0.1131606474518776,
-0.0742274820804596,
0.0820571705698967,
0.11307784169912338,
0.0018134466372430325,
0.0008554586675018072,
0.23928417265415192,
0.04547838121652603,
-0.15400353074073792,
0.008905777707695961,
0.012334167025983334,
0.007936825975775719,
-0.0029832206200808287,
-0.06527009606361389,
-0.03852783888578415,
0.015128224156796932,
0.0659368485212326,
-0.12578780949115753,
0.023611653596162796,
-0.04168972745537758,
-0.005677252076566219,
0.019973214715719223,
-0.12743473052978516,
0.03896263614296913,
0.0031765950843691826,
-0.06286269426345825,
-0.019517455250024796,
0.0745168924331665,
-0.12146269530057907,
-0.11838343739509583,
0.04449420049786568,
-0.04934202507138252,
-0.028232915326952934,
-0.12739330530166626,
-0.13548052310943604,
-0.019096503034234047,
-0.04066643863916397,
0.01731587015092373,
-0.1100853756070137,
-0.10014516860246658,
-0.005353463813662529,
0.03945385292172432,
-0.004431547597050667,
-0.010415281169116497,
-0.058381617069244385,
0.0026847077533602715,
0.005514288321137428,
-0.01726260967552662,
0.002669147914275527,
-0.046709198504686356,
0.09210165590047836,
0.09426652640104294,
0.04881470277905464,
0.00713615445420146,
0.028939122334122658,
-0.08353681117296219,
0.06941883265972137,
-0.07308751344680786,
0.06697040051221848,
-0.01822010613977909,
0.06581593304872513,
-0.09782201796770096,
-0.08925147354602814,
0.03590131178498268,
0.053022246807813644,
0.05892687290906906,
0.02152804657816887,
-0.10952701419591904,
0.016028670594096184,
0.15248708426952362,
-0.10231097787618637,
-0.11472562700510025,
0.11962280422449112,
-0.007125846575945616,
0.007497068960219622,
0.08156269043684006,
0.1269843876361847,
0.1423662304878235,
-0.1103026270866394,
-0.04396682605147362,
0.09035617113113403,
0.06012991815805435,
-0.04241978004574776,
0.08819237351417542,
0.013393708504736423,
0.025298956781625748,
0.025098182260990143,
0.04564771056175232,
0.07664854824542999,
0.00024227786343544722,
-0.03303341567516327,
-0.031703341752290726,
-0.07877327501773834,
-0.015280392952263355,
-0.007088004145771265,
0.020584726706147194,
-0.07033929973840714,
-0.08848512917757034,
0.026472896337509155,
0.17491519451141357,
-0.1070101335644722,
0.02651372365653515,
-0.08947445452213287,
-0.03841151297092438,
-0.0763251930475235,
0.0142450500279665,
-0.10195580869913101,
0.0038996876683086157,
0.045664217323064804,
-0.0327596478164196,
0.06518128514289856,
0.06989730894565582,
0.0073151299729943275,
0.02832392230629921,
-0.04431929439306259,
-0.04671667516231537,
-0.045240093022584915,
-0.06509874016046524,
-0.12022792547941208,
-0.0011936095543205738,
-0.08594454079866409,
-0.023248212411999702,
-0.05251513421535492,
-0.1719827800989151,
0.010348446667194366,
-0.026166344061493874,
0.020217815414071083,
0.009771454147994518,
-0.020576581358909607,
0.021141482517123222,
0.05292164906859398,
-0.05727861821651459,
-0.09249678999185562,
0.009084120392799377,
0.02249462902545929,
-0.10012045502662659,
-0.0458916537463665,
-0.12795475125312805,
-0.04861323535442352,
0.08172775059938431,
0.09176730364561081,
-0.09240317344665527,
0.016042640432715416,
-0.017028454691171646,
-0.0476471409201622,
-0.049306925386190414,
-0.07111844420433044,
0.1913973093032837,
0.0063096145167946815,
0.17752841114997864,
-0.1276124268770218,
-0.04907074198126793,
-0.04230848699808121,
-0.0155801335349679,
0.024347811937332153,
0.14693646132946014,
-0.005106509663164616,
-0.07934363931417465,
0.04965214431285858,
-0.03150032088160515,
-0.07214582711458206,
0.15592466294765472,
-0.0035859723575413227,
-0.0923948660492897,
0.013018428348004818,
0.09609317779541016,
-0.004014418926090002,
0.14156277477741241,
-0.06900011748075485,
-0.002277067396789789,
-0.00657650688663125,
0.023335648700594902,
0.04362853989005089,
-0.12464188784360886,
0.027810286730527878,
0.06278540194034576,
-0.05872653052210808,
-0.05673228204250336,
-0.04141775518655777,
-0.04035016521811485,
0.03142256289720535,
0.009516150690615177,
0.0012391889467835426,
-0.019822336733341217,
-0.023027468472719193,
-0.08867187052965164,
0.21224801242351532,
-0.09355122596025467,
-0.2242194265127182,
-0.1790461391210556,
0.013576366007328033,
-0.02708584815263748,
-0.007734267506748438,
0.04577334597706795,
-0.11344289779663086,
-0.07588960230350494,
-0.0928489938378334,
0.15779393911361694,
-0.149368554353714,
0.0018672366859391332,
-0.03368522599339485,
0.040542539209127426,
0.03523537144064903,
-0.17968489229679108,
0.031659822911024094,
-0.0042595406994223595,
-0.01095596980303526,
0.001768004964105785,
-0.07197102159261703,
0.07865553349256516,
0.13107477128505707,
-0.08121377229690552,
0.0096964156255126,
-0.013645165599882603,
0.1640729010105133,
-0.05653483793139458,
0.014538676477968693,
0.1841609627008438,
0.014673281461000443,
0.041668858379125595,
0.04380324110388756,
0.00931619107723236,
-0.09382433444261551,
0.06622868031263351,
0.048121798783540726,
-0.038231998682022095,
-0.23735538125038147,
-0.015959547832608223,
-0.07049116492271423,
0.06253716349601746,
0.13488738238811493,
0.048892855644226074,
-0.133185014128685,
0.028910009190440178,
-0.009726033546030521,
0.14412334561347961,
-0.026290209963917732,
0.07031036168336868,
0.03172540292143822,
0.013233364559710026,
0.010055825114250183,
-0.09242716431617737,
-0.003231082344427705,
0.0699002593755722,
0.10587474703788757,
0.20471614599227905,
-0.06855626404285431,
0.18752628564834595,
-0.0018607918173074722,
0.08010166883468628,
0.03652854636311531,
0.09981276839971542,
-0.127972811460495,
0.008981308899819851,
0.00843221414834261,
-0.0072012608870863914,
-0.06649816036224365,
0.04998350143432617,
-0.02812717854976654,
0.06223950535058975,
-0.0637374296784401,
0.021007459610700607,
0.013975752517580986,
0.1622449904680252,
0.06828928738832474,
-0.19519738852977753,
-0.11542457342147827,
0.015988115221261978,
-0.11675748229026794,
-0.11190823465585709,
0.07079581171274185,
0.19377678632736206,
-0.05694000422954559,
0.01937766745686531,
-0.019921228289604187,
0.1385228931903839,
-0.08728962391614914,
-0.03300166130065918,
0.015225842595100403,
0.0828917846083641,
0.0053307125344872475,
0.12175525724887848,
-0.2500254213809967,
0.0669390857219696,
0.01233318168669939,
0.09937960654497147,
-0.005393241997808218,
0.07222629338502884,
-0.03795885667204857,
-0.0003342694544699043,
0.06627446413040161,
-0.0012289212318137288,
-0.06330307573080063,
-0.2057167887687683,
-0.05472274497151375,
0.022501220926642418,
0.05973431468009949,
-0.0019394417759031057,
0.10123132169246674,
-0.030800431966781616,
0.05736405774950981,
-0.020185712724924088,
-0.1567041426897049,
-0.03657686710357666,
-0.14329135417938232,
-0.04014863818883896,
-0.015868958085775375,
-0.011525570414960384,
-0.02917775884270668,
0.027121251448988914,
0.015539737418293953,
0.22192427515983582,
-0.14723964035511017,
-0.1054425910115242,
-0.09347779303789139,
0.08205942064523697,
0.13522124290466309,
-0.09929957240819931,
0.03938974067568779,
0.020806139335036278,
0.05324926599860191,
-0.04002612829208374,
-0.07343817502260208,
0.03124845214188099,
-0.05206010490655899,
-0.05987127497792244,
-0.020947400480508804,
0.09908819198608398,
-0.011527362279593945,
0.04853542521595955,
0.002978340722620487,
-0.0726330503821373,
-0.058371856808662415,
-0.12345772981643677,
-0.1035444438457489,
-0.006348957773298025,
0.02922077104449272,
0.01058398000895977,
-0.08966853469610214,
0.08312743157148361,
-0.02194220945239067,
-0.07248055934906006,
0.09134624153375626,
0.1667202115058899,
-0.06937150657176971,
0.00633148243650794,
0.08279944956302643,
-0.05932164192199707,
-0.1618158519268036,
-0.054206278175115585,
0.05146943777799606,
0.0827309787273407,
-0.028037726879119873,
-0.15516529977321625,
0.06967441737651825,
0.03846818953752518,
0.03822189196944237,
0.030014829710125923,
-0.3049512803554535,
-0.12084124982357025,
0.042662955820560455,
0.06968189030885696,
0.05072929710149765,
-0.11386820673942566,
-0.03589506819844246,
-0.061016716063022614,
-0.061896320432424545,
0.03965412452816963,
0.06279905140399933,
0.12822286784648895,
-0.03413965925574303,
0.045295197516679764,
0.03763188049197197,
-0.022247646003961563,
0.08928769826889038,
-0.013356601819396019,
0.09026549756526947,
-0.019059211015701294,
0.028712257742881775,
0.05897076055407524,
-0.06113264337182045,
0.17604656517505646,
-0.18359872698783875,
0.08286791294813156,
-0.2148977518081665,
-0.05479135364294052,
-0.007174285128712654,
-0.002701825462281704,
-0.03074626997113228,
-0.060625623911619186,
-0.11539319902658463,
0.0032436170149594545,
0.05840889737010002,
-0.023387040942907333,
0.06945603340864182,
-0.02307788096368313,
-0.05012880638241768,
0.03848469629883766,
0.07718061655759811,
-0.00991769414395094,
-0.16049951314926147,
0.030007077381014824,
0.03376084566116333,
0.08710421621799469,
-0.20429830253124237,
0.012721247971057892,
0.11975976079702377,
0.017620276659727097,
0.10761550813913345,
0.0163718331605196,
-0.07469560950994492,
0.0445762500166893,
0.0678069144487381,
-0.02912798710167408,
-0.09654869884252548,
-0.007791520562022924,
-0.024386465549468994,
-0.09544429928064346,
0.02226748876273632,
0.09311609715223312,
-0.061239130795001984,
-0.01753116212785244,
-0.008907785639166832,
0.006471999920904636,
-0.06824169307947159,
0.1869649738073349,
0.02356630191206932,
0.0744791030883789,
-0.05974770709872246,
0.0754547119140625,
0.09574026614427567,
-0.10646890848875046,
0.013199210166931152,
0.1741427630186081,
-0.07693920284509659,
-0.01838178187608719,
0.06453321129083633,
0.09113943576812744,
-0.06524600088596344,
-0.05026217922568321,
-0.08758444339036942,
-0.06515368819236755,
0.009258908219635487,
0.031854599714279175,
0.06898550689220428,
0.07722637057304382,
-0.04962660372257233,
0.021035708487033844,
-0.10881754755973816,
0.09236546605825424,
0.07599377632141113,
0.0530051551759243,
-0.1424144059419632,
0.15825243294239044,
0.039247266948223114,
0.06923714280128479,
0.002224572468549013,
0.02460520714521408,
-0.09793760627508163,
0.043830566108226776,
-0.023565979674458504,
0.04498368874192238,
-0.008978837169706821,
0.05191723257303238,
-0.02586962841451168,
0.02972853183746338,
-0.027440058067440987,
0.04664812237024307,
-0.03939994052052498,
-0.0361805334687233,
-0.029957305639982224,
0.019761567935347557,
-0.05510234460234642,
-0.015573522076010704,
0.007158683147281408,
-0.08336468786001205,
0.0924539864063263,
-0.05587778612971306,
-0.0014118686085566878,
0.016128286719322205,
-0.0038648229092359543,
0.05937504768371582,
0.03343787044286728,
0.04663224518299103,
-0.010078173130750656,
0.0031250081956386566,
0.04179918393492699,
0.009618059732019901,
-0.0029864339157938957,
-0.006295738276094198,
0.05143261328339577,
-0.1501445770263672,
-0.09389086067676544,
-0.09416930377483368,
-0.06759575754404068,
-0.06751839071512222,
0.07602759450674057,
0.07387971878051758,
0.07620804011821747,
0.1073555201292038,
-0.038215648382902145,
0.014905480667948723,
-0.13384070992469788,
-0.03822889178991318,
0.04634658992290497,
-0.014069883152842522,
-0.07627763599157333,
-0.04396788403391838,
0.05355233699083328,
-0.040301110595464706,
0.11653176695108414,
-0.0005805511609651148,
0.050425633788108826,
-0.012833170592784882,
-0.050379522144794464,
0.005494068842381239,
-0.0009326954022981226,
0.22287645936012268,
-0.08382193744182587,
0.013524445705115795,
-0.004749109968543053,
0.003996436484158039,
0.054900843650102615,
0.1417817771434784,
0.08478610217571259,
0.12561668455600739,
0.04103465378284454,
0.10193004459142685,
-0.06332148611545563,
-0.02747129648923874,
-0.17818604409694672,
0.03697904199361801,
-0.005793158896267414,
0.03688128665089607,
-0.025836003944277763,
0.13202159106731415,
0.13841238617897034,
-0.12452424317598343,
0.09747099131345749,
0.028313640505075455,
-0.10141771286725998,
-0.04978952929377556,
-0.06229596212506294,
-0.04675116389989853,
-0.11102081090211868,
0.015674538910388947,
-0.11232171207666397,
0.021991603076457977,
0.08790146559476852,
0.039402127265930176,
-0.0254326481372118,
0.14795635640621185,
-0.013703801669180393,
-0.07915143668651581,
0.01734597608447075,
0.03329751640558243,
0.04161204770207405,
0.10783616453409195,
0.016431542113423347,
0.05601390078663826,
-0.06939326226711273,
0.07946950942277908,
0.0211578831076622,
0.008637168444693089,
0.024581395089626312,
-0.006798480171710253,
-0.0005649842787533998,
-0.04520849511027336,
0.014017710462212563,
0.0739215835928917,
0.17253413796424866,
0.045751430094242096,
-0.057335011661052704,
-0.05345163121819496,
0.19856007397174835,
-0.05985233932733536,
-0.07328643649816513,
-0.12605123221874237,
0.1804223507642746,
0.053054723888635635,
0.028555074706673622,
0.004592975601553917,
-0.08391644060611725,
-0.04287208989262581,
0.22856508195400238,
0.02813832275569439,
-0.012200294993817806,
-0.036678247153759,
-0.01613309048116207,
-0.010401293635368347,
-0.03169409558176994,
0.14496032893657684,
0.025215672329068184,
0.2347448319196701,
-0.001848883810453117,
-0.01462074089795351,
-0.0386945940554142,
-0.029260752722620964,
-0.043610580265522,
0.18604899942874908,
-0.04007880017161369,
0.026652108877897263,
-0.09369712322950363,
-0.003274379763752222,
0.03168099746108055,
-0.10578325390815735,
0.10195542871952057,
-0.10252600163221359,
-0.0684082955121994,
0.022756924852728844,
0.07022695243358612,
-0.03005881980061531,
0.04467632994055748,
-0.013036971911787987,
0.04673442617058754,
0.03299936279654503,
-0.020985735580325127,
-0.11447785049676895,
-0.14170946180820465,
0.04962119087576866,
-0.017436377704143524,
0.14598935842514038,
0.02003452368080616,
0.06564504653215408,
0.09288572520017624,
0.021863751113414764,
-0.05995047092437744,
0.10405636578798294,
0.033195860683918,
0.004557047504931688,
0.07320349663496017,
0.11851289868354797,
-0.04261312261223793,
0.15616145730018616,
0.0013705701567232609,
-0.02631823904812336,
-0.027208644896745682,
-0.01827840320765972,
-0.0005037814844399691,
-0.14331920444965363,
-0.010651858523488045,
-0.0664537325501442,
0.1273086816072464,
0.20344021916389465,
-0.04491395130753517,
-0.022195111960172653,
-0.04390387237071991,
0.06361190229654312,
-0.016499748453497887,
0.08260833472013474,
-0.0018853949150070548,
-0.160588338971138,
-0.0032630066853016615,
-0.01865031011402607,
0.0046578142791986465,
-0.18364030122756958,
-0.04199624061584473,
-0.039357468485832214,
-0.03192152827978134,
-0.0917878970503807,
0.14576129615306854,
0.07108793407678604,
0.019612429663538933,
-0.04098113626241684,
-0.17578725516796112,
-0.01832110807299614,
0.053131476044654846,
-0.1415153592824936,
-0.11924762278795242
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/source%20code%20summarization/sql/large_model.ipynb).
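If your SQL is not already tokenized, the small sketch below (an assumption for illustration, not code from the CodeTrans repository) reproduces the spacing of the example input with a regex that separates identifiers from punctuation:
```python
import re

def tokenize_sql(query: str) -> str:
    """Split a SQL string into space-separated word and punctuation tokens."""
    return " ".join(re.findall(r"\w+|[^\w\s]", query))

raw_query = "select time(col0) from tab0"
# input becomes "select time ( col0 ) from tab0"
pipeline([tokenize_sql(raw_query)])
```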
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
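As a small illustration of that schedule, following the usual T5 convention lr(n) = 1 / sqrt(max(n, k)) with warmup length k (the warmup value below is an assumption, not a number reported for CodeTrans):
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    """Inverse square root learning rate: flat during warmup, then decaying."""
    return 1.0 / max(step, warmup_steps) ** 0.5

print(inverse_sqrt_lr(1))        # 0.01  -- constant while step < warmup_steps
print(inverse_sqrt_lr(120_000))  # ~0.0029 at the final pre-training step
```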
## Evaluation results
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the source code summarization tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
146
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 120,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14150746166706085,
0.0192634928971529,
-0.0009251083829440176,
0.1353294998407364,
0.11713334172964096,
0.015150378458201885,
0.06481426954269409,
0.04591074585914612,
-0.0322674885392189,
0.021645819768309593,
0.04650293290615082,
0.012075209990143776,
0.044758446514606476,
0.20009109377861023,
0.01418483629822731,
-0.15295587480068207,
-0.016956835985183716,
0.019751980900764465,
-0.08532576262950897,
0.12277925759553909,
0.09769263863563538,
-0.07478201389312744,
0.05045466497540474,
-0.04313862696290016,
-0.20707300305366516,
0.06203262880444527,
0.0042955693788826466,
-0.059077419340610504,
0.10623740404844284,
0.06692224740982056,
0.1282586008310318,
-0.010148796252906322,
0.0425143763422966,
-0.13084478676319122,
0.009122180752456188,
0.01938965916633606,
0.03750799968838692,
0.023622650653123856,
0.06421608477830887,
0.03968937322497368,
0.13752268254756927,
-0.007427734322845936,
0.03394985944032669,
0.0638696476817131,
-0.06766839325428009,
-0.09693513810634613,
-0.02908044494688511,
0.030893893912434578,
0.052852001041173935,
0.09618114680051804,
-0.010303606279194355,
0.09769134223461151,
-0.151717871427536,
0.12082991749048233,
0.08631479740142822,
-0.2528076171875,
-0.011858050711452961,
0.09309926629066467,
0.05520164594054222,
0.05966467410326004,
-0.04490584507584572,
-0.05018172785639763,
0.07963386923074722,
0.05384497344493866,
0.03959996998310089,
-0.08363650739192963,
-0.08191712945699692,
0.03396470099687576,
-0.09590332210063934,
-0.07343379408121109,
0.21823503077030182,
0.013096261769533157,
-0.07778468728065491,
-0.04951823502779007,
-0.04337894544005394,
-0.11500784009695053,
0.03575757145881653,
0.037964191287755966,
0.0040986193343997,
-0.0190596766769886,
0.0012897527776658535,
0.026870157569646835,
-0.09566749632358551,
-0.14130744338035583,
0.006539724767208099,
0.0740116685628891,
0.04775843024253845,
0.03865144029259682,
-0.06620106101036072,
0.10975144803524017,
0.014935319311916828,
-0.04010274261236191,
-0.012662619352340698,
-0.02939409203827381,
-0.13017860054969788,
0.040325917303562164,
-0.06449669599533081,
-0.1834290325641632,
-0.000005014415364712477,
0.012680364772677422,
-0.037508346140384674,
0.05767162889242172,
0.05462902784347534,
0.026603691279888153,
0.032260626554489136,
0.19634075462818146,
0.02248532511293888,
-0.11482677608728409,
0.05526943504810333,
0.03941858559846878,
-0.0408446341753006,
-0.011892175301909447,
-0.06096155568957329,
-0.07884582877159119,
0.057979170233011246,
0.09228172898292542,
-0.13767531514167786,
0.0473153293132782,
-0.06806955486536026,
-0.042878247797489166,
-0.011800417676568031,
-0.1678648442029953,
-0.0037255696952342987,
0.02340872958302498,
-0.06704428046941757,
-0.05214161425828934,
0.09377780556678772,
-0.15495584905147552,
-0.14999136328697205,
-0.032391324639320374,
-0.08133641630411148,
-0.04721217229962349,
-0.1582101732492447,
-0.14201533794403076,
-0.018157340586185455,
-0.03713143616914749,
0.005555359646677971,
-0.06978287547826767,
-0.17370352149009705,
-0.01871858909726143,
0.027072878554463387,
0.004666144028306007,
-0.004307194612920284,
-0.07364224642515182,
0.001038295216858387,
-0.015899792313575745,
-0.038867224007844925,
0.006634094752371311,
-0.0471503809094429,
0.11088930815458298,
0.09778953343629837,
0.0295493695884943,
-0.0369364358484745,
0.04842935875058174,
-0.07679621875286102,
0.06719313561916351,
-0.10873966664075851,
0.11719170212745667,
-0.06937530636787415,
0.07325617223978043,
-0.04294021427631378,
-0.09619176387786865,
0.04268272593617439,
0.05861901864409447,
0.07138632982969284,
0.05895261466503143,
-0.16774620115756989,
-0.025758842006325722,
0.19100970029830933,
-0.12346457690000534,
-0.10697900503873825,
0.11083284020423889,
-0.046519551426172256,
0.05221163108944893,
0.09114280343055725,
0.13291223347187042,
0.14150461554527283,
-0.022539939731359482,
0.0038735358975827694,
0.06303969025611877,
0.03947100043296814,
-0.11479490250349045,
0.0843164473772049,
0.045877885073423386,
-0.09270927309989929,
0.0633322149515152,
-0.005391696002334356,
0.09748315811157227,
-0.01356218010187149,
-0.036682527512311935,
-0.04953598603606224,
-0.07510732859373093,
-0.002857009880244732,
-0.0013218976091593504,
0.06488344818353653,
-0.06484813243150711,
-0.06635712087154388,
0.0778820812702179,
0.1558556705713272,
-0.1150406152009964,
-0.010262835770845413,
-0.08926897495985031,
0.03093353845179081,
-0.06096457317471504,
0.017750157043337822,
-0.17274028062820435,
0.01660979352891445,
0.06069066748023033,
-0.02788713574409485,
0.07233341038227081,
0.12524056434631348,
0.01431768573820591,
0.054272349923849106,
0.00459689274430275,
-0.004360913299024105,
-0.09234532713890076,
-0.05773451551795006,
-0.07119628041982651,
-0.05996006354689598,
-0.09553468972444534,
-0.04554562643170357,
-0.0005304248770698905,
-0.1889188140630722,
0.013752005994319916,
0.011277968995273113,
0.019425909966230392,
0.0029262807220220566,
-0.015663187950849533,
0.022199541330337524,
0.06573569029569626,
-0.05412308871746063,
-0.034745439887046814,
0.026470819488167763,
0.025183452293276787,
-0.07350575923919678,
-0.05126539617776871,
-0.0868673324584961,
0.004131339490413666,
0.10800102353096008,
0.05482087284326553,
-0.07482295483350754,
-0.0015914775431156158,
-0.019112080335617065,
-0.037832118570804596,
0.01806410402059555,
-0.06874097883701324,
0.16369210183620453,
0.0027980725280940533,
0.19546474516391754,
-0.1498306393623352,
-0.0249146930873394,
-0.027484556660056114,
0.022672191262245178,
0.056718610227108,
0.13489636778831482,
0.014943894930183887,
-0.13605082035064697,
0.06444454193115234,
0.004464366007596254,
-0.07097937166690826,
0.21158073842525482,
-0.049605801701545715,
-0.09053567796945572,
0.01282753236591816,
0.09493224322795868,
-0.024035470560193062,
0.16112270951271057,
-0.16521522402763367,
-0.021215302869677544,
0.011512371711432934,
0.0071516805328428745,
0.06556427478790283,
-0.13368600606918335,
0.009587258100509644,
0.017786841839551926,
-0.06810078024864197,
-0.1195487380027771,
-0.027197379618883133,
-0.0038144418504089117,
0.035604897886514664,
-0.0014418831560760736,
0.00007389835081994534,
0.017322588711977005,
-0.04398486390709877,
-0.10580994188785553,
0.22429315745830536,
-0.10803097486495972,
-0.23313744366168976,
-0.20402006804943085,
0.11635955423116684,
-0.0526270791888237,
0.0014624366303905845,
0.02348354272544384,
-0.086826391518116,
-0.06488628685474396,
-0.05659790337085724,
0.1730959266424179,
-0.07425229996442795,
-0.008854053914546967,
-0.022623809054493904,
0.06399723887443542,
0.02527938224375248,
-0.2040068656206131,
0.036917611956596375,
-0.016099657863378525,
-0.016248794272542,
-0.009250089526176453,
-0.08939272910356522,
0.08473145961761475,
0.1475866138935089,
-0.061541374772787094,
0.019632285460829735,
-0.005343076307326555,
0.21009561419487,
-0.02764347568154335,
-0.060655977576971054,
0.13155530393123627,
-0.010592110455036163,
0.01115352101624012,
0.026415668427944183,
0.0036776280030608177,
-0.0890667736530304,
0.05250818282365799,
0.005744848400354385,
-0.029565928503870964,
-0.2735719382762909,
-0.024477887898683548,
-0.08553172647953033,
0.04462767392396927,
0.03342990204691887,
0.04453679546713829,
-0.11001008003950119,
0.032871607691049576,
0.04439379274845123,
0.12847813963890076,
-0.018433135002851486,
0.032162606716156006,
0.0764906257390976,
-0.004387045279145241,
0.016592472791671753,
-0.09726637601852417,
0.0018839589320123196,
0.08527366816997528,
0.09973318129777908,
0.2674618065357208,
-0.0982867106795311,
0.20382998883724213,
0.03700357303023338,
0.0653182715177536,
0.04670396447181702,
0.15269002318382263,
-0.10685458034276962,
0.03163740038871765,
-0.0030006645247340202,
-0.008382024243474007,
-0.1374659240245819,
0.028797481209039688,
-0.03316438943147659,
0.067739337682724,
-0.10929447412490845,
-0.022327445447444916,
0.009607376530766487,
0.17880332469940186,
0.03265734389424324,
-0.22008666396141052,
-0.12891410291194916,
0.02016649767756462,
-0.10180162638425827,
-0.09529122710227966,
0.05823134258389473,
0.22459740936756134,
-0.07666005194187164,
-0.02135705202817917,
-0.013339650817215443,
0.13076262176036835,
-0.03787623718380928,
-0.021988870576024055,
-0.042012427002191544,
0.07280934602022171,
0.01833990029990673,
0.13644112646579742,
-0.24227911233901978,
0.14741896092891693,
0.0001434664591215551,
0.060554541647434235,
-0.035277530550956726,
0.06596706062555313,
-0.03247413411736488,
0.043545860797166824,
0.04821721464395523,
-0.007940849289298058,
-0.039730820804834366,
-0.17775645852088928,
-0.006070527248084545,
0.035804618149995804,
0.04159862920641899,
0.04457846283912659,
0.07265720516443253,
-0.006287123076617718,
0.0458478145301342,
-0.007291487418115139,
-0.11971431225538254,
-0.07169067859649658,
-0.10762488842010498,
-0.01785988360643387,
-0.046363089233636856,
-0.02556927502155304,
-0.05549721047282219,
-0.008174329996109009,
0.07527003437280655,
0.18746842443943024,
-0.10940033942461014,
-0.08574198186397552,
-0.08850537240505219,
0.07390017062425613,
0.1332874447107315,
-0.07843877375125885,
0.058562856167554855,
-0.004912613891065121,
0.033575836569070816,
0.006724199745804071,
-0.08984271436929703,
0.05789950489997864,
-0.03704354912042618,
-0.05818506330251694,
-0.02697555348277092,
0.09575895220041275,
0.0067076971754431725,
0.027554228901863098,
0.0014119278639554977,
-0.07901926338672638,
-0.0421060211956501,
-0.1256837546825409,
-0.11158140748739243,
-0.04136066511273384,
0.0019024887587875128,
0.06644266843795776,
-0.13651813566684723,
-0.048606988042593,
-0.0005191932432353497,
-0.030762596055865288,
0.14223545789718628,
0.14356361329555511,
-0.06965134292840958,
0.037811197340488434,
0.12162609398365021,
-0.055987995117902756,
-0.185407355427742,
0.010371426120400429,
0.06697163730859756,
0.11913701146841049,
-0.030238619074225426,
-0.18744802474975586,
0.06030154600739479,
0.026390934363007545,
0.03735730051994324,
0.038191426545381546,
-0.30562540888786316,
-0.12145154923200607,
0.0665467232465744,
0.1252269148826599,
0.07367010414600372,
-0.10970687866210938,
-0.044005852192640305,
-0.059034451842308044,
-0.11317066103219986,
0.09958015382289886,
-0.019513089209794998,
0.13605089485645294,
-0.0377008281648159,
0.04541619494557381,
0.03318091481924057,
-0.047598596662282944,
0.07055673748254776,
0.023467740043997765,
0.10167863965034485,
-0.031095821410417557,
0.02312542498111725,
0.12936441600322723,
-0.02787182107567787,
0.16788560152053833,
-0.13308225572109222,
0.10932005196809769,
-0.2207392156124115,
-0.06876801699399948,
-0.06623683124780655,
0.010413728654384613,
-0.03260029852390289,
-0.037681080400943756,
-0.06785212457180023,
0.03110036812722683,
-0.0005354411550797522,
-0.010423574596643448,
0.01723559945821762,
-0.034045252948999405,
-0.025396525859832764,
0.10768510401248932,
0.10945125669240952,
-0.002713470021262765,
-0.09246191382408142,
0.044344522058963776,
0.047211654484272,
0.09022935479879379,
-0.1964247077703476,
0.030995870009064674,
0.11451885849237442,
0.024009589105844498,
0.11216261237859726,
0.04853689670562744,
-0.09673725068569183,
0.04613706097006798,
0.08760572969913483,
-0.08146809786558151,
-0.09801211953163147,
-0.01951628364622593,
-0.05456678196787834,
-0.07714308053255081,
0.05707192048430443,
0.09453631937503815,
-0.036338046193122864,
-0.019478794187307358,
-0.023658860474824905,
-0.02936621941626072,
-0.10640876740217209,
0.20746669173240662,
0.052351560443639755,
0.083382748067379,
-0.07030472904443741,
0.07238410413265228,
0.07853780686855316,
-0.08421630412340164,
0.00848371721804142,
0.1834973394870758,
-0.1116962805390358,
-0.043887484818696976,
0.02669452503323555,
0.16018661856651306,
-0.04287440702319145,
-0.05237080529332161,
-0.12933407723903656,
-0.08109313994646072,
0.037217069417238235,
0.16085956990718842,
0.08406341820955276,
0.10375460982322693,
-0.051416460424661636,
0.00258481758646667,
-0.07173751294612885,
0.08051887899637222,
0.08631192147731781,
0.029565829783678055,
-0.11738599091768265,
0.13638298213481903,
0.034579768776893616,
0.10974681377410889,
-0.028607429936528206,
-0.00937506090849638,
-0.11388802528381348,
0.05657866969704628,
-0.10558240860700607,
0.03408949077129364,
-0.014331640675663948,
0.048962052911520004,
-0.022037509828805923,
-0.005287758074700832,
-0.028320766985416412,
0.06467455625534058,
-0.08729340136051178,
-0.003307904815301299,
-0.004756573121994734,
0.04188067466020584,
-0.049675147980451584,
-0.009869289584457874,
0.03511770814657211,
-0.08351071178913116,
0.1237001121044159,
-0.007801925763487816,
-0.028801245614886284,
0.0832785964012146,
-0.04842458665370941,
0.03773847967386246,
0.008450335822999477,
0.059037141501903534,
0.01075501088052988,
0.03615967929363251,
0.07796800136566162,
0.033309608697891235,
0.05506831407546997,
0.007054723799228668,
0.10322129726409912,
-0.1325102597475052,
-0.10165420919656754,
-0.03088582120835781,
-0.09804898500442505,
-0.06317431479692459,
0.08740077167749405,
0.0637826919555664,
0.09969580173492432,
0.0993933230638504,
-0.030080147087574005,
0.011147675104439259,
-0.14234799146652222,
-0.053772710263729095,
0.027228107675909996,
-0.03019619546830654,
-0.09707100689411163,
-0.06371042877435684,
0.048432283103466034,
-0.025502080097794533,
0.14475876092910767,
0.0015174616128206253,
0.05354446917772293,
-0.021790392696857452,
-0.04819311946630478,
0.02905944548547268,
0.028678899630904198,
0.22872841358184814,
-0.06411296129226685,
0.03629588335752487,
0.003053752239793539,
0.011472710408270359,
-0.007393501698970795,
0.11521068960428238,
0.11983820796012878,
0.1426120102405548,
-0.014592981897294521,
0.09945894032716751,
0.021193698048591614,
-0.006161944009363651,
-0.09773507714271545,
0.021453823894262314,
0.0025441243778914213,
0.06370625644922256,
-0.0677260309457779,
0.1708623766899109,
0.067732073366642,
-0.10880985856056213,
0.10995589196681976,
0.01654711551964283,
-0.12770698964595795,
-0.04582592472434044,
-0.001220440841279924,
-0.030243728309869766,
-0.13402751088142395,
0.023598812520503998,
-0.12402824312448502,
-0.015180514194071293,
0.08022883534431458,
0.04716258496046066,
-0.06586149334907532,
0.16849297285079956,
0.02512015774846077,
-0.058241620659828186,
0.04297759011387825,
0.004957451485097408,
0.03165746480226517,
0.03748781979084015,
0.01959262229502201,
0.04319535940885544,
-0.022964272648096085,
0.039156220853328705,
0.019117046147584915,
-0.020330261439085007,
-0.0040030148811638355,
-0.0101248100399971,
0.007547825574874878,
-0.03200504556298256,
0.02755242958664894,
0.045926284044981,
0.15284758806228638,
0.029262429103255272,
-0.07660365104675293,
-0.03441678360104561,
0.17332400381565094,
-0.04668554663658142,
-0.08208616822957993,
-0.1362709403038025,
0.17662738263607025,
0.033881332725286484,
0.0223989337682724,
0.020213916897773743,
-0.09444574266672134,
-0.0391414538025856,
0.1983002871274948,
0.08398529142141342,
-0.0394241139292717,
-0.02856709249317646,
-0.003792110364884138,
-0.0056769466027617455,
-0.04707521200180054,
0.1986977607011795,
0.023762505501508713,
0.2509269714355469,
0.005358383059501648,
-0.009679269976913929,
-0.06062117591500282,
-0.034393712878227234,
-0.006958439946174622,
0.14619527757167816,
-0.045899271965026855,
-0.017847442999482155,
-0.08148515969514847,
0.008706934750080109,
-0.004133032169193029,
-0.10816032439470291,
0.07102379202842712,
-0.1231723353266716,
-0.09827591478824615,
-0.03581322357058525,
0.03553266078233719,
-0.03160335496068001,
0.023039402440190315,
-0.03681199625134468,
0.04874369874596596,
0.06498567759990692,
-0.025116674602031708,
-0.1174573302268982,
-0.15367984771728516,
0.1004619151353836,
-0.044740229845047,
0.14566245675086975,
-0.012207170948386192,
0.12534773349761963,
0.09424270689487457,
0.04229748621582985,
-0.05586804077029228,
0.09539926052093506,
0.035317301750183105,
0.033494919538497925,
0.04692482203245163,
0.10641667991876602,
-0.0452771931886673,
0.16863803565502167,
-0.05561057850718498,
-0.03472407907247543,
-0.008766548708081245,
-0.05596684664487839,
-0.014999350532889366,
-0.15863709151744843,
-0.0031435389537364244,
-0.10514868050813675,
0.10411030054092407,
0.19334805011749268,
-0.04222315177321434,
-0.01988520659506321,
-0.09754098951816559,
0.10001549124717712,
-0.0225503109395504,
0.05296994000673294,
-0.03522520512342453,
-0.18397298455238342,
0.00543932244181633,
0.00971018522977829,
0.009272013790905476,
-0.25296562910079956,
-0.01727038435637951,
-0.03767528012394905,
-0.01900182105600834,
-0.06401313096284866,
0.15873096883296967,
0.0918809100985527,
0.052331794053316116,
-0.03505522385239601,
-0.14650265872478485,
-0.035720642656087875,
0.055947817862033844,
-0.12680596113204956,
-0.13119396567344666
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the sql code snippets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/source%20code%20summarization/sql/large_model.ipynb).
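The pipeline above expects space-separated tokens, as in the `tokenized_code` string. The exact preprocessing used for training is not documented here, but a minimal whitespace/punctuation splitter (an assumption for illustration, not the official tokenizer) can approximate it:

```python
import re

def tokenize_sql(sql):
    # Split identifiers/keywords and punctuation into separate tokens so the
    # input resembles the `tokenized_code` example above (sketch only).
    return " ".join(re.findall(r"\w+|[^\w\s]", sql))

print(tokenize_sql("select time(col0) from tab0"))
# -> select time ( col0 ) from tab0
```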
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
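For readers reproducing this setup, the `transformers` library ships an `Adafactor` implementation whose relative-step mode applies this inverse square root schedule. A minimal sketch, assuming default hyperparameters (the exact values used for pre-training are not given here):

```python
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("t5-large")
optimizer = Adafactor(
    model.parameters(),
    lr=None,             # let Adafactor derive the step size
    relative_step=True,  # inverse square root schedule, as described above
    warmup_init=True,
    scale_parameter=True,
)
```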
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
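A rough equivalent of this fine-tuning stage with the `Seq2SeqTrainer` API is sketched below; the toy dataset, preprocessing, and batch layout are assumptions (the original training used the T5 TPU codebase, not this Trainer):

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelWithLMHead, DataCollatorForSeq2Seq,
                          Seq2SeqTrainingArguments, Seq2SeqTrainer)

name = "SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)

# Toy one-example dataset standing in for the sql split (assumption).
raw = Dataset.from_dict({
    "code": ["select time ( col0 ) from tab0"],
    "doc": ["returns the time column of tab0"],
})

def preprocess(batch):
    enc = tokenizer(batch["code"], truncation=True, max_length=512)
    enc["labels"] = tokenizer(batch["doc"], truncation=True, max_length=512)["input_ids"]
    return enc

train = raw.map(preprocess, batched=True, remove_columns=["code", "doc"])

args = Seq2SeqTrainingArguments(
    output_dir="sql-finetune-sketch",
    max_steps=100,                   # matches the step count above
    per_device_train_batch_size=32,
    gradient_accumulation_steps=8,   # 32 x 8 = 256 effective batch size
)
collator = DataCollatorForSeq2Seq(tokenizer, model=model)
Seq2SeqTrainer(model=model, args=args, train_dataset=train,
               data_collator=collator).train()
```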
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
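The BLEU numbers above are corpus-level scores. A hedged sketch of how such a score can be computed with the `sacrebleu` package (the evaluation scripts behind this table may use different tokenization and smoothing):

```python
import sacrebleu

# Toy example: one generated description scored against one reference.
hypotheses = ["returns the time column of tab0"]
references = [["returns the time column of table tab0"]]
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```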
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_sql_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the source code summarization task for the sql code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 100 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09301870316267014,
0.07630050927400589,
-0.0014476686483249068,
0.1041579321026802,
0.04817010089755058,
0.020649610087275505,
0.058112937957048416,
0.0917515680193901,
-0.03080238588154316,
0.06593430787324905,
0.07159009575843811,
-0.03290485963225365,
0.06204814463853836,
0.18618221580982208,
0.02907133661210537,
-0.17472922801971436,
-0.01760326698422432,
0.026201365515589714,
-0.040561337023973465,
0.10577713698148727,
0.09681995213031769,
-0.08646571636199951,
0.06419854611158371,
-0.03468802943825722,
-0.11048070341348648,
0.0570775642991066,
-0.04716544225811958,
-0.042730070650577545,
0.09240549057722092,
0.06115959212183952,
0.11395405232906342,
-0.02751237154006958,
0.07607817649841309,
-0.21624146401882172,
0.0013577017234638333,
0.014407802373170853,
0.050600480288267136,
0.023943984881043434,
0.06503386050462723,
0.06735078990459442,
0.11541495472192764,
-0.039583105593919754,
0.03663202375173569,
0.05143994465470314,
-0.0613514743745327,
-0.069892019033432,
-0.05601227656006813,
0.07399789243936539,
0.10811495780944824,
0.08907224982976913,
-0.014379393309354782,
0.017847074195742607,
-0.0869641825556755,
0.08786175400018692,
0.12035513669252396,
-0.23066262900829315,
-0.02301531471312046,
0.10413065552711487,
0.09499386698007584,
0.05366353690624237,
-0.08306417614221573,
-0.03181092441082001,
0.09536135196685791,
0.034652743488550186,
0.04669622704386711,
-0.08389437198638916,
-0.04937497526407242,
0.010009770281612873,
-0.05604415759444237,
-0.05480603873729706,
0.14004920423030853,
0.053252238780260086,
-0.054143961519002914,
-0.09926125407218933,
-0.04624873772263527,
-0.18732310831546783,
0.04052721709012985,
0.017238818109035492,
0.021185794845223427,
-0.009634105488657951,
0.01580660045146942,
-0.008141007274389267,
-0.10127854347229004,
-0.10008252412080765,
-0.007956587709486485,
0.06240154802799225,
0.07022407650947571,
0.023726599290966988,
-0.006213766988366842,
0.0726151391863823,
-0.011664553545415401,
-0.05128904804587364,
-0.03225896134972572,
0.0144418366253376,
-0.139519602060318,
0.023200921714305878,
-0.029886459931731224,
-0.06814347952604294,
-0.004671755712479353,
0.10152756422758102,
-0.06854166090488434,
0.08264520019292831,
0.13098470866680145,
0.004568697419017553,
0.0006956496508792043,
0.2403012365102768,
0.042436592280864716,
-0.15187764167785645,
-0.0007024778751656413,
0.02303498424589634,
0.002576621016487479,
-0.002874754834920168,
-0.06929578632116318,
-0.03580757975578308,
0.006886579561978579,
0.0566873736679554,
-0.14135219156742096,
0.018559858202934265,
-0.04270190745592117,
-0.010573876090347767,
0.044905636459589005,
-0.13609784841537476,
0.03427082672715187,
0.011217564344406128,
-0.05442938953638077,
-0.06176963448524475,
0.0569092258810997,
-0.11121848970651627,
-0.11671629548072815,
0.01827222853899002,
-0.04971766471862793,
-0.029195135459303856,
-0.12170800566673279,
-0.1199367567896843,
-0.015357088297605515,
-0.05399512127041817,
0.016396326944231987,
-0.09705794602632523,
-0.11034741252660751,
-0.01188550516963005,
0.03037141263484955,
-0.007559790275990963,
-0.016700584441423416,
-0.053163062781095505,
0.0168379507958889,
-0.004231353290379047,
-0.027802573516964912,
0.011863857507705688,
-0.04319089651107788,
0.08322659134864807,
0.10022959858179092,
0.04998957738280296,
0.01161626074463129,
0.026985036209225655,
-0.0704045444726944,
0.06767573952674866,
-0.06493668258190155,
0.07758862525224686,
-0.024383287876844406,
0.052911803126335144,
-0.09420554339885712,
-0.07261884212493896,
0.03930274397134781,
0.057951927185058594,
0.057376161217689514,
0.02155592292547226,
-0.12250741571187973,
0.01593155227601528,
0.15346719324588776,
-0.09230229258537292,
-0.12927362322807312,
0.11598894000053406,
-0.0072848680429160595,
-0.002797722816467285,
0.07012630999088287,
0.12987171113491058,
0.1282280683517456,
-0.09201536327600479,
-0.045392464846372604,
0.09165249764919281,
0.07029891014099121,
-0.04748856648802757,
0.10299153625965118,
0.014831725507974625,
0.047020286321640015,
0.024444635957479477,
0.04591915383934975,
0.06239668279886246,
-0.003252530237659812,
-0.04206964746117592,
-0.024557773023843765,
-0.08156993985176086,
-0.030801482498645782,
-0.015315425582230091,
0.026982275769114494,
-0.06777259707450867,
-0.06990843266248703,
0.023401910439133644,
0.16476306319236755,
-0.09477272629737854,
0.02600065991282463,
-0.10242203623056412,
-0.062348414212465286,
-0.08167219907045364,
0.0131142633035779,
-0.11015903204679489,
0.007112532388418913,
0.045895665884017944,
-0.058119773864746094,
0.07138650119304657,
0.08491305261850357,
0.0037687895819544792,
0.03486092388629913,
-0.039763350039720535,
-0.04255807027220726,
-0.05014263466000557,
-0.05324936658143997,
-0.13181930780410767,
-0.015284773893654346,
-0.09393256157636642,
-0.029490340501070023,
-0.05213668569922447,
-0.16966849565505981,
0.015060207806527615,
-0.028278954327106476,
0.02357902191579342,
-0.0031425280030816793,
-0.02033175155520439,
0.028887230902910233,
0.05203833058476448,
-0.05348018556833267,
-0.08404508233070374,
0.009278588928282261,
0.025645187124609947,
-0.12966173887252808,
-0.046361275017261505,
-0.11494956165552139,
-0.045998819172382355,
0.07354879379272461,
0.08958029747009277,
-0.07859767228364944,
-0.004077159333974123,
-0.023693492636084557,
-0.05441195145249367,
-0.039949867874383926,
-0.07672165334224701,
0.16944873332977295,
0.012248752638697624,
0.1675572395324707,
-0.13172239065170288,
-0.05323592200875282,
-0.03834737464785576,
-0.013245124369859695,
0.01576269418001175,
0.15439803898334503,
-0.002122604288160801,
-0.10574419051408768,
0.049196917563676834,
-0.008733781985938549,
-0.059000466018915176,
0.15155167877674103,
-0.009516080841422081,
-0.09476828575134277,
0.006570014171302319,
0.08958999067544937,
-0.020102672278881073,
0.14595772325992584,
-0.08566174656152725,
-0.009578605182468891,
-0.0012060115113854408,
0.022867530584335327,
0.03848826885223389,
-0.13267865777015686,
0.02581068128347397,
0.059165533632040024,
-0.06928379833698273,
-0.061151280999183655,
-0.04271269217133522,
-0.0412917323410511,
0.02865017205476761,
0.006972983945161104,
0.005371500737965107,
-0.009332999587059021,
-0.028644181787967682,
-0.09489955008029938,
0.20762836933135986,
-0.08873237669467926,
-0.21324549615383148,
-0.16981719434261322,
0.026563869789242744,
-0.04881110042333603,
0.001952146994881332,
0.05280262976884842,
-0.11809086799621582,
-0.06793254613876343,
-0.09182035177946091,
0.1317964345216751,
-0.11954748630523682,
0.005453170742839575,
-0.0011873574694618583,
0.02985447086393833,
0.03314824029803276,
-0.17510028183460236,
0.030499130487442017,
-0.008565102703869343,
-0.0008675808203406632,
-0.0003126147494185716,
-0.06237614527344704,
0.09341277927160263,
0.12143836170434952,
-0.07239563763141632,
0.018154939636588097,
-0.01024907361716032,
0.17556622624397278,
-0.049365051090717316,
0.019946666434407234,
0.19224907457828522,
0.014834512025117874,
0.03651804476976395,
0.04913659393787384,
0.017219116911292076,
-0.08916042745113373,
0.056448616087436676,
0.05461828410625458,
-0.03269582986831665,
-0.2494659721851349,
-0.00038008333649486303,
-0.07211112976074219,
0.03474891930818558,
0.10750056803226471,
0.053286682814359665,
-0.1384841799736023,
0.034340716898441315,
-0.0076531777158379555,
0.133574977517128,
-0.038035038858652115,
0.05507391691207886,
0.033207014203071594,
0.015324709936976433,
0.014059744775295258,
-0.09482033550739288,
0.002957260934635997,
0.07421096414327621,
0.12047728151082993,
0.20203916728496552,
-0.046057362109422684,
0.20646964013576508,
0.01815693825483322,
0.06778440624475479,
0.026762980967760086,
0.09920819103717804,
-0.11192908883094788,
-0.0005698124878108501,
0.0015958133153617382,
-0.0062739672139286995,
-0.07447277754545212,
0.058656394481658936,
-0.021885689347982407,
0.06107493117451668,
-0.054370198398828506,
0.03604449704289436,
0.02196403034031391,
0.17340461909770966,
0.05913075432181358,
-0.18755769729614258,
-0.12360177934169769,
0.019453108310699463,
-0.11135684698820114,
-0.10600810497999191,
0.059228796511888504,
0.19256824254989624,
-0.04857228696346283,
0.020899692550301552,
-0.005423428490757942,
0.13805672526359558,
-0.0647655799984932,
-0.019005024805665016,
0.017105024307966232,
0.07322555035352707,
0.01226042676717043,
0.1307641863822937,
-0.25944456458091736,
0.08748684823513031,
0.016802040860056877,
0.09255855530500412,
-0.01569591462612152,
0.0738416388630867,
-0.032945722341537476,
-0.001849616877734661,
0.07391899824142456,
-0.0015623735962435603,
-0.09654086083173752,
-0.21451997756958008,
-0.05004323646426201,
0.028755413368344307,
0.05896037444472313,
-0.009171984158456326,
0.09979170560836792,
-0.013893543742597103,
0.0646403506398201,
-0.02482031285762787,
-0.10773583501577377,
-0.057702988386154175,
-0.1382203847169876,
-0.027321362867951393,
-0.0017942313570529222,
-0.021825799718499184,
-0.030316699296236038,
0.022648900747299194,
-0.015233326703310013,
0.21528181433677673,
-0.16463351249694824,
-0.10184665024280548,
-0.09372101724147797,
0.08397896587848663,
0.12849436700344086,
-0.1044306829571724,
0.02101256512105465,
0.016803273931145668,
0.05785501003265381,
-0.03324870392680168,
-0.056146301329135895,
0.024453461170196533,
-0.05734266713261604,
-0.07395222783088684,
-0.02800862491130829,
0.09188580513000488,
-0.013611365109682083,
0.04868536442518234,
0.008037037216126919,
-0.08237644284963608,
-0.055938731878995895,
-0.13178032636642456,
-0.07404348999261856,
-0.020595885813236237,
0.028904812410473824,
0.004469771403819323,
-0.09630798548460007,
0.08403807133436203,
-0.013077862560749054,
-0.0925079956650734,
0.08646775037050247,
0.19050905108451843,
-0.06447415798902512,
0.014610164798796177,
0.10928977280855179,
-0.054668910801410675,
-0.15850049257278442,
-0.06952540576457977,
0.05457129329442978,
0.09217649698257446,
-0.015822244808077812,
-0.1506875455379486,
0.08131620287895203,
0.04030846804380417,
0.02998209372162819,
-0.0052881017327308655,
-0.27654582262039185,
-0.1265544593334198,
0.05564077943563461,
0.06732700765132904,
0.04245854914188385,
-0.12184881418943405,
-0.04676118493080139,
-0.0642561987042427,
-0.07597605884075165,
0.03475162386894226,
0.05574500560760498,
0.1344139724969864,
-0.029846910387277603,
0.027858393266797066,
0.026819244027137756,
-0.02504350058734417,
0.10155832022428513,
0.010433482006192207,
0.09233403950929642,
-0.015095682814717293,
0.028007872402668,
0.0716983750462532,
-0.062272634357213974,
0.1661137342453003,
-0.16733552515506744,
0.0806940421462059,
-0.2192051112651825,
-0.05617767199873924,
-0.005485954228788614,
-0.013162543065845966,
-0.036535393446683884,
-0.05392110347747803,
-0.0996408611536026,
0.004042124375700951,
0.05521366745233536,
-0.025549214333295822,
0.07369748502969742,
-0.03004850074648857,
-0.05333661288022995,
0.06404811143875122,
0.0879826545715332,
-0.02771921269595623,
-0.1464979499578476,
0.016417864710092545,
0.030137352645397186,
0.07725195586681366,
-0.18401792645454407,
0.019874969497323036,
0.11777713149785995,
0.009795916266739368,
0.1123625859618187,
0.016699327155947685,
-0.06530150771141052,
0.0494445338845253,
0.06692487746477127,
-0.02610694244503975,
-0.1043987050652504,
-0.005566565785557032,
-0.037611015141010284,
-0.09808768332004547,
0.03672010824084282,
0.08733832091093063,
-0.04105040803551674,
-0.019573841243982315,
-0.014000223949551582,
0.0004886625101789832,
-0.0646958127617836,
0.1994573473930359,
0.02158852107822895,
0.08859723806381226,
-0.059258487075567245,
0.07776593416929245,
0.09838572144508362,
-0.10171368718147278,
0.013327157124876976,
0.1631554365158081,
-0.08107081800699234,
-0.022509237751364708,
0.047131117433309555,
0.1001308336853981,
-0.06664356589317322,
-0.06161147728562355,
-0.10673186182975769,
-0.07556050270795822,
0.019759876653552055,
0.028309054672718048,
0.06966042518615723,
0.07512980699539185,
-0.03788204491138458,
0.023798083886504173,
-0.08108849078416824,
0.10020217299461365,
0.07297719269990921,
0.04864281788468361,
-0.138814315199852,
0.14635968208312988,
0.030915172770619392,
0.09503509849309921,
0.0010049985721707344,
0.03340325877070427,
-0.09477106481790543,
0.043419163674116135,
-0.04987666755914688,
0.043239932507276535,
-0.016618747264146805,
0.048247113823890686,
-0.02505432441830635,
0.019713908433914185,
-0.029002679511904716,
0.05008465796709061,
-0.04225466027855873,
-0.030058210715651512,
-0.02636175975203514,
0.0428612194955349,
-0.058889780193567276,
-0.016111981123685837,
0.0064397575333714485,
-0.07476218044757843,
0.10134896636009216,
-0.06317799538373947,
-0.008923115208745003,
0.0028344879392534494,
-0.013438362628221512,
0.06814210116863251,
0.0215196143835783,
0.05601956695318222,
-0.01215142197906971,
-0.0039257267490029335,
0.039989668875932693,
0.018752673640847206,
-0.006432651542127132,
-0.008205274119973183,
0.04678938165307045,
-0.13743096590042114,
-0.1016692966222763,
-0.0992528572678566,
-0.06616175174713135,
-0.07117012143135071,
0.08058414608240128,
0.08984042704105377,
0.07314826548099518,
0.09939182549715042,
-0.030184414237737656,
-0.0035561230033636093,
-0.14690251648426056,
-0.034009940922260284,
0.052300937473773956,
-0.010364423505961895,
-0.1089557334780693,
-0.06206231191754341,
0.0574863962829113,
-0.0436059832572937,
0.1372884213924408,
-0.015828223899006844,
0.04726511985063553,
-0.008457621559500694,
-0.0526999868452549,
0.003675724845379591,
-0.0019317443948239088,
0.22584669291973114,
-0.08585873246192932,
0.01496815960854292,
0.009502417407929897,
-0.0027011013589799404,
0.03479059413075447,
0.13853058218955994,
0.09461283683776855,
0.13550938665866852,
0.05443396419286728,
0.11005063354969025,
-0.052088286727666855,
-0.03304097428917885,
-0.17228670418262482,
0.0546182282269001,
-0.006154348142445087,
0.031240450218319893,
-0.029502341523766518,
0.10550689697265625,
0.14040106534957886,
-0.13281920552253723,
0.09768814593553543,
0.016529899090528488,
-0.09883465617895126,
-0.048350151628255844,
-0.05766000971198082,
-0.05362366512417793,
-0.09293057024478912,
0.016893809661269188,
-0.11378297954797745,
0.025204509496688843,
0.08748647570610046,
0.036625705659389496,
-0.020821135491132736,
0.14937089383602142,
-0.014059382490813732,
-0.06140433996915817,
0.0010674134828150272,
0.023549649864435196,
0.04827383905649185,
0.10256090015172958,
0.00969765055924654,
0.0804479718208313,
-0.05448419973254204,
0.07772470265626907,
0.017363516613841057,
0.020354116335511208,
0.012601516209542751,
-0.010500306263566017,
-0.002550664357841015,
-0.046289995312690735,
0.003615819849073887,
0.08442341536283493,
0.15498656034469604,
0.041337914764881134,
-0.052719105035066605,
-0.04661089926958084,
0.17530639469623566,
-0.05488509684801102,
-0.07582894712686539,
-0.12067648768424988,
0.15583285689353943,
0.06062044948339462,
0.02653389796614647,
0.009955613873898983,
-0.08474984765052795,
-0.0386354960501194,
0.2206091433763504,
0.01675277017056942,
-0.03468768671154976,
-0.03851578012108803,
-0.008794181980192661,
-0.006996873766183853,
-0.033667512238025665,
0.1401340663433075,
0.017621971666812897,
0.21424712240695953,
-0.0022135148756206036,
-0.007491094525903463,
-0.03662240505218506,
-0.03853067010641098,
-0.042311251163482666,
0.1976902335882187,
-0.033160340040922165,
0.03619769588112831,
-0.09176043421030045,
-0.00905757024884224,
0.0409141480922699,
-0.11847597360610962,
0.10622180253267288,
-0.09261263906955719,
-0.07361496239900589,
0.031044717878103256,
0.08121152967214584,
-0.02011038549244404,
0.030233101919293404,
-0.01513722538948059,
0.05434968322515488,
0.03464902937412262,
-0.027753282338380814,
-0.09780889749526978,
-0.1178264394402504,
0.05895833671092987,
-0.022646402940154076,
0.15467464923858643,
0.022612201049923897,
0.08370085060596466,
0.09480507671833038,
0.0181270781904459,
-0.06303675472736359,
0.1205298900604248,
0.03469628840684891,
0.012371526099741459,
0.08192339539527893,
0.11304077506065369,
-0.03614293783903122,
0.14675217866897583,
0.005696719512343407,
-0.032485149800777435,
-0.026324473321437836,
-0.009980556555092335,
-0.014033946208655834,
-0.14513519406318665,
0.004247492179274559,
-0.06760242581367493,
0.1328703910112381,
0.1879468858242035,
-0.04663750156760216,
-0.021523309871554375,
-0.03759315237402916,
0.074337437748909,
-0.018970901146531105,
0.0887443944811821,
-0.0070474776439368725,
-0.16157811880111694,
0.024083679541945457,
-0.016603905707597733,
0.012856646440923214,
-0.1734744906425476,
-0.05572296679019928,
-0.03425164148211479,
-0.02941458858549595,
-0.07458082586526871,
0.13932737708091736,
0.08339447528123856,
0.027843397110700607,
-0.04539243131875992,
-0.20040512084960938,
-0.023149937391281128,
0.04561888054013252,
-0.14305847883224487,
-0.12158652395009995
] |
null | null |
transformers
|
# CodeTrans model for source code summarization sql
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the sql code snippets.
## Intended uses & limitations
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_large_source_code_summarization_sql_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "select time ( col0 ) from tab0"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/source%20code%20summarization/sql/large_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
## Evaluation results
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | SQL | C# |
| -------------------- | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 8.45 | 17.55 | 19.74 |
| CodeTrans-ST-Base | 9.12 | 15.00 | 18.65 |
| CodeTrans-TF-Small | 10.06 | 17.71 | 20.40 |
| CodeTrans-TF-Base | 10.94 | 17.66 | 21.12 |
| CodeTrans-TF-Large | 12.41 | 18.40 | 21.43 |
| CodeTrans-MT-Small | 13.11 | 19.15 | 22.39 |
| CodeTrans-MT-Base | **13.37** | 19.24 | 23.20 |
| CodeTrans-MT-Large | 13.24 | 19.40 | **23.57** |
| CodeTrans-MT-TF-Small | 12.10 | 18.25 | 22.03 |
| CodeTrans-MT-TF-Base | 10.64 | 16.91 | 21.40 |
| CodeTrans-MT-TF-Large | 12.14 | **19.98** | 21.10 |
| CODE-NN | -- | 18.40 | 20.50 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "select time ( col0 ) from tab0"}]}
|
summarization
|
SEBIS/code_trans_t5_large_source_code_summarization_sql_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for source code summarization sql
=================================================
Pretrained model on programming language sql using the t5 large model architecture. It was first released in
this repository. This model is trained on tokenized sql code functions: it works best with tokenized sql functions.
Model description
-----------------
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the source code summarization task for the sql code snippets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the sql function or be fine-tuned on other sql code tasks. It can be used on unparsed and untokenized sql code. However, if the sql code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset containing sql code.
Evaluation results
------------------
For the source code summarization tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
43,
61,
87,
111
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate sql function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 200 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing sql code.\n\n\nEvaluation results\n------------------\n\n\nFor the source code summarization tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11969118565320969,
0.08023728430271149,
-0.0011519391555339098,
0.11024146527051926,
0.045122187584638596,
0.027593201026320457,
0.04461585730314255,
0.09747698903083801,
-0.041880253702402115,
0.06227882206439972,
0.04354472458362579,
-0.0740668848156929,
0.06650934368371964,
0.19976010918617249,
0.02143152803182602,
-0.14391005039215088,
-0.03381099924445152,
0.04287223890423775,
-0.09955555200576782,
0.10871686786413193,
0.08377578109502792,
-0.08315080404281616,
0.07871804386377335,
-0.039757221937179565,
-0.12391301244497299,
0.049173641949892044,
-0.02045220509171486,
-0.023364359512925148,
0.09734588861465454,
0.0850638896226883,
0.12417539954185486,
-0.02491239458322525,
0.06778118759393692,
-0.18396779894828796,
0.006183661054819822,
0.028032835572957993,
0.05346091836690903,
0.03897896409034729,
0.05523134022951126,
0.08831267058849335,
0.1169983372092247,
-0.044311217963695526,
0.027717307209968567,
0.05646877363324165,
-0.0623924620449543,
-0.03177548944950104,
-0.059919871389865875,
0.07389196008443832,
0.06573522835969925,
0.09874629229307175,
-0.006200286094099283,
0.04592330753803253,
-0.09248027950525284,
0.08135213702917099,
0.08289910107851028,
-0.22893454134464264,
-0.02510782517492771,
0.09220288693904877,
0.0701606422662735,
0.03517983853816986,
-0.07746268063783646,
-0.03636070340871811,
0.09300558269023895,
0.04030976444482803,
0.05256329104304314,
-0.08323106914758682,
-0.0327167883515358,
0.00453916797414422,
-0.055351778864860535,
-0.05309734866023064,
0.1812915802001953,
0.04702990502119064,
-0.052032213658094406,
-0.09266408532857895,
-0.054965488612651825,
-0.1682426780462265,
0.04648097977042198,
0.005386314820498228,
0.009679770097136497,
-0.0013036943273618817,
-0.008065671660006046,
-0.013285425491631031,
-0.09964310377836227,
-0.11242375522851944,
0.0047919307835400105,
0.007477161008864641,
0.05595173314213753,
0.03456509858369827,
-0.03436146676540375,
0.08221053332090378,
0.03140072524547577,
-0.04190608114004135,
-0.007749253883957863,
0.005410549696534872,
-0.12973658740520477,
0.007124172057956457,
-0.02300100028514862,
-0.08726362138986588,
-0.0023902987595647573,
0.057792939245700836,
-0.09787116199731827,
0.07463427633047104,
0.11177204549312592,
0.011906429193913937,
0.008911623619496822,
0.2099258452653885,
0.03755200654268265,
-0.15346300601959229,
0.03290407359600067,
0.02483247220516205,
-0.004142468795180321,
0.015463882125914097,
-0.0618228018283844,
-0.04852021858096123,
0.02152761071920395,
0.05640998110175133,
-0.13689643144607544,
0.034986525774002075,
-0.056003935635089874,
-0.018164198845624924,
0.05215724557638168,
-0.13194887340068817,
0.026266513392329216,
0.015600866638123989,
-0.061666350811719894,
-0.048729099333286285,
0.08976615965366364,
-0.1416952759027481,
-0.12569572031497955,
0.002715751528739929,
-0.054584018886089325,
-0.04424021393060684,
-0.11705416440963745,
-0.11249277740716934,
-0.004139692522585392,
-0.015950756147503853,
-0.0001405386719852686,
-0.10522665828466415,
-0.10640092939138412,
-0.009298787452280521,
0.03893793374300003,
-0.006331015378236771,
-0.03463663160800934,
-0.040291111916303635,
0.012387022376060486,
-0.004599358886480331,
-0.024457383900880814,
0.0012163614155724645,
-0.038523051887750626,
0.09209291636943817,
0.07269749790430069,
0.04036233574151993,
-0.025813495740294456,
0.025597894564270973,
-0.090933658182621,
0.0765642449259758,
-0.0944393053650856,
0.08037648350000381,
-0.00567961297929287,
0.043118566274642944,
-0.09316837042570114,
-0.07149774581193924,
-0.005379326641559601,
0.05710708349943161,
0.07725102454423904,
0.04245653748512268,
-0.1607590913772583,
0.028279978781938553,
0.15688934922218323,
-0.11209610849618912,
-0.12922480702400208,
0.1138775497674942,
-0.014997181482613087,
0.031085092574357986,
0.07577154785394669,
0.12546049058437347,
0.15798227488994598,
-0.07853243499994278,
-0.03318510949611664,
0.09271339327096939,
0.04112425073981285,
-0.06559930741786957,
0.06263095140457153,
0.02032693661749363,
-0.012337331660091877,
0.019607387483119965,
0.06742747128009796,
0.06816129386425018,
-0.003834692994132638,
-0.043495308607816696,
-0.038574542850255966,
-0.09529165178537369,
-0.050357166677713394,
-0.009419610723853111,
0.022485122084617615,
-0.05366906896233559,
-0.06211991235613823,
0.03197978809475899,
0.15628421306610107,
-0.08930947631597519,
0.01599416509270668,
-0.08694960176944733,
-0.04631206765770912,
-0.057640910148620605,
0.02075638435781002,
-0.12212703377008438,
-0.007593251299113035,
0.047677285969257355,
-0.04285501316189766,
0.07416162639856339,
0.10956287384033203,
0.00422622449696064,
0.02978503331542015,
-0.05259402468800545,
-0.031399041414260864,
-0.03144659847021103,
-0.061831410974264145,
-0.11129046976566315,
-0.022903451696038246,
-0.088686503469944,
-0.01777881383895874,
-0.026639029383659363,
-0.17661593854427338,
0.0007954513421282172,
0.003978285007178783,
0.030477656051516533,
0.009383377619087696,
-0.025097712874412537,
0.033782199025154114,
0.04438008740544319,
-0.050345320254564285,
-0.07927785068750381,
0.01779904030263424,
0.050087545067071915,
-0.11330699920654297,
-0.036789607256650925,
-0.09260176122188568,
-0.058880116790533066,
0.08472613245248795,
0.1029721274971962,
-0.11590256541967392,
-0.03697742521762848,
-0.024862321093678474,
-0.045886069536209106,
-0.04441521316766739,
-0.07348381727933884,
0.1591721773147583,
0.007088053971529007,
0.16029717028141022,
-0.14026419818401337,
-0.05185556784272194,
-0.04045192152261734,
0.012187303975224495,
0.035497065633535385,
0.14235080778598785,
0.01935119926929474,
-0.13710230588912964,
0.04394681379199028,
-0.035188060253858566,
-0.046322353184223175,
0.16946980357170105,
-0.022987956181168556,
-0.06953712552785873,
0.004174470901489258,
0.10367732495069504,
-0.0047893160954117775,
0.18618834018707275,
-0.05297002196311951,
0.011248926632106304,
-0.009829985909163952,
0.011864813044667244,
0.04163290932774544,
-0.1326434165239334,
0.02752288430929184,
0.04407760500907898,
-0.06803283840417862,
-0.04143717885017395,
-0.036455512046813965,
-0.027332965284585953,
0.04477224126458168,
0.01875717006623745,
0.033736974000930786,
-0.007632770109921694,
-0.03267466649413109,
-0.10526254028081894,
0.1856122761964798,
-0.08055819571018219,
-0.22547724843025208,
-0.16840091347694397,
0.1132662296295166,
-0.023983526974916458,
-0.003850616980344057,
0.028893543407320976,
-0.09479445964097977,
-0.06042177975177765,
-0.0957144945859909,
0.12967421114444733,
-0.09087899327278137,
0.0040765260346233845,
-0.033961113542318344,
0.056881133466959,
0.06672153621912003,
-0.1702185720205307,
0.0359359048306942,
-0.014335650019347668,
0.009090174920856953,
-0.019229810684919357,
-0.06830575317144394,
0.08795022964477539,
0.1156400740146637,
-0.07213618606328964,
0.01299016922712326,
-0.010896896943449974,
0.18433745205402374,
-0.04928913712501526,
0.053056441247463226,
0.16200220584869385,
0.010508108884096146,
0.02876267023384571,
0.05280933901667595,
0.010305354371666908,
-0.0898013710975647,
0.0674145370721817,
0.046555519104003906,
-0.037707921117544174,
-0.22025233507156372,
-0.029719440266489983,
-0.08203154057264328,
0.04303774610161781,
0.10783085972070694,
0.0472433865070343,
-0.15546733140945435,
0.03223268687725067,
-0.008322560228407383,
0.15341751277446747,
-0.03069142997264862,
0.04337391257286072,
0.027620453387498856,
0.02030901238322258,
0.0006375717930495739,
-0.10002239048480988,
0.008380020968616009,
0.08547814190387726,
0.11274631321430206,
0.19423384964466095,
-0.09790655225515366,
0.18714024126529694,
0.0035671258810907602,
0.09983134269714355,
0.044269461184740067,
0.10976999998092651,
-0.11358823627233505,
0.020091449841856956,
0.0032745967619121075,
-0.014851911924779415,
-0.08692934364080429,
0.05069858580827713,
-0.030000168830156326,
0.07825209200382233,
-0.07047931849956512,
0.042228490114212036,
0.019464978948235512,
0.18676647543907166,
0.07097060233354568,
-0.17429940402507782,
-0.13736805319786072,
0.005402950569987297,
-0.08791203796863556,
-0.09892978519201279,
0.06919576972723007,
0.2330641746520996,
-0.050593651831150055,
0.016998156905174255,
-0.006910642609000206,
0.13125170767307281,
-0.07480094581842422,
-0.01738920994102955,
0.038860272616147995,
0.07964972406625748,
0.008203313685953617,
0.12124741822481155,
-0.24205827713012695,
0.08808241039514542,
0.019140657037496567,
0.07833833992481232,
-0.02006080374121666,
0.05965879559516907,
-0.03370724990963936,
-0.005457346327602863,
0.07033717632293701,
0.011340034194290638,
-0.07437217980623245,
-0.19526608288288116,
-0.04701854661107063,
0.028910985216498375,
0.04341542348265648,
0.014023352414369583,
0.08631457388401031,
-0.02419600635766983,
0.04565152898430824,
-0.018064161762595177,
-0.10354346036911011,
-0.06824512779712677,
-0.14311254024505615,
-0.05119336396455765,
-0.003382202936336398,
-0.04970186576247215,
-0.036061111837625504,
0.04472251608967781,
0.05495936796069145,
0.21918149292469025,
-0.13837449252605438,
-0.0808972641825676,
-0.09318684041500092,
0.08514375239610672,
0.13528752326965332,
-0.08741454780101776,
0.030549516901373863,
0.015352629125118256,
0.061224572360515594,
-0.02606748230755329,
-0.0769106075167656,
0.03574135899543762,
-0.05074035003781319,
-0.07185772806406021,
-0.03397456184029579,
0.11170737445354462,
-0.007869742810726166,
0.054411567747592926,
0.021592196077108383,
-0.08912254124879837,
-0.031395602971315384,
-0.12425969541072845,
-0.08330811560153961,
-0.04494994506239891,
0.027466164901852608,
0.012483756989240646,
-0.13450291752815247,
0.058391500264406204,
0.005314232315868139,
-0.08131474256515503,
0.0894165113568306,
0.13812561333179474,
-0.07074704021215439,
0.020013101398944855,
0.07236190885305405,
-0.0515044741332531,
-0.18909138441085815,
-0.029819456860423088,
0.04540925472974777,
0.08526472002267838,
-0.025405755266547203,
-0.1464519500732422,
0.07389260083436966,
0.011298594996333122,
0.03330489248037338,
0.005949843674898148,
-0.24627520143985748,
-0.12611344456672668,
0.01452472060918808,
0.06400955468416214,
0.021332185715436935,
-0.10738090425729752,
-0.05054100975394249,
-0.06446757167577744,
-0.03312109783291817,
0.07993008941411972,
0.046401333063840866,
0.11601405590772629,
-0.025011103600263596,
0.026674747467041016,
0.04293432459235191,
-0.023286446928977966,
0.059648800641298294,
-0.009577179327607155,
0.09400969743728638,
-0.0238609928637743,
0.028511598706245422,
0.06457299739122391,
-0.059901539236307144,
0.17618699371814728,
-0.17042258381843567,
0.10092359036207199,
-0.1844867467880249,
-0.044368259608745575,
-0.02805912494659424,
0.0010064879897981882,
-0.0322878435254097,
-0.042172227054834366,
-0.11498469114303589,
0.041893135756254196,
0.05623816326260567,
-0.027198245748877525,
0.010722016915678978,
-0.018552251160144806,
-0.054832346737384796,
0.07476025074720383,
0.09965342283248901,
-0.006076320074498653,
-0.13616709411144257,
0.04131759703159332,
0.028622809797525406,
0.09134690463542938,
-0.1698082834482193,
0.026644060388207436,
0.10789638757705688,
0.014728114940226078,
0.09231072664260864,
0.016084078699350357,
-0.07400566339492798,
0.025750866159796715,
0.06857144087553024,
-0.06011556461453438,
-0.08759147673845291,
-0.0229891836643219,
-0.03258129209280014,
-0.0933443084359169,
0.016920000314712524,
0.0819188803434372,
-0.04993908479809761,
-0.0021391024347394705,
-0.006552629638463259,
-0.002263980684801936,
-0.0749034583568573,
0.18909981846809387,
0.022785769775509834,
0.07767876982688904,
-0.05257401242852211,
0.08162286877632141,
0.0924273207783699,
-0.10543199628591537,
0.0283904317766428,
0.16046789288520813,
-0.0936938226222992,
-0.025368334725499153,
0.08732137829065323,
0.11318113654851913,
-0.0386664979159832,
-0.055838070809841156,
-0.09467168897390366,
-0.08866580575704575,
0.01927550509572029,
0.07337775826454163,
0.064877949655056,
0.08924581855535507,
-0.033149588853120804,
0.008235149085521698,
-0.10407907515764236,
0.09589873999357224,
0.08550944179296494,
0.043026868253946304,
-0.12645173072814941,
0.15223069489002228,
0.027012109756469727,
0.07910396158695221,
-0.002870277501642704,
0.0362766869366169,
-0.11848339438438416,
0.03194228932261467,
-0.06250396370887756,
0.05113425478339195,
-0.001969743287190795,
0.04374115541577339,
-0.04032498598098755,
0.03606600686907768,
-0.02691327966749668,
0.04322671890258789,
-0.031955767422914505,
-0.02501397393643856,
-0.03226786106824875,
0.03020234778523445,
-0.05516958236694336,
-0.012853818945586681,
0.007801314815878868,
-0.08300934731960297,
0.09345796704292297,
-0.06051553413271904,
-0.016402242705225945,
0.002486326964572072,
0.007691385690122843,
0.03432561457157135,
0.008599418215453625,
0.06219301000237465,
-0.0009797345846891403,
0.005659829825162888,
0.035965465009212494,
0.026403721421957016,
-0.008373922668397427,
-0.009817291051149368,
0.0907631367444992,
-0.1253151148557663,
-0.08886025846004486,
-0.07871710509061813,
-0.06637779623270035,
-0.06569422781467438,
0.07220438867807388,
0.09577129036188126,
0.06598799675703049,
0.1016596332192421,
-0.04187103733420372,
-0.003823069389909506,
-0.18057098984718323,
-0.039914995431900024,
0.052285853773355484,
0.0007907289545983076,
-0.12142428755760193,
-0.04692201316356659,
0.06883257627487183,
-0.027854369953274727,
0.1251930445432663,
-0.013223597779870033,
0.035011064261198044,
-0.003951463848352432,
-0.04229632019996643,
-0.03157614544034004,
0.008043015375733376,
0.20107266306877136,
-0.09697537869215012,
0.00825404655188322,
-0.00229801912792027,
-0.0009367395541630685,
0.02269647642970085,
0.14716248214244843,
0.13043145835399628,
0.13648909330368042,
0.02402927726507187,
0.08852533251047134,
-0.035270582884550095,
-0.027339709922671318,
-0.1189565658569336,
0.06837424635887146,
-0.029931312426924706,
0.038245897740125656,
-0.04080767557024956,
0.12395646423101425,
0.07125787436962128,
-0.13961000740528107,
0.10494975000619888,
-0.01194628607481718,
-0.1030743271112442,
-0.0368223637342453,
-0.07717273384332657,
-0.04515568166971207,
-0.09556875377893448,
-0.0023339316248893738,
-0.10639692842960358,
-0.003653146792203188,
0.0660194605588913,
0.028742332011461258,
-0.033089712262153625,
0.17229171097278595,
-0.04610788822174072,
-0.05696727707982063,
0.01875362917780876,
0.049360353499650955,
0.03140240162611008,
0.08526156097650528,
0.017337925732135773,
0.061769235879182816,
-0.04431067034602165,
0.06225694343447685,
0.028318339958786964,
0.01589086838066578,
0.016772491857409477,
0.023050464689731598,
0.003975577186793089,
-0.0474197156727314,
-0.009971708059310913,
0.07755383104085922,
0.13374006748199463,
0.03787901625037193,
-0.04447130113840103,
-0.05080406740307808,
0.17340964078903198,
-0.05288701876997948,
-0.07088273763656616,
-0.13011276721954346,
0.15717042982578278,
0.023301444947719574,
0.015038687735795975,
0.020817004144191742,
-0.08449529856443405,
-0.015372790396213531,
0.2377549409866333,
0.06795855611562729,
-0.0768517330288887,
-0.029422760009765625,
-0.01628495566546917,
-0.009173501282930374,
-0.042400848120450974,
0.16110317409038544,
0.01574563793838024,
0.23440727591514587,
0.010057690553367138,
-0.0116060059517622,
-0.040880028158426285,
-0.041651058942079544,
-0.010052929632365704,
0.2035224735736847,
-0.03030434437096119,
0.04064805805683136,
-0.09643026441335678,
-0.013608100824058056,
0.017749710008502007,
-0.14655084908008575,
0.11214494705200195,
-0.12596093118190765,
-0.06979145854711533,
0.006421835161745548,
0.04743122309446335,
-0.03731273487210274,
0.036183953285217285,
-0.029405120760202408,
0.07312800735235214,
0.03793759644031525,
-0.028525885194540024,
-0.11452925950288773,
-0.13530364632606506,
0.06348146498203278,
-0.01298269908875227,
0.13835091888904572,
0.011374437250196934,
0.08190617710351944,
0.08901868015527725,
0.017210576683282852,
-0.07371993362903595,
0.08905576169490814,
0.034393440932035446,
-0.002537925960496068,
0.05295965075492859,
0.0956486314535141,
-0.04727177694439888,
0.18029999732971191,
0.006782247684895992,
-0.039674509316682816,
-0.028601830825209618,
-0.007890609093010426,
-0.010938254185020924,
-0.16028070449829102,
0.0057618082500994205,
-0.07170559465885162,
0.14471736550331116,
0.20002011954784393,
-0.045826446264982224,
-0.011327668093144894,
-0.053453344851732254,
0.08715138584375381,
-0.00962060410529375,
0.07702765613794327,
0.0013678097166121006,
-0.15776662528514862,
0.006642594002187252,
-0.04194845259189606,
0.008259287104010582,
-0.19981881976127625,
-0.06161845475435257,
-0.04311720281839371,
-0.028073979541659355,
-0.08783337473869324,
0.1531226933002472,
0.07759659737348557,
0.04672754183411598,
-0.04658239334821701,
-0.11499668657779694,
-0.02152913063764572,
0.04064545780420303,
-0.12678274512290955,
-0.11949104815721512
] |
null | null |
transformers
|
# CodeTrans transfer learning pre-trained model
Pretrained model on programming languages using the t5 large model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-large` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain.
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
It could be used to fine-tune other tasks in the software development domain.
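Since this checkpoint is only pre-trained, there is no generation pipeline to demonstrate. A minimal sketch for loading it as a starting point for fine-tuning or for feature extraction (the mean-pooling choice is an assumption, not part of this card):

```python
from transformers import AutoTokenizer, AutoModelWithLMHead

name = "SEBIS/code_trans_t5_large_transfer_learning_pretrain"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelWithLMHead.from_pretrained(name)

# Encoder features for a code snippet, mean-pooled over tokens (sketch).
inputs = tokenizer("select time ( col0 ) from tab0", return_tensors="pt")
encoder_out = model.get_encoder()(**inputs)
features = encoder_out.last_hidden_state.mean(dim=1)  # shape (1, d_model)
```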
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{}
|
feature-extraction
|
SEBIS/code_trans_t5_large_transfer_learning_pretrain
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us
|
# CodeTrans transfer learning pre-trained model
Pretrained model on programming languages using the t5 large model architecture. It was first released in
this repository.
## Model description
This CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain.
The model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
It could be used to fine-tune other tasks in the software development domain.
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 large model architecture. It was first released in\nthis repository.",
"## Model description\n\nThis CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n",
"# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 large model architecture. It was first released in\nthis repository.",
"## Model description\n\nThis CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
39,
38,
168
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #endpoints_compatible #text-generation-inference #region-us \n# CodeTrans transfer learning pre-trained model\nPretrained model on programming languages using the t5 large model architecture. It was first released in\nthis repository.## Model description\n\nThis CodeTrans model is based on the 't5-large' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. \n\nThe model was trained on a single TPU Pod V3-8 for 240,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training. \n\nIt could be used to fine-tune other tasks in the software development domain.\n\n\n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn"
] |
[
-0.06812790781259537,
-0.07945852726697922,
0.00043117188033647835,
0.10181204229593277,
0.16370056569576263,
0.03632139414548874,
0.12221328169107437,
-0.015543529763817787,
-0.08257314562797546,
-0.01222257036715746,
0.055361609905958176,
0.032559581100940704,
0.011749843135476112,
0.18016588687896729,
0.06025205925107002,
-0.22915241122245789,
0.029687941074371338,
0.019223982468247414,
-0.04680325463414192,
0.10394173860549927,
0.09399310499429703,
-0.0677153617143631,
0.08830861747264862,
-0.013198938220739365,
-0.18392899632453918,
0.017029346898198128,
-0.0298304446041584,
-0.06293051689863205,
0.10690025240182877,
0.06563351303339005,
0.13411618769168854,
0.0071020727045834064,
0.0778079628944397,
-0.025480983778834343,
0.011139384470880032,
0.06691388040781021,
-0.014101910404860973,
0.008144023828208447,
0.0320020392537117,
0.11461915820837021,
0.19147272408008575,
0.08059918135404587,
0.06231558322906494,
0.05213161185383797,
-0.08603829145431519,
-0.0026625297032296658,
0.039721615612506866,
0.09424149990081787,
0.07134397327899933,
0.12813018262386322,
0.013757693581283092,
0.16872429847717285,
-0.09959898889064789,
0.12527957558631897,
-0.016201207414269447,
-0.317757785320282,
-0.04709992930293083,
0.15015994012355804,
0.0748414546251297,
0.1295749843120575,
0.030948204919695854,
-0.03586738184094429,
0.03283413499593735,
0.0757024958729744,
0.1268938034772873,
-0.06248294189572334,
-0.06559748202562332,
-0.08403461426496506,
-0.11020352691411972,
-0.07471843808889389,
0.2666582465171814,
0.015920206904411316,
-0.023532893508672714,
-0.08975405246019363,
-0.10891661792993546,
-0.09985041618347168,
0.054113347083330154,
-0.04051804915070534,
0.0012082243338227272,
0.056639961898326874,
0.09699881076812744,
-0.04327520355582237,
-0.11633482575416565,
-0.09375890344381332,
-0.011209962889552116,
0.1309460699558258,
0.05184952914714813,
0.05813460424542427,
-0.08025514334440231,
0.0886128768324852,
0.03834398090839386,
-0.049644775688648224,
0.017918137833476067,
-0.05939373001456261,
-0.11692019551992416,
0.019108226522803307,
-0.039560262113809586,
-0.19344285130500793,
-0.027049804106354713,
0.040457434952259064,
0.04717124253511429,
0.013433963060379028,
0.13788877427577972,
0.03351276367902756,
0.039763305336236954,
0.12479991465806961,
-0.05447438359260559,
-0.05459246784448624,
0.05305001512169838,
-0.018254568800330162,
-0.04601525142788887,
-0.03732552379369736,
-0.14314229786396027,
-0.023671869188547134,
0.02325812727212906,
0.02614663541316986,
-0.05812501162290573,
0.10416531562805176,
-0.017966989427804947,
-0.0840616300702095,
0.16005705296993256,
-0.08313874900341034,
-0.08122484385967255,
-0.011448626406490803,
-0.0007757097482681274,
-0.019196489825844765,
0.07952994853258133,
-0.05891614034771919,
-0.12724241614341736,
-0.05691635608673096,
-0.07320798188447952,
-0.101875901222229,
-0.12295060604810715,
-0.15294179320335388,
-0.045157793909311295,
-0.13466542959213257,
0.030205948278307915,
-0.213526651263237,
-0.1342976838350296,
-0.06233392655849457,
0.06807772815227509,
0.0237649567425251,
-0.07870805263519287,
0.0014960638945922256,
-0.030045107007026672,
-0.04573468118906021,
-0.05587843060493469,
-0.02659035101532936,
-0.0394088514149189,
0.077348493039608,
-0.041225992143154144,
0.030108029022812843,
-0.10059928148984909,
0.04416443035006523,
-0.07845113426446915,
-0.013852383941411972,
-0.12508806586265564,
0.07434640824794769,
0.051577769219875336,
0.1200682669878006,
-0.05815788730978966,
-0.09440989047288895,
-0.04756271094083786,
0.0535358265042305,
-0.00041647159378044307,
0.028088513761758804,
-0.06880633533000946,
0.013852303847670555,
0.08486492931842804,
-0.14696282148361206,
-0.13635532557964325,
0.08195273578166962,
0.0047169700264930725,
0.1780560165643692,
0.07535004615783691,
0.14755523204803467,
0.15139497816562653,
-0.043427761644124985,
0.08848536014556885,
0.03285190463066101,
-0.07880287617444992,
-0.17556685209274292,
0.03592396154999733,
0.08357623219490051,
-0.06400936841964722,
0.014038640074431896,
-0.08612372726202011,
0.11191436648368835,
-0.018334010615944862,
-0.03189733624458313,
-0.006086536217480898,
-0.10163045674562454,
-0.07355780899524689,
0.020187003538012505,
0.11598300188779831,
0.029518330469727516,
-0.05543886870145798,
0.012546161189675331,
0.08973422646522522,
-0.12241104245185852,
0.015021946281194687,
-0.10633563250303268,
-0.05265400558710098,
-0.03134867176413536,
0.01252766977995634,
-0.21105410158634186,
-0.04016902297735214,
0.07376409322023392,
0.04754587262868881,
0.018311530351638794,
0.227713942527771,
0.01718110963702202,
0.0441165566444397,
0.009103084914386272,
-0.04571295157074928,
-0.10046333074569702,
-0.018122883513569832,
-0.06740538775920868,
-0.016657358035445213,
-0.09962544590234756,
-0.055027544498443604,
-0.08312201499938965,
-0.1054411232471466,
0.028388844802975655,
-0.07321159541606903,
0.009459671564400196,
0.03954363241791725,
-0.011850106529891491,
-0.00702937226742506,
0.07048618048429489,
-0.05029796063899994,
-0.07325903326272964,
0.07748579233884811,
0.10037028789520264,
-0.03585905581712723,
0.02146664261817932,
-0.15463460981845856,
0.09756146371364594,
0.0850912481546402,
-0.015645237639546394,
-0.10635016113519669,
0.06389801949262619,
-0.054670300334692,
-0.02693946473300457,
0.05338434875011444,
-0.042066365480422974,
0.21022310853004456,
-0.032529182732105255,
0.1585693508386612,
-0.10766304284334183,
0.01389438658952713,
0.0019345579203218222,
0.04325500503182411,
0.09616144001483917,
0.06814549118280411,
0.0546310730278492,
-0.08779171109199524,
0.024241674691438675,
-0.000566445232834667,
-0.01770496554672718,
0.17010906338691711,
-0.03447209298610687,
-0.056414809077978134,
0.03191884607076645,
0.016224516555666924,
0.014354133047163486,
0.044396646320819855,
-0.13625192642211914,
-0.051241595298051834,
0.026303062215447426,
0.035002388060092926,
0.05883685499429703,
-0.07898195087909698,
0.02721540816128254,
0.0281961802393198,
0.013909704983234406,
0.03112872503697872,
0.000664525490719825,
-0.09143707901239395,
0.0483538918197155,
0.0157356858253479,
-0.07133766263723373,
0.025600185617804527,
-0.001865404425188899,
-0.10433939099311829,
0.16398213803768158,
-0.029410576447844505,
-0.22263923287391663,
-0.13306039571762085,
0.16729849576950073,
-0.020682798698544502,
0.047735169529914856,
0.046118978410959244,
-0.024266613647341728,
-0.08066186308860779,
-0.08804534375667572,
0.11885307729244232,
-0.06437776237726212,
0.018802020698785782,
0.007471133954823017,
0.03729921951889992,
-0.02366720698773861,
-0.14605467021465302,
0.024444295093417168,
-0.01719502918422222,
-0.08718222379684448,
0.06435227394104004,
-0.134562686085701,
0.11285755783319473,
0.20293287932872772,
-0.04176943749189377,
0.027090195566415787,
-0.006952623836696148,
0.17381365597248077,
-0.04955620691180229,
0.06692630052566528,
0.24936063587665558,
0.013807668350636959,
-0.021351292729377747,
0.06619832664728165,
-0.027393469586968422,
-0.11277948319911957,
0.10346879065036774,
-0.04610646516084671,
-0.09073060005903244,
-0.1829238086938858,
-0.10194466263055801,
-0.10021506994962692,
-0.000456479552667588,
0.11484156548976898,
0.06299865245819092,
0.06323328614234924,
0.08030321449041367,
0.018900403752923012,
0.12654492259025574,
-0.021297074854373932,
0.05102657154202461,
0.05716070160269737,
0.01748776063323021,
0.051516540348529816,
-0.09063097834587097,
-0.015552793629467487,
0.05536213517189026,
0.027398861944675446,
0.22275422513484955,
-0.05603494867682457,
0.19181843101978302,
0.06368324160575867,
0.06281676143407822,
0.06905961036682129,
0.1780298501253128,
-0.05766728147864342,
0.02036733739078045,
-0.021470775827765465,
-0.014829353429377079,
-0.11067768186330795,
0.07198916375637054,
-0.04942579194903374,
-0.0767379105091095,
-0.05703102424740791,
0.019607912749052048,
-0.037003498524427414,
0.2559061050415039,
-0.020961660891771317,
-0.25499609112739563,
-0.09561201930046082,
-0.03637192025780678,
-0.03321077302098274,
-0.09552233666181564,
0.07201797515153885,
0.22837281227111816,
-0.04111237823963165,
-0.08523888140916824,
0.003834253177046776,
0.12126083672046661,
-0.03521215543150902,
-0.0013963829260319471,
0.05665986239910126,
0.026562906801700592,
0.057377856224775314,
0.07487042248249054,
-0.1920003592967987,
0.10965608805418015,
-0.021884860470891,
0.08939225971698761,
-0.07432679086923599,
0.014726100489497185,
-0.060607634484767914,
0.11755117028951645,
0.08064465969800949,
0.016846375539898872,
0.05008672550320625,
-0.10641637444496155,
-0.05859769508242607,
0.041204631328582764,
0.013742087408900261,
-0.05972225219011307,
0.04045001044869423,
-0.01188942976295948,
0.064398854970932,
0.007940766401588917,
-0.02609197422862053,
-0.09171275049448013,
-0.07214628159999847,
-0.004193893633782864,
0.030703604221343994,
0.05769466981291771,
-0.037075091153383255,
-0.021653348580002785,
0.05054258182644844,
0.15004928410053253,
-0.055575158447027206,
-0.08131840825080872,
-0.09985276311635971,
-0.0644773542881012,
0.09192708879709244,
-0.06183972954750061,
0.0838201567530632,
0.00008604133472545072,
-0.027258705347776413,
-0.005248819477856159,
-0.10590272396802902,
0.08588571101427078,
-0.07828245311975479,
-0.06321294605731964,
-0.005436991807073355,
-0.013540135696530342,
0.038446471095085144,
0.01708165928721428,
0.0193479061126709,
-0.07379131764173508,
-0.03548120707273483,
-0.11913694441318512,
-0.12384633719921112,
0.014904513955116272,
0.07440080493688583,
-0.049125026911497116,
-0.09224098920822144,
0.05981213599443436,
0.014067339710891247,
-0.012206575833261013,
0.09347624331712723,
-0.008903933688998222,
-0.06040284410119057,
0.018419383093714714,
0.1752079874277115,
-0.023631403222680092,
-0.24745070934295654,
-0.06424461305141449,
0.09059647470712662,
0.07596538215875626,
-0.03307736665010452,
-0.17574337124824524,
0.06965576857328415,
-0.012777630239725113,
0.03132981061935425,
-0.010147876106202602,
-0.2936939001083374,
-0.10672125965356827,
0.07703658193349838,
0.14115579426288605,
0.17156310379505157,
-0.14092446863651276,
0.026424653828144073,
0.004504672717303038,
-0.12395340204238892,
0.13980436325073242,
-0.03881178796291351,
0.12247574329376221,
-0.02590896375477314,
-0.06752897799015045,
0.009835075587034225,
-0.04742830619215965,
0.010319812223315239,
0.07080449908971786,
0.08037121593952179,
-0.06210312992334366,
0.0942189171910286,
0.15721595287322998,
-0.03652684763073921,
0.15717147290706635,
0.02559964545071125,
0.09269007295370102,
-0.17030107975006104,
-0.11160840839147568,
-0.06793146580457687,
0.02170969732105732,
0.014654279686510563,
-0.12301469594240189,
-0.03342519327998161,
0.031295038759708405,
0.042379800230264664,
-0.0037846583873033524,
0.05274881049990654,
-0.025567425414919853,
-0.05814099684357643,
0.09427489340305328,
0.08968324214220047,
-0.09587914496660233,
-0.092948317527771,
0.0033337727654725313,
0.006499086506664753,
0.14905571937561035,
-0.1773529201745987,
0.004819729831069708,
0.09617903083562851,
-0.04428746923804283,
0.06113763526082039,
0.037062693387269974,
-0.020055456086993217,
0.005564725026488304,
0.057006049901247025,
-0.09107217192649841,
-0.08733519166707993,
-0.030344679951667786,
-0.07005981355905533,
-0.02533062733709812,
0.04852613806724548,
0.11150431632995605,
-0.11381994187831879,
0.009062804281711578,
-0.03699841350317001,
-0.06655482947826385,
-0.04439592361450195,
0.13420680165290833,
0.03197764232754707,
0.025472790002822876,
-0.06625797599554062,
0.06853051483631134,
0.09024561941623688,
-0.07484471797943115,
0.011377174407243729,
0.08964937180280685,
-0.16689980030059814,
-0.08207936584949493,
0.04419418051838875,
0.024782635271549225,
-0.0384693406522274,
-0.07518623024225235,
-0.0859723836183548,
-0.07463851571083069,
0.027685759589076042,
0.04550307244062424,
0.04693653807044029,
0.07748901098966599,
-0.09907905012369156,
-0.007434487342834473,
-0.08566318452358246,
-0.016519863158464432,
0.030910270288586617,
0.04102391004562378,
-0.17933687567710876,
0.18827661871910095,
0.07582543790340424,
0.07459496706724167,
-0.05670950189232826,
-0.022585704922676086,
-0.10488678514957428,
0.033657751977443695,
0.02591746672987938,
0.01255518477410078,
-0.02738923393189907,
0.011501160450279713,
-0.030307462438941002,
-0.02748817764222622,
-0.03815946727991104,
0.08071210235357285,
-0.05059779807925224,
0.008883536793291569,
-0.022992495447397232,
0.04727437347173691,
0.0544750913977623,
-0.023470256477594376,
-0.03724135458469391,
-0.08885113894939423,
0.12025180459022522,
-0.044325727969408035,
-0.1003669947385788,
0.016238849610090256,
-0.03562550246715546,
0.06109418347477913,
0.051868051290512085,
0.08456289023160934,
-0.015383790247142315,
0.1032792180776596,
0.01207012590020895,
0.05042174085974693,
0.034659843891859055,
-0.018917271867394447,
0.03915885463356972,
-0.10250969231128693,
0.0021479276474565268,
-0.0701301246881485,
-0.02878326177597046,
-0.059389322996139526,
-0.01868256740272045,
0.09335958957672119,
0.11657647788524628,
0.12752757966518402,
-0.012423647567629814,
0.021925391629338264,
-0.13777323067188263,
-0.025710131973028183,
0.07470932602882385,
0.006175062619149685,
-0.02669699303805828,
-0.13059571385383606,
0.03659442067146301,
0.018727552145719528,
0.08957581967115402,
0.026631126180291176,
0.03401074931025505,
-0.023865198716521263,
0.05068065971136093,
0.028265049681067467,
-0.01833180896937847,
0.18896600604057312,
-0.007414734922349453,
0.045523159205913544,
-0.01741822622716427,
0.04204624146223068,
0.028680957853794098,
0.134930819272995,
0.12373877316713333,
0.09023004025220871,
-0.02442678064107895,
0.08286368101835251,
-0.015161557123064995,
-0.002594806719571352,
-0.12433455884456635,
-0.0071974946185946465,
-0.04329432174563408,
0.10356853902339935,
-0.07138728350400925,
0.014220856130123138,
0.12091588228940964,
-0.014490230940282345,
0.04190189018845558,
0.04106300696730614,
-0.09195394814014435,
-0.010354109108448029,
-0.09722821414470673,
-0.052098676562309265,
-0.10500122606754303,
-0.012472033500671387,
-0.07589845359325409,
-0.06487791240215302,
0.14620617032051086,
0.00820166990160942,
-0.0390690416097641,
0.16417266428470612,
-0.01602732390165329,
-0.061228808015584946,
0.01998882181942463,
-0.028068840503692627,
0.058579422533512115,
-0.021710414439439774,
-0.01909146085381508,
0.0023699793964624405,
0.025728337466716766,
0.049009159207344055,
0.014103156514465809,
-0.028476517647504807,
-0.0002540165151003748,
0.009368144907057285,
-0.004256861750036478,
-0.052205950021743774,
0.03546634688973427,
-0.030014798045158386,
0.1711076945066452,
0.02322126366198063,
-0.1128895953297615,
0.01367744617164135,
0.12918248772621155,
-0.013357392512261868,
-0.15706102550029755,
-0.1531982123851776,
0.07124192267656326,
0.04139166697859764,
-0.01873871125280857,
0.029428228735923767,
-0.0274803563952446,
-0.06544648855924606,
0.2612106502056122,
0.10875453054904938,
-0.05666247382760048,
-0.020515328273177147,
0.016257070004940033,
-0.00911786500364542,
-0.0595427043735981,
0.2620750069618225,
0.032078154385089874,
0.25775355100631714,
0.0007102456293068826,
0.00997459888458252,
-0.0979715958237648,
0.011004403233528137,
-0.044233597815036774,
0.002370438538491726,
0.003265331033617258,
-0.039412714540958405,
-0.04413292557001114,
0.03336259350180626,
-0.020572654902935028,
-0.04233133792877197,
0.14129549264907837,
-0.026006711646914482,
-0.04565888270735741,
-0.023303985595703125,
-0.0033864425495266914,
-0.03209860995411873,
0.09152529388666153,
-0.05459900572896004,
0.054423775523900986,
0.05742983520030975,
-0.033909667283296585,
-0.11458897590637207,
-0.04348921403288841,
0.09134045243263245,
-0.049623213708400726,
0.14930656552314758,
-0.035440005362033844,
0.08051396906375885,
0.022525912150740623,
0.03230566903948784,
-0.09046551585197449,
0.12867151200771332,
-0.0371890589594841,
0.030843675136566162,
0.052718207240104675,
-0.036938030272722244,
-0.07287907600402832,
0.06338995695114136,
-0.04425868019461632,
-0.17477096617221832,
-0.0019037339370697737,
0.02968701161444187,
0.03940919041633606,
-0.1037421002984047,
-0.026867927983403206,
-0.06473993510007858,
0.14047707617282867,
0.0840931236743927,
-0.013895775191485882,
-0.050013329833745956,
-0.0856819674372673,
0.059033870697021484,
0.007621164433658123,
0.023734180256724358,
-0.042564086616039276,
-0.16209791600704193,
-0.07283356040716171,
-0.08934062719345093,
0.007237143348902464,
-0.135257288813591,
-0.012870940379798412,
-0.10391809046268463,
-0.022030498832464218,
-0.10833482444286346,
0.07811600714921951,
0.0514385849237442,
0.02257283218204975,
-0.024944638833403587,
0.008737298659980297,
-0.02181970700621605,
0.057735033333301544,
-0.14177252352237701,
-0.1735897660255432
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the Api Recommendation Generation dataset.
## Intended uses & limitations
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
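# Wrap the fine-tuned checkpoint in a summarization pipeline; device=0 targets the first GPU (use device=-1 for CPU).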
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_api_generation"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_api_generation", skip_special_tokens=True),
device=0
)
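# The input is a natural-language description of the desired functionality; the model returns a suggested API usage sequence.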
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/api%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the API recommendation generation task, different models achieve the following results on the Java dataset (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
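The scores above are corpus-level BLEU. As a minimal sketch of how such a score is computed (assuming the `sacrebleu` package; the hypothesis and reference strings below are placeholders, not the actual evaluation data):

```python
import sacrebleu

# Placeholder prediction/reference pair; the real evaluation runs over the full test split.
hypotheses = ["XmlPullParser . next XmlPullParser . getEventType"]
references = [["XmlPullParser . next XmlPullParser . getEventType"]]  # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```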
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_small_api_generation
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the Api Recommendation Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the API recommendation generation task, different models achieve the following results on the Java dataset (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
43,
112
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08988220244646072,
0.0011700736358761787,
-0.0005916694644838572,
0.05899043008685112,
0.1306508183479309,
0.002515610074624419,
0.08523766696453094,
0.05626345053315163,
0.009586567990481853,
-0.05055145546793938,
0.08392369002103806,
0.1345910131931305,
0.032039083540439606,
0.13981787860393524,
-0.035493846982717514,
-0.21899881958961487,
0.003652178216725588,
0.05893169716000557,
-0.14082516729831696,
0.11976668238639832,
0.1355443298816681,
-0.05337446555495262,
0.10154765844345093,
-0.00574476458132267,
-0.22599144279956818,
0.06357136368751526,
-0.032047826796770096,
-0.08813145756721497,
0.13124816119670868,
0.08680086582899094,
0.11173763871192932,
0.03918364644050598,
0.003651048056781292,
-0.22112230956554413,
0.03465387225151062,
-0.03963898867368698,
0.00798027589917183,
0.0511385016143322,
0.047929324209690094,
-0.023854168131947517,
0.19435393810272217,
-0.008722137659788132,
0.004094233270734549,
0.05297274515032768,
-0.1120922639966011,
-0.0804741308093071,
-0.014952254481613636,
-0.010383839718997478,
0.08483249694108963,
0.06383771449327469,
0.017854878678917885,
0.11873018741607666,
-0.1371668130159378,
0.12510235607624054,
0.09367094933986664,
-0.16231712698936462,
-0.01706589385867119,
0.1183757334947586,
0.08906234800815582,
-0.051917724311351776,
-0.052522797137498856,
0.004948686342686415,
0.07585868239402771,
0.023956263437867165,
0.028646130114793777,
-0.1363506019115448,
-0.20030298829078674,
0.07398135960102081,
-0.05543311685323715,
-0.055355966091156006,
0.2960522770881653,
-0.003404274582862854,
-0.03358874097466469,
-0.03888287767767906,
-0.030471699312329292,
0.04032648354768753,
0.011538791470229626,
-0.01169701386243105,
0.008602617308497429,
-0.005970461759716272,
-0.0008233806001953781,
-0.019247429445385933,
-0.10768236964941025,
-0.12401290237903595,
-0.003489702707156539,
0.07162032276391983,
-0.0038181354757398367,
0.027238506823778152,
-0.16689111292362213,
0.09237004816532135,
0.06295876204967499,
-0.09003796428442001,
0.02909884601831436,
-0.07127001881599426,
-0.023978563025593758,
-0.010045590810477734,
-0.04352947697043419,
-0.1783963292837143,
0.09366919845342636,
0.024683896452188492,
-0.06293532252311707,
0.04624905809760094,
0.01036843005567789,
0.08267419785261154,
0.06778648495674133,
0.17335742712020874,
-0.006172779016196728,
-0.07085349410772324,
0.03781725838780403,
-0.030799640342593193,
-0.05727875977754593,
0.008080757223069668,
-0.07241763174533844,
-0.03846518695354462,
0.017628883942961693,
0.12061984837055206,
-0.10832293331623077,
0.07351330667734146,
-0.06310901045799255,
-0.03801019489765167,
0.015371949411928654,
-0.13754089176654816,
-0.03307241201400757,
-0.0005734566366299987,
-0.06501809507608414,
-0.042082641273736954,
0.10656028240919113,
-0.05201490968465805,
-0.11328966170549393,
-0.033891648054122925,
-0.0763145461678505,
-0.0019025900401175022,
-0.09781812876462936,
-0.07802879065275192,
0.013595929369330406,
0.032407328486442566,
0.0674436166882515,
-0.12299999594688416,
-0.18766485154628754,
0.0029779518954455853,
0.07971084117889404,
-0.009050820954144001,
0.048650551587343216,
-0.09571053087711334,
-0.014416955411434174,
-0.04566057771444321,
-0.02493700198829174,
0.04697229713201523,
-0.0691295862197876,
0.076988585293293,
0.0912989005446434,
0.055363066494464874,
-0.07416970282793045,
0.05298998951911926,
-0.13486631214618683,
0.06589051336050034,
-0.17332588136196136,
0.09350299835205078,
-0.0423872210085392,
0.11809054017066956,
-0.0999697893857956,
-0.05775216966867447,
0.04809259995818138,
0.06353237479925156,
0.0544382780790329,
0.11810354143381119,
-0.15005548298358917,
-0.03506895899772644,
0.1340782791376114,
-0.12139983475208282,
-0.21874426305294037,
0.06680528074502945,
-0.06771309673786163,
0.20661048591136932,
0.048259980976581573,
0.20647209882736206,
0.14005877077579498,
-0.021974284201860428,
0.07561580836772919,
0.08897511661052704,
-0.04537069424986839,
-0.07797744125127792,
0.06760623306035995,
0.06577382981777191,
-0.13850486278533936,
0.06226293742656708,
-0.020350849255919456,
0.10892961919307709,
-0.04030809924006462,
-0.042194515466690063,
-0.005163861438632011,
-0.06631151586771011,
0.027598446235060692,
-0.011642970144748688,
0.08378174901008606,
-0.005563246086239815,
0.023399831727147102,
0.062048859894275665,
0.10612904280424118,
-0.12023647129535675,
-0.0063996268436312675,
-0.09589648246765137,
0.03322671353816986,
-0.1138787642121315,
0.0316421315073967,
-0.20985278487205505,
0.01042892411351204,
0.017232825979590416,
0.013946432620286942,
0.035599738359451294,
0.05838555470108986,
-0.0046605938114225864,
0.01833873800933361,
0.01261906698346138,
-0.0030659872572869062,
0.013366339728236198,
-0.009624860249459743,
-0.021903768181800842,
-0.09792579710483551,
-0.04479599371552467,
-0.0554245300590992,
-0.02976255863904953,
-0.17547526955604553,
-0.00862764474004507,
0.021648604422807693,
0.06666557490825653,
0.027495773509144783,
0.04464271292090416,
0.05138328671455383,
0.0708744004368782,
-0.045034464448690414,
-0.01961781457066536,
0.06347786635160446,
0.02339843474328518,
-0.10447156429290771,
0.08036870509386063,
-0.051065847277641296,
0.04928697273135185,
0.09023527055978775,
-0.1629306972026825,
-0.044646527618169785,
-0.054830100387334824,
-0.04070757329463959,
-0.03686822950839996,
0.0017091723857447505,
-0.022382348775863647,
0.19204093515872955,
-0.003622691845521331,
0.1718650609254837,
-0.12088833004236221,
-0.03901643678545952,
-0.03273335471749306,
-0.028472842648625374,
0.02933143824338913,
0.13339662551879883,
0.07930441200733185,
-0.23419880867004395,
0.057449933141469955,
0.07695648074150085,
-0.011981246992945671,
0.22084607183933258,
-0.04155745729804039,
-0.02490404061973095,
-0.027774875983595848,
0.07246080785989761,
-0.043176744133234024,
0.14724025130271912,
-0.21634073555469513,
-0.028682934120297432,
0.02062462642788887,
-0.0023216218687593937,
0.1161193698644638,
-0.12181691080331802,
-0.0029110510367900133,
0.020291658118367195,
-0.038947973400354385,
-0.10302789509296417,
0.043862540274858475,
0.006902191787958145,
0.031534381210803986,
-0.009307844564318657,
-0.020717762410640717,
0.033244602382183075,
-0.03723016381263733,
-0.11484067142009735,
0.22788545489311218,
-0.08601211756467819,
-0.2627450227737427,
-0.2007770985364914,
0.0725664421916008,
-0.024979613721370697,
-0.0038654087111353874,
0.06609509140253067,
-0.04399188980460167,
-0.053622011095285416,
-0.04148766025900841,
0.11764882504940033,
-0.01823277585208416,
-0.04601171612739563,
-0.0068123782984912395,
0.07681731134653091,
-0.00985840056091547,
-0.1948174238204956,
-0.01845640502870083,
0.01508887019008398,
0.0734006017446518,
0.014961613342165947,
-0.1438610702753067,
0.11306153237819672,
0.07972381263971329,
-0.05730101838707924,
0.045192357152700424,
-0.029691923409700394,
0.20673920214176178,
-0.05622434243559837,
-0.06149457395076752,
0.16587112843990326,
-0.09742705523967743,
0.004476348403841257,
0.02629779651761055,
0.002926769433543086,
-0.11107775568962097,
0.04144812747836113,
-0.042793046683073044,
-0.059498678892850876,
-0.2564241290092468,
-0.08758801966905594,
-0.08535675704479218,
0.10409896075725555,
0.017392896115779877,
0.027256008237600327,
-0.07204180955886841,
0.05894485116004944,
0.08984824270009995,
0.1400626301765442,
-0.003977205604314804,
0.06246063485741615,
0.051382195204496384,
-0.002249479992315173,
-0.005410353187471628,
-0.1111409142613411,
-0.05401334911584854,
0.03520138934254646,
0.09211356937885284,
0.20483897626399994,
0.003053132677450776,
0.15499401092529297,
0.08056806027889252,
0.031697649508714676,
0.037230536341667175,
0.18254926800727844,
-0.1213223785161972,
0.025860963389277458,
-0.025737104937434196,
-0.04536980763077736,
-0.14329972863197327,
0.02518175169825554,
-0.06510483473539352,
0.049405500292778015,
-0.13586650788784027,
-0.04694370925426483,
0.07781799882650375,
0.08304891735315323,
-0.018380483612418175,
-0.24728569388389587,
-0.10838811844587326,
0.03534451499581337,
-0.07308349013328552,
-0.06392835825681686,
0.05326922982931137,
0.1690331995487213,
-0.12794910371303558,
-0.012324780225753784,
-0.03803946077823639,
0.16267995536327362,
-0.07427699863910675,
0.03671019896864891,
-0.053080152720212936,
-0.032441165298223495,
0.0173173900693655,
0.16845768690109253,
-0.2036825716495514,
0.24104878306388855,
0.0008919218671508133,
0.017730826511979103,
-0.061873339116573334,
0.03128684684634209,
0.0038835995364934206,
0.09420344233512878,
0.11658381670713425,
-0.012938634492456913,
-0.04281483218073845,
-0.14855900406837463,
0.04441894218325615,
0.09592119604349136,
0.056150346994400024,
-0.021073482930660248,
0.04965851455926895,
-0.0214788056910038,
0.031354472041130066,
-0.018751030787825584,
-0.06616666167974472,
-0.10020891577005386,
-0.09267476201057434,
-0.0004967356217093766,
-0.033205654472112656,
0.0666838064789772,
-0.024917716160416603,
0.024112574756145477,
0.09423704445362091,
0.17187823355197906,
-0.0624428316950798,
-0.06778028607368469,
-0.09843280166387558,
0.026975272223353386,
0.12988562881946564,
-0.08251581341028214,
-0.009367221966385841,
0.002617024816572666,
0.02918660268187523,
0.002559931017458439,
-0.12893110513687134,
0.04852449521422386,
-0.0667879655957222,
0.002700514392927289,
-0.02691095881164074,
0.0885205864906311,
-0.0197609756141901,
-0.022972820326685905,
0.06565316766500473,
-0.07976388186216354,
-0.053998738527297974,
-0.14578494429588318,
-0.11172023415565491,
-0.0412495881319046,
0.062333300709724426,
0.04495782032608986,
-0.1416628360748291,
0.01874513179063797,
-0.007897837087512016,
-0.03840301185846329,
0.20995067059993744,
0.09366059303283691,
-0.01834128424525261,
0.022389624267816544,
0.18313026428222656,
-0.11048799008131027,
-0.2261517345905304,
-0.010685722343623638,
-0.0372881218791008,
0.07377961277961731,
0.013861242681741714,
-0.13055327534675598,
0.0902090072631836,
-0.01533817034214735,
0.04391138628125191,
-0.016050592064857483,
-0.27719244360923767,
-0.09532369673252106,
0.11659956723451614,
0.1268284171819687,
0.09162237495183945,
-0.11435838788747787,
-0.06806322932243347,
-0.08779069036245346,
-0.23685935139656067,
0.16179205477237701,
-0.0934215784072876,
0.09244190901517868,
-0.017621608451008797,
0.05801767110824585,
0.02675817906856537,
-0.052986420691013336,
0.11753298342227936,
0.028388574719429016,
0.10725416243076324,
-0.032378632575273514,
-0.11228156834840775,
0.08992365002632141,
-0.03348778933286667,
0.16141070425510406,
-0.10829054564237595,
0.08636639267206192,
-0.21948057413101196,
-0.03900923579931259,
-0.044250987470149994,
0.03997242450714111,
0.00012633312144316733,
-0.07901515066623688,
-0.04573111981153488,
0.024686012417078018,
0.034621257334947586,
0.014120318926870823,
0.11726387590169907,
-0.040537238121032715,
0.0018987843068316579,
0.1196560189127922,
0.13852165639400482,
-0.05593486502766609,
-0.018360447138547897,
0.03157425299286842,
0.029511328786611557,
0.10870206356048584,
-0.2198941707611084,
0.08119071274995804,
0.11591093242168427,
0.008926580660045147,
0.13063952326774597,
0.0760807916522026,
-0.037028320133686066,
0.028404254466295242,
0.09100260585546494,
-0.13696987926959991,
-0.044500820338726044,
-0.06795801967382431,
-0.03864699602127075,
0.024726448580622673,
0.07894830405712128,
0.1338089257478714,
-0.06801112741231918,
-0.008680530823767185,
-0.005969290155917406,
-0.017782030627131462,
-0.13336512446403503,
0.12633411586284637,
0.04028049483895302,
0.07612276077270508,
-0.08342841267585754,
0.09075327962636948,
0.06085534393787384,
-0.1472235769033432,
-0.02864900976419449,
0.13369402289390564,
-0.13342724740505219,
-0.08411666750907898,
-0.011761167086660862,
0.30839139223098755,
-0.10372736304998398,
-0.10468919575214386,
-0.13637030124664307,
-0.05912001058459282,
-0.005344763398170471,
0.2204441875219345,
0.09662609547376633,
0.10087325423955917,
-0.04939962550997734,
-0.0171208418905735,
-0.0989561527967453,
0.057827189564704895,
0.09446126967668533,
0.011949972249567509,
-0.09254995733499527,
0.08789956569671631,
-0.005276386626064777,
0.157162144780159,
-0.05239340290427208,
-0.03521499037742615,
-0.17254100739955902,
0.07116623967885971,
-0.12618517875671387,
0.06644972413778305,
-0.07561706751585007,
0.026630651205778122,
0.015143880620598793,
0.005463434848934412,
-0.026176314800977707,
0.053015775978565216,
-0.07822185009717941,
0.01756472885608673,
-0.0014960664557293057,
0.0810580924153328,
-0.08505145460367203,
-0.017186280339956284,
0.08159688115119934,
-0.052818428725004196,
0.09426230192184448,
0.01433649193495512,
-0.06947510689496994,
0.10272706300020218,
-0.18468765914440155,
-0.02244098298251629,
0.05156077817082405,
0.009450017474591732,
0.06727570295333862,
-0.05337309092283249,
0.0320371575653553,
0.03282379359006882,
0.03358086198568344,
-0.01762120984494686,
0.11356164515018463,
-0.13208653032779694,
-0.09194155782461166,
-0.030354848131537437,
-0.12243762612342834,
-0.04263189435005188,
0.03260822221636772,
0.048414770513772964,
0.08995033800601959,
0.07838329672813416,
-0.023279324173927307,
0.03567604720592499,
-0.06429809331893921,
-0.017240524291992188,
0.051821865141391754,
-0.08660681545734406,
-0.04302877560257912,
-0.09935514628887177,
0.014219089411199093,
-0.07087372988462448,
0.1903209239244461,
0.018765917047858238,
0.13865819573402405,
-0.016897859051823616,
-0.05085611715912819,
0.0006721297395415604,
0.049318645149469376,
0.21422246098518372,
-0.04361223056912422,
0.04702923074364662,
-0.07533659040927887,
0.06970828771591187,
0.01458415761590004,
0.04787653312087059,
0.09806740283966064,
0.11965150386095047,
-0.01701836660504341,
0.10562299191951752,
0.037623081356287,
0.03721412643790245,
-0.04915336146950722,
-0.0952320322394371,
0.10092632472515106,
0.054734379053115845,
-0.0388081818819046,
0.09442915767431259,
0.1154908686876297,
-0.10611135512590408,
0.09990248829126358,
0.003710903460159898,
-0.10531626641750336,
-0.03984867408871651,
-0.017273206263780594,
-0.047289036214351654,
-0.12293103337287903,
-0.003615979105234146,
-0.13133299350738525,
-0.042646925896406174,
0.06051303446292877,
0.024177702143788338,
-0.0630943700671196,
0.1888929307460785,
-0.0059499978087842464,
-0.07677318155765533,
0.054108500480651855,
-0.003711400553584099,
0.0184246227145195,
-0.022401737049221992,
0.07840927690267563,
-0.007503150962293148,
-0.0052742427214980125,
-0.013551375828683376,
0.049591027200222015,
-0.03501656651496887,
-0.00897204503417015,
-0.07237112522125244,
-0.029233397915959358,
-0.04313994199037552,
0.033677954226732254,
-0.006675780285149813,
0.018627643585205078,
0.023809658363461494,
-0.04091648757457733,
0.0032539258245378733,
0.23888877034187317,
-0.037934187799692154,
-0.08973363041877747,
-0.13817402720451355,
0.18365831673145294,
0.053223203867673874,
0.05742397531867027,
0.01709499955177307,
-0.05927034467458725,
-0.04214084520936012,
0.26330694556236267,
0.1998661458492279,
-0.04872638359665871,
-0.0012914909748360515,
0.002072732662782073,
0.018252229318022728,
-0.0012625836534425616,
0.1328662931919098,
0.03692758455872536,
0.21824029088020325,
-0.019745884463191032,
-0.040373545140028,
-0.05448611453175545,
-0.04440753906965256,
0.03256542980670929,
0.12083006650209427,
0.04014845937490463,
-0.040484581142663956,
-0.04003285616636276,
0.10020245611667633,
-0.15116344392299652,
-0.11100228875875473,
0.02678598463535309,
-0.14024856686592102,
-0.07818781584501266,
-0.07165390253067017,
0.06490820646286011,
-0.033813782036304474,
0.05003548040986061,
-0.043899212032556534,
-0.006536742206662893,
0.0533490888774395,
0.046224456280469894,
-0.13900801539421082,
-0.09470134973526001,
0.05062674358487129,
-0.041878823190927505,
0.11257002502679825,
-0.024257654324173927,
0.10063938796520233,
0.08670765161514282,
0.029194241389632225,
-0.04823720082640648,
0.05319472402334213,
0.05799384042620659,
0.050267335027456284,
0.05831695720553398,
0.06149984151124954,
-0.026680735871195793,
0.12803401052951813,
-0.04794194549322128,
-0.13390997052192688,
0.02498825266957283,
0.00790359079837799,
0.00041334732668474317,
-0.1060834676027298,
-0.028698984533548355,
-0.08879996836185455,
0.09309741854667664,
0.1612338125705719,
-0.040216393768787384,
0.02167477086186409,
-0.07846252620220184,
0.14863023161888123,
0.0011630624067038298,
-0.0260419063270092,
-0.07766015082597733,
-0.12618815898895264,
-0.024407466873526573,
0.030091045424342155,
-0.01566121354699135,
-0.2278585433959961,
-0.0020173247903585434,
-0.052941303700208664,
-0.006896855775266886,
-0.03724443539977074,
0.11083689332008362,
0.12927688658237457,
0.04779813438653946,
-0.027055734768509865,
-0.18318159878253937,
-0.012260167859494686,
0.059595737606287,
-0.10485997051000595,
-0.15216943621635437
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
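# Same pipeline setup as the single-task model, pointing at the multi-task checkpoint instead.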
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_api_generation_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_api_generation_multitask", skip_special_tokens=True),
device=0
)
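# As above, the query is a plain-language description and the output is the recommended API call sequence.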
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/api%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
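A minimal sketch of that optimizer setup (assuming the `transformers` implementation of Adafactor; the flags below are illustrative defaults, not the verified training configuration):

```python
from transformers import Adafactor, AutoModelWithLMHead
from transformers.optimization import AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_api_generation_multitask")

# lr=None with relative_step=True enables Adafactor's built-in inverse square root
# learning rate schedule; warmup_init adds an initial warmup phase.
optimizer = Adafactor(model.parameters(), lr=None, scale_parameter=True,
                      relative_step=True, warmup_init=True)
lr_scheduler = AdafactorSchedule(optimizer)
```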
## Evaluation results
For the API recommendation generation task, different models achieve the following results on the Java dataset (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_small_api_generation_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate API usage recommendations for Java programming tasks.
### How to use
Here is how to use this model to generate Java API recommendations using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the API recommendation generation task, different models achieve the following results on the Java dataset (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1250525861978531,
-0.024141928181052208,
-0.00048119851271621883,
0.13146448135375977,
0.10523828864097595,
0.024544091895222664,
0.05932262912392616,
0.06676238775253296,
-0.025804130360484123,
0.01876448467373848,
0.04442939534783363,
0.010615610517561436,
0.034068573266267776,
0.19697408378124237,
0.007609574124217033,
-0.11561018973588943,
-0.014207489788532257,
0.04446591064333916,
-0.03505248203873634,
0.1281358003616333,
0.09442378580570221,
-0.07456395775079727,
0.05387280881404877,
-0.07045678049325943,
-0.24531003832817078,
0.06087196618318558,
-0.00549357570707798,
-0.06319572776556015,
0.09794674068689346,
0.045887939631938934,
0.12515926361083984,
-0.004427643492817879,
0.021231601014733315,
-0.14554324746131897,
0.010913929902017117,
0.011195719242095947,
0.033363036811351776,
0.016678476706147194,
0.04479771852493286,
0.051789771765470505,
0.13549968600273132,
0.011528140865266323,
0.042561475187540054,
0.06191934272646904,
-0.07564254850149155,
-0.11707017570734024,
-0.007378078531473875,
0.02450980618596077,
0.05118471756577492,
0.10028988122940063,
-0.012456241063773632,
0.11990103870630264,
-0.15123850107192993,
0.12780648469924927,
0.10264765471220016,
-0.2186698168516159,
-0.012402595020830631,
0.1282239556312561,
0.09147883206605911,
0.09810025244951248,
-0.06084540858864784,
-0.06643130630254745,
0.10361729562282562,
0.05249921977519989,
0.046145420521497726,
-0.10134720802307129,
-0.11527016758918762,
0.023104878142476082,
-0.07419119775295258,
-0.06436161696910858,
0.22134725749492645,
0.0007479047635570168,
-0.07692229747772217,
-0.054648179560899734,
-0.02390178292989731,
-0.13472428917884827,
0.036756936460733414,
0.027434315532445908,
0.009028307162225246,
-0.03275957331061363,
0.01640390418469906,
0.02841111645102501,
-0.07287946343421936,
-0.15672163665294647,
0.02843545749783516,
0.09397771954536438,
0.05509132519364357,
0.025633133947849274,
-0.09819329530000687,
0.10536498576402664,
0.03569447994232178,
-0.05969412624835968,
-0.028038449585437775,
-0.017819615080952644,
-0.10281285643577576,
... (768-dimensional embedding vector omitted) ...
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the api recommendation generation task for the java apis.
## Intended uses & limitations
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate java api recommendations using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_api_generation_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_api_generation_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
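Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases. A minimal alternative sketch, assuming a recent `transformers` version that provides `AutoModelForSeq2SeqLM` (the decoding settings below are illustrative, not those used for the reported results):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_small_api_generation_multitask_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The model expects a natural-language query describing the desired functionality.
query = "parse the uses licence node of this package , if any , and returns the license definition if theres"
inputs = tokenizer(query, return_tensors="pt")

# Greedy decoding; max_length is an illustrative cap on the generated api sequence.
output_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```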
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/api%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
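As an illustration of that setup, here is a minimal sketch using the `Adafactor` implementation shipped with `transformers`; the exact CodeTrans hyperparameters are not published here, so all values below are assumptions:
```python
from transformers import AutoModelForSeq2SeqLM
from transformers.optimization import Adafactor, AdafactorSchedule

# Illustrative base checkpoint, not the CodeTrans training code itself.
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

# relative_step=True makes Adafactor derive its own inverse-square-root style
# learning rate, matching the schedule described above.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # exposes the current lr for logging
```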
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing api recommendation generation data.
## Evaluation results
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
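To score generated api sequences against references with BLEU, here is a minimal sketch assuming the `sacrebleu` package (the exact BLEU variant and tokenization used for the table above may differ):
```python
import sacrebleu

# Hypothetical model outputs and references; a real evaluation would use the test split.
hypotheses = ["XMLStreamReader . next XMLStreamReader . getEventType"]
references = [["XMLStreamReader . next XMLStreamReader . getEventType"]]

# corpus_bleu takes a list of hypotheses and a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```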
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_small_api_generation_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the api recommendation generation task for the java apis.
Intended uses & limitations
---------------------------
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate java api recommendations using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
... (768-dimensional embedding vector omitted) ...
] |
null | null |
transformers
|
# CodeTrans model for api recommendation generation
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans).
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the api recommendation generation task for the java apis.
## Intended uses & limitations
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate java api recommendations using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_api_generation_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_api_generation_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "parse the uses licence node of this package , if any , and returns the license definition if theres"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/api%20generation/small_model.ipynb).
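For queries with several plausible api sequences, beam search can surface alternatives that greedy decoding misses. A minimal sketch, with all decoding parameters illustrative rather than the settings used for the reported scores:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_small_api_generation_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

query = "parse the uses licence node of this package , if any , and returns the license definition if theres"
inputs = tokenizer(query, return_tensors="pt")

# Beam search with 4 beams, returning the top 2 candidates.
output_ids = model.generate(
    **inputs,
    num_beams=4,
    num_return_sequences=2,
    max_length=64,
    early_stopping=True,
)
for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```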
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing api recommendation generation data.
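The original fine-tuning ran on TPUs with the T5 codebase. As a rough illustration only, a comparable loop on GPU/CPU could be sketched with `Seq2SeqTrainer`; all names, labels, and hyperparameters below are assumptions, not the published configuration:
```python
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "t5-small"  # illustrative base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tiny hypothetical dataset standing in for the api recommendation data.
raw = Dataset.from_dict({
    "query": ["parse the uses licence node of this package"],
    "api_sequence": ["XMLStreamReader . next"],  # hypothetical target
})

def tokenize(batch):
    model_inputs = tokenizer(batch["query"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["api_sequence"], truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_dataset = raw.map(tokenize, batched=True, remove_columns=["query", "api_sequence"])

args = Seq2SeqTrainingArguments(
    output_dir="api_generation_finetune",
    per_device_train_batch_size=8,  # the card used batch size 256 on TPU
    max_steps=100,                  # the card used 1,150,000 steps
    learning_rate=1e-4,             # illustrative
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```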
## Evaluation results
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 68.71 |
| CodeTrans-ST-Base | 70.45 |
| CodeTrans-TF-Small | 68.90 |
| CodeTrans-TF-Base | 72.11 |
| CodeTrans-TF-Large | 73.26 |
| CodeTrans-MT-Small | 58.43 |
| CodeTrans-MT-Base | 67.97 |
| CodeTrans-MT-Large | 72.29 |
| CodeTrans-MT-TF-Small | 69.29 |
| CodeTrans-MT-TF-Base | 72.89 |
| CodeTrans-MT-TF-Large | **73.39** |
| State of the art | 54.42 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "parse the uses licence node of this package , if any , and returns the license definition if theres"}]}
|
summarization
|
SEBIS/code_trans_t5_small_api_generation_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for api recommendation generation
=================================================
Pretrained model for api recommendation generation using the t5 small model architecture. It was first released in
this repository.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the api recommendation generation task for the java apis.
Intended uses & limitations
---------------------------
The model could be used to generate api usage recommendations for java programming tasks.
### How to use
Here is how to use this model to generate java api recommendations using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing api recommendation generation data.
Evaluation results
------------------
For the api recommendation generation task, different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 1,400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 1,150,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing api recommendation generation data.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
... (768-dimensional embedding vector omitted) ...
0.14608106017112732,
-0.13246774673461914,
0.09541764110326767,
0.027362290769815445,
-0.10372567176818848,
-0.049826450645923615,
-0.0671912357211113,
-0.04477490112185478,
-0.1028134822845459,
0.025147924199700356,
-0.11150285601615906,
0.02616136707365513,
0.07103823125362396,
0.047784224152565,
-0.022531159222126007,
0.13220787048339844,
-0.016383348032832146,
-0.05858410894870758,
0.018794048577547073,
0.02162669412791729,
0.04245137795805931,
0.08630295097827911,
0.011291131377220154,
0.07533793896436691,
-0.06645901501178741,
0.06931869685649872,
0.01906472258269787,
0.016731033101677895,
0.009055149741470814,
0.0033264257945120335,
-0.004918148275464773,
-0.049583569169044495,
-0.002405837643891573,
0.08031560480594635,
0.1840318888425827,
0.045193132013082504,
-0.043755531311035156,
-0.04527457430958748,
0.17250581085681915,
-0.043562889099121094,
-0.06911970674991608,
-0.11520987749099731,
0.149635449051857,
0.05850192531943321,
0.023179179057478905,
0.011363323777914047,
-0.07345814257860184,
-0.05546868219971657,
0.2359207272529602,
0.00314884539693594,
-0.024123327806591988,
-0.04824208468198776,
-0.019031215459108353,
-0.009473814629018307,
-0.043105803430080414,
0.14834502339363098,
0.025934580713510513,
0.2070062905550003,
0.0010002808412536979,
-0.012724673375487328,
-0.04318699240684509,
-0.028075039386749268,
-0.02297045662999153,
0.1825079470872879,
-0.03919849172234535,
0.030089594423770905,
-0.09984923154115677,
-0.01738397218286991,
0.028759272769093513,
-0.10428875684738159,
0.10720571130514145,
-0.09217580407857895,
-0.07436487823724747,
0.018616221845149994,
0.07423262298107147,
-0.019118132069706917,
0.031005140393972397,
-0.010726910084486008,
0.05273718759417534,
0.04345043748617172,
-0.02844989486038685,
-0.10085774958133698,
-0.1273372918367386,
0.05025399103760719,
-0.019480952993035316,
0.16295884549617767,
0.020360108464956284,
0.09732551872730255,
0.0834309309720993,
0.013544744811952114,
-0.07495319843292236,
0.11322280019521713,
0.02793327532708645,
0.02643348090350628,
0.07386847585439682,
0.1297593116760254,
-0.030763093382120132,
0.13323302567005157,
-0.007418489083647728,
-0.039948709309101105,
-0.03742442652583122,
-0.019587542861700058,
-0.012086877599358559,
-0.14154790341854095,
0.006483667995780706,
-0.05211159959435463,
0.12719663977622986,
0.1764812171459198,
-0.04671325907111168,
-0.027797376736998558,
-0.04230935126543045,
0.07537628710269928,
-0.011556181125342846,
0.08463648706674576,
0.0035995151847600937,
-0.16173237562179565,
0.008067410439252853,
-0.021472809836268425,
0.009722311049699783,
-0.1868862360715866,
-0.03769901022315025,
-0.046711914241313934,
-0.04285123199224472,
-0.08039472997188568,
0.14032433927059174,
0.07472483813762665,
0.03601575642824173,
-0.04247146099805832,
-0.20155665278434753,
-0.02384955994784832,
0.053309522569179535,
-0.1353737711906433,
-0.12452903389930725
] |
null | null |
transformers
|
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on Code Comment Generation dataset.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java", skip_special_tokens=True),
    device=0  # first GPU; set device=-1 to run on CPU
)

# a whitespace-tokenized java method, the input format the model works best with
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
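The pipeline returns one dict per input; assuming the standard summarization output format with a `summary_text` key, the generated comment can be read like this:
```python
result = pipeline([tokenized_code])
print(result[0]["summary_text"])  # the generated description of the java method
```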
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/code%20comment%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_comment_generation_java
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on Code Comment Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
43,
112
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ …(768-dimensional embedding vector omitted)… ] |
null | null |
transformers
|
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_multitask", skip_special_tokens=True),
    device=0  # first GPU; set device=-1 to run on CPU
)

# a whitespace-tokenized java method, the input format the model works best with
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/code%20comment%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
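For illustration, here is a minimal sketch of configuring AdaFactor with its built-in inverse square root schedule via the `transformers` implementation; beyond the details stated above, the concrete values below are assumptions, not the published CodeTrans settings:
```python
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("t5-small")
# lr=None together with relative_step=True activates AdaFactor's internal
# inverse square root learning rate schedule
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    relative_step=True,
    warmup_init=True,     # assumption: warm-up behaviour is not documented for CodeTrans
    scale_parameter=True,
)
```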
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_comment_generation_java_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ …(768-dimensional embedding vector omitted)… ] |
null | null |
transformers
|
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code comment generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_multitask_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_multitask_finetune", skip_special_tokens=True),
    device=0  # first GPU; set device=-1 to run on CPU
)

# a whitespace-tokenized java method, the input format the model works best with
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/code%20comment%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
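For reference, a T5-style inverse square root schedule keeps the learning rate flat during warmup and then decays it with the square root of the step count. A minimal sketch, assuming the standard T5 formulation and an illustrative 10,000-step warmup (the warmup actually used for CodeTrans is not stated here):
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # lr = 1 / sqrt(max(step, warmup_steps)): constant at 1/sqrt(warmup_steps)
    # until warmup ends, then decaying proportionally to 1/sqrt(step)
    return 1.0 / math.sqrt(max(step, warmup_steps))
```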
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_comment_generation_java_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code comment generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
109
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 260,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09880819916725159,
0.07157349586486816,
-0.0013642713893204927,
0.11372318118810654,
0.04783757030963898,
0.020758340135216713,
0.005771959666162729,
0.10993558913469315,
-0.009156361222267151,
0.06303045898675919,
0.06103946268558502,
-0.058976951986551285,
0.054798975586891174,
0.1845139116048813,
0.026393987238407135,
-0.17369471490383148,
-0.029799580574035645,
0.024309009313583374,
-0.042106177657842636,
0.1093650534749031,
0.0827639177441597,
-0.06849294900894165,
0.07174206525087357,
-0.04616755619645119,
-0.12448713183403015,
0.05282515659928322,
-0.029713643714785576,
-0.030539091676473618,
0.08688267320394516,
0.0517239086329937,
0.1031467467546463,
-0.026651432737708092,
0.05758503079414368,
-0.18871872127056122,
0.0013893696013838053,
0.027200207114219666,
0.05875996872782707,
0.028395945206284523,
0.05034855753183365,
0.07537249475717545,
0.14673416316509247,
-0.010867852717638016,
0.043561529368162155,
0.06416447460651398,
-0.06535718590021133,
-0.09342869371175766,
-0.046138945966959,
0.04939843341708183,
0.08716809004545212,
0.0970236137509346,
-0.012784437276422977,
-0.01447028387337923,
-0.0742022916674614,
0.08384151756763458,
0.1139933168888092,
-0.21527421474456787,
-0.022576868534088135,
0.1354551464319229,
0.08418124914169312,
0.06447809934616089,
-0.08331052213907242,
-0.026482146233320236,
0.10661006718873978,
0.037762776017189026,
0.07339408248662949,
-0.09545567631721497,
-0.05245516076683998,
-0.006215323694050312,
-0.04281090199947357,
-0.05342939868569374,
0.14547277987003326,
0.035352546721696854,
-0.04808661714196205,
-0.11288929730653763,
-0.05759089067578316,
-0.1967075765132904,
0.04721920192241669,
0.02241664007306099,
0.011899731121957302,
0.0015592221170663834,
0.023564467206597328,
-0.016302961856126785,
-0.08117926865816116,
-0.1213533952832222,
0.029280541464686394,
0.0034994292072951794,
0.06001264229416847,
0.03665651008486748,
-0.016175469383597374,
0.0788169577717781,
-0.02504917047917843,
-0.05895492434501648,
-0.0313698835670948,
0.012105686590075493,
-0.12343340367078781,
0.03522920981049538,
0.004007277544587851,
-0.047663670033216476,
0.0009223836241289973,
0.07008662074804306,
-0.1000523716211319,
0.08831964433193207,
0.09322671592235565,
0.013425412587821484,
0.0008354202727787197,
0.2139805555343628,
0.04733000695705414,
-0.1692742556333542,
0.025863032788038254,
0.017988307401537895,
0.007306399755179882,
0.003612531116232276,
-0.05519617721438408,
-0.058901507407426834,
0.007057958748191595,
0.07020418345928192,
-0.1331617832183838,
0.011609605513513088,
-0.04864349961280823,
-0.006630829069763422,
0.08752501010894775,
-0.1172640398144722,
0.034398049116134644,
0.008760744705796242,
-0.05889519304037094,
-0.04861081764101982,
0.07509670406579971,
-0.12884458899497986,
-0.11440001428127289,
0.031882718205451965,
-0.04779799282550812,
-0.0324677973985672,
-0.12154266238212585,
-0.11318432539701462,
-0.00505645340308547,
-0.06232477352023125,
0.0005113061633892357,
-0.0936003103852272,
-0.1091664731502533,
-0.019786665216088295,
0.03195246309041977,
-0.014517687261104584,
-0.025056282058358192,
-0.049121081829071045,
0.015811117365956306,
-0.002111595356836915,
-0.030693363398313522,
0.027916284278035164,
-0.03989016264677048,
0.09557599574327469,
0.07935342192649841,
0.057164888828992844,
0.0023022834211587906,
0.028168339282274246,
-0.09409121423959732,
0.08294905722141266,
-0.10065954178571701,
0.06288493424654007,
-0.021840108558535576,
0.06413870304822922,
-0.09455395489931107,
-0.08962702751159668,
0.013745812699198723,
0.04719637334346771,
0.06074976548552513,
0.023370269685983658,
-0.12271164357662201,
0.021617617458105087,
0.15244810283184052,
-0.1147657185792923,
-0.12723468244075775,
0.11460432410240173,
0.0018142398912459612,
0.025975434109568596,
0.0653938502073288,
0.14850583672523499,
0.14821581542491913,
-0.08534502983093262,
-0.01835406944155693,
0.07462698966264725,
0.05815591290593147,
-0.07856045663356781,
0.0662664845585823,
0.01008673943579197,
0.014086829498410225,
0.04145462438464165,
0.06936201453208923,
0.051988862454891205,
0.005895149428397417,
-0.0333535373210907,
-0.047863610088825226,
-0.08520922064781189,
-0.05110834166407585,
0.0008991103386506438,
0.01538991741836071,
-0.06727464497089386,
-0.05741447955369949,
0.00550176901742816,
0.16727939248085022,
-0.09970984607934952,
0.02135825715959072,
-0.064026840031147,
-0.042886897921562195,
-0.07833791524171829,
0.03234439715743065,
-0.10800524801015854,
0.025484278798103333,
0.06198706105351448,
-0.046306636184453964,
0.04477974772453308,
0.08370232582092285,
0.009528149850666523,
0.024641480296850204,
-0.05661869794130325,
-0.036129482090473175,
-0.04232275113463402,
-0.07186023890972137,
-0.10248041898012161,
-0.013020457699894905,
-0.08490181714296341,
-0.0266072079539299,
-0.05883350223302841,
-0.1772213578224182,
0.003994301427155733,
-0.005983369890600443,
0.020564718171954155,
0.02203173190355301,
-0.04067849740386009,
0.04673713445663452,
0.05221809446811676,
-0.04172976687550545,
-0.08922973275184631,
0.020657002925872803,
0.0240232702344656,
-0.0846664234995842,
-0.0343722440302372,
-0.08615569770336151,
-0.03296791389584541,
0.06721804291009903,
0.1007237508893013,
-0.09334851056337357,
-0.011776313185691833,
-0.02132667973637581,
-0.057719886302948,
-0.06521391123533249,
-0.0661117359995842,
0.1494172215461731,
0.004058501683175564,
0.16646675765514374,
-0.1456640362739563,
-0.050111185759305954,
-0.018613355234265327,
0.00934418011456728,
0.034499768167734146,
0.1589762419462204,
0.006901856511831284,
-0.10125558078289032,
0.05210883170366287,
-0.03920994699001312,
-0.06426235288381577,
0.18119412660598755,
-0.0036350495647639036,
-0.07708454877138138,
0.007812931202352047,
0.10911452770233154,
-0.01551466342061758,
0.16023479402065277,
-0.09983258694410324,
-0.0022295613307505846,
-0.003303108736872673,
0.019210143014788628,
0.040231138467788696,
-0.11759710311889648,
0.017467007040977478,
0.03686152398586273,
-0.08171136677265167,
-0.04133559763431549,
-0.010901587083935738,
-0.037928514182567596,
0.04195205867290497,
0.008116459473967552,
0.02256585843861103,
-0.020449526607990265,
-0.026071295142173767,
-0.09090344607830048,
0.18975761532783508,
-0.07676463574171066,
-0.22994092106819153,
-0.17791786789894104,
0.097007617354393,
-0.03725707158446312,
-0.017472494393587112,
0.03820022940635681,
-0.10865839570760727,
-0.06290102005004883,
-0.09669753909111023,
0.1140500083565712,
-0.12033835053443909,
-0.0009412788785994053,
-0.038202255964279175,
0.06235368177294731,
0.048717815428972244,
-0.17127543687820435,
0.02634812705218792,
-0.013088062405586243,
-0.002739141695201397,
-0.014320521615445614,
-0.05494735389947891,
0.09015171974897385,
0.10899612307548523,
-0.06890992075204849,
0.028706993907690048,
0.0023915329948067665,
0.1543045938014984,
-0.0477379709482193,
0.038345251232385635,
0.18506458401679993,
0.03248697891831398,
0.024017302319407463,
0.05266840383410454,
0.013312851078808308,
-0.100676991045475,
0.06452576071023941,
0.058435264974832535,
-0.05306590348482132,
-0.23018549382686615,
-0.014691336080431938,
-0.08293277770280838,
0.06566299498081207,
0.12152563035488129,
0.06113767996430397,
-0.15137165784835815,
0.019658688455820084,
-0.0008721859776414931,
0.1461135447025299,
-0.02662822976708412,
0.05417656898498535,
0.031261079013347626,
0.009573161602020264,
0.006015363615006208,
-0.09487012773752213,
0.026031676679849625,
0.0757814571261406,
0.10449369251728058,
0.2115742564201355,
-0.08838318288326263,
0.19396564364433289,
0.01371009647846222,
0.10845837742090225,
0.04722592234611511,
0.0860375389456749,
-0.13646702468395233,
0.009295637719333172,
0.009019134566187859,
-0.019557900726795197,
-0.06827002018690109,
0.048995304852724075,
-0.0377628467977047,
0.06982972472906113,
-0.05025368928909302,
-0.001771673676557839,
0.012183169834315777,
0.18654680252075195,
0.058403197675943375,
-0.1704249233007431,
-0.12048023194074631,
0.022960953414440155,
-0.08967942744493484,
-0.11948270350694656,
0.07355436682701111,
0.2337626963853836,
-0.048378076404333115,
0.025316162034869194,
-0.008459565229713917,
0.13856440782546997,
-0.10819505155086517,
-0.02326253429055214,
0.022155389189720154,
0.07189968228340149,
0.007247687317430973,
0.11883381754159927,
-0.2618955373764038,
0.07759474217891693,
0.016833659261465073,
0.08730588108301163,
-0.012323944829404354,
0.05444212630391121,
-0.05195554718375206,
0.00033405038993805647,
0.07555986195802689,
0.007659690920263529,
-0.06005236133933067,
-0.20021893084049225,
-0.042385783046483994,
0.02740846574306488,
0.03401291370391846,
-0.014153623953461647,
0.06937918066978455,
-0.014856484718620777,
0.042151208966970444,
-0.0388997346162796,
-0.1409204602241516,
-0.05829593539237976,
-0.12723636627197266,
-0.040567465126514435,
0.014586031436920166,
-0.04152122884988785,
-0.017543477937579155,
0.03674035891890526,
0.05312513932585716,
0.24546843767166138,
-0.1573668271303177,
-0.0988808125257492,
-0.08880709856748581,
0.06969951093196869,
0.14147073030471802,
-0.09236159175634384,
0.02885536663234234,
0.01422572135925293,
0.04286837577819824,
-0.04227553308010101,
-0.06231305003166199,
0.03222854062914848,
-0.05017860233783722,
-0.08237574249505997,
-0.026805609464645386,
0.10925992578268051,
-0.02155066467821598,
0.03544999286532402,
0.001410342869348824,
-0.07587379962205887,
-0.031309276819229126,
-0.13674816489219666,
-0.0788823813199997,
0.019381918013095856,
0.02255411073565483,
-0.01742108166217804,
-0.10101515799760818,
0.08069569617509842,
0.016980893909931183,
-0.0930042713880539,
0.05893849954009056,
0.1672983467578888,
-0.07175147533416748,
0.03352314978837967,
0.11329099535942078,
-0.0547076091170311,
-0.154190331697464,
-0.03174068406224251,
0.03765755519270897,
0.08265583962202072,
-0.044653329998254776,
-0.14727741479873657,
0.055171918123960495,
0.04038917273283005,
0.012204103171825409,
0.03473174199461937,
-0.28405997157096863,
-0.13010090589523315,
0.0007096812478266656,
0.07945814728736877,
0.0843496173620224,
-0.10715801268815994,
-0.05122663825750351,
-0.061462707817554474,
-0.0733630582690239,
0.05327940359711647,
0.058088645339012146,
0.12010979652404785,
-0.05246555060148239,
0.021573953330516815,
0.040681272745132446,
-0.02930390276014805,
0.07162447273731232,
-0.013935158960521221,
0.09500405937433243,
-0.020753856748342514,
0.04941873624920845,
0.04541352018713951,
-0.056476980447769165,
0.19696210324764252,
-0.17289988696575165,
0.09091145545244217,
-0.18412084877490997,
-0.053317002952098846,
-0.026051415130496025,
-0.0014272215776145458,
-0.03415100648999214,
-0.057622119784355164,
-0.1109653189778328,
0.018137823790311813,
0.03564153611660004,
-0.02669551409780979,
0.058931395411491394,
-0.030929094180464745,
-0.039625730365514755,
0.08585626631975174,
0.07045259326696396,
-0.03222116827964783,
-0.13428670167922974,
0.03299181908369064,
0.023653225973248482,
0.10074068605899811,
-0.21506480872631073,
0.015017956495285034,
0.10860446840524673,
0.02366885356605053,
0.09602474421262741,
0.0008024506387300789,
-0.08003084361553192,
0.04050545394420624,
0.07344187796115875,
-0.05656620115041733,
-0.09354139864444733,
-0.010135715827345848,
-0.06506584584712982,
-0.08279445022344589,
0.026622634381055832,
0.08714327216148376,
-0.06472276896238327,
-0.01088064443320036,
-0.005432347767055035,
0.01771026849746704,
-0.06309877336025238,
0.19383420050144196,
0.018574247136712074,
0.0825011283159256,
-0.07159015536308289,
0.08183833211660385,
0.09874314814805984,
-0.11961618810892105,
0.017821045592427254,
0.17930391430854797,
-0.08749659359455109,
-0.022228315472602844,
0.04792289435863495,
0.09708632528781891,
-0.023288073018193245,
-0.05049809440970421,
-0.08635908365249634,
-0.07130935043096542,
0.02176445536315441,
0.022026238963007927,
0.07112658768892288,
0.08247518539428711,
-0.028367094695568085,
0.0033954421523958445,
-0.10472515970468521,
0.09654530137777328,
0.07452144473791122,
0.052248239517211914,
-0.12544554471969604,
0.09832597523927689,
0.052591923624277115,
0.07482987642288208,
0.0015002517029643059,
0.02418207935988903,
-0.10641597956418991,
0.03875064104795456,
-0.025447091087698936,
0.04194926097989082,
-0.008592788130044937,
0.05226721987128258,
-0.046410948038101196,
0.024913974106311798,
-0.029907096177339554,
0.050571709871292114,
-0.03413138538599014,
-0.03020535223186016,
-0.03838960826396942,
0.03556032106280327,
-0.06459449231624603,
-0.02052554488182068,
0.011393632739782333,
-0.07595599442720413,
0.09793318063020706,
-0.0695897787809372,
-0.014222366735339165,
0.0017892165342345834,
0.002922927727922797,
0.07104039192199707,
0.03808113560080528,
0.04153609648346901,
-0.00688360957428813,
0.0034900526516139507,
0.04229839891195297,
0.019763223826885223,
-0.002875213511288166,
-0.002146337181329727,
0.07306712120771408,
-0.15718577802181244,
-0.09288786351680756,
-0.08077673614025116,
-0.07653963565826416,
-0.057654887437820435,
0.07349058240652084,
0.09893335402011871,
0.08123112469911575,
0.08371724933385849,
-0.03291507437825203,
-0.000749334052670747,
-0.15745654702186584,
-0.04586144909262657,
0.047770313918590546,
-0.018039749935269356,
-0.09593619406223297,
-0.0301457978785038,
0.05495632439851761,
-0.04138903692364693,
0.11968877911567688,
0.0006835661479271948,
0.06142204999923706,
-0.011778845451772213,
-0.04466281831264496,
-0.02545088157057762,
-0.004134008660912514,
0.17770259082317352,
-0.09413298219442368,
0.007850175723433495,
-0.0014747476670891047,
0.01388853695243597,
0.03618916869163513,
0.17813971638679504,
0.07694396376609802,
0.12246919423341751,
0.04244345799088478,
0.0772222951054573,
-0.049449726939201355,
-0.02541121281683445,
-0.15402808785438538,
0.08521962910890579,
-0.021891353651881218,
0.0480816476047039,
-0.042419709265232086,
0.11742308735847473,
0.13797974586486816,
-0.13847452402114868,
0.1019086018204689,
0.028460508212447166,
-0.09774617105722427,
-0.05016162618994713,
-0.10912565886974335,
-0.05385701730847359,
-0.10612601041793823,
0.013169161975383759,
-0.1035892516374588,
0.028110235929489136,
0.07549875229597092,
0.041516486555337906,
-0.027804424986243248,
0.14721490442752838,
0.020792700350284576,
-0.059862032532691956,
0.03170754760503769,
0.04759204387664795,
0.029165957123041153,
0.10991220921278,
0.019768958911299706,
0.06866810470819473,
-0.0715164914727211,
0.06261087208986282,
0.03338736295700073,
0.0001737798738759011,
-0.0011135328095406294,
0.014335326850414276,
0.0013691469794139266,
-0.05525178834795952,
0.006775729823857546,
0.07293310016393661,
0.17158366739749908,
0.05623067542910576,
-0.054497942328453064,
-0.04515305534005165,
0.20174801349639893,
-0.045528993010520935,
-0.0416620634496212,
-0.11473109573125839,
0.16082577407360077,
0.05174219235777855,
0.011922982521355152,
0.014570089988410473,
-0.0616583526134491,
-0.03670389950275421,
0.22774328291416168,
0.03685956448316574,
-0.012953951954841614,
-0.03554533049464226,
-0.02393740601837635,
-0.012054547667503357,
-0.03440535441040993,
0.1627906858921051,
0.0035572426859289408,
0.239132821559906,
0.014908659271895885,
0.005625826306641102,
-0.04296520724892616,
-0.049417559057474136,
-0.026508668437600136,
0.19195878505706787,
-0.04620853438973427,
0.027142029255628586,
-0.09641449898481369,
-0.01193202380090952,
0.02165103517472744,
-0.10145389288663864,
0.11912127584218979,
-0.12057802826166153,
-0.07633032649755478,
0.01219265628606081,
0.0630451887845993,
-0.03581636771559715,
0.030190587043762207,
-0.017299678176641464,
0.06664631515741348,
0.02982238121330738,
-0.03491596132516861,
-0.107551209628582,
-0.1677154153585434,
0.03763394430279732,
-0.014461515471339226,
0.1389586329460144,
0.014637526124715805,
0.06963469088077545,
0.0811675488948822,
0.016065461561083794,
-0.08045174926519394,
0.10479303449392319,
0.026938622817397118,
-0.004966534208506346,
0.0518757700920105,
0.13621798157691956,
-0.04157227277755737,
0.15710727870464325,
0.006396337412297726,
-0.015360452234745026,
-0.03798818960785866,
-0.037324655801057816,
0.005227097310125828,
-0.1603068709373474,
0.00442270515486598,
-0.05385543406009674,
0.12634128332138062,
0.1901666522026062,
-0.04160336032509804,
-0.02992251329123974,
-0.05616028606891632,
0.08503245562314987,
-0.020816095173358917,
0.08013254404067993,
0.004833193961530924,
-0.16696108877658844,
0.008517072536051273,
0.017080247402191162,
0.00344392959959805,
-0.1743651181459427,
-0.048512812703847885,
-0.0346134789288044,
-0.028509171679615974,
-0.08372171968221664,
0.14607393741607666,
0.05495736002922058,
0.03705178201198578,
-0.03294089436531067,
-0.19522139430046082,
-0.00603668624535203,
0.05197127163410187,
-0.13539502024650574,
-0.12484771013259888
] |
null | null |
transformers
|
# CodeTrans model for code comment generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code comment generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_comment_generation_java_transfer_learning_finetune", skip_special_tokens=True),
    device=0
)
tokenized_code = "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/code%20comment%20generation/small_model.ipynb).
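The pipeline also accepts a list of tokenized functions, which is convenient when commenting several methods at once. A small sketch reusing the `pipeline` object from the snippet above (the second input is a made-up example):
```python
batch = [
    "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }",
    "public int add ( int a , int b ) { return a + b ; }",
]
for summary in pipeline(batch):
    print(summary["summary_text"])
```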
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
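As a rough sketch of that optimizer setup, the `Adafactor` class in `transformers` derives an inverse square root step size internally when `relative_step=True` and no explicit learning rate is given; whether this exact configuration matches the original training code is an assumption:
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_small_code_comment_generation_java_transfer_learning_finetune"
)
# lr=None + relative_step=True makes Adafactor compute an inverse square
# root step size on its own, mirroring the schedule described above
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    relative_step=True,
    scale_parameter=True,
    warmup_init=True,
)
```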
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 37.98 |
| CodeTrans-ST-Base | 38.07 |
| CodeTrans-TF-Small | 38.56 |
| CodeTrans-TF-Base | 39.06 |
| CodeTrans-TF-Large | **39.50** |
| CodeTrans-MT-Small | 20.15 |
| CodeTrans-MT-Base | 27.44 |
| CodeTrans-MT-Large | 34.69 |
| CodeTrans-MT-TF-Small | 38.37 |
| CodeTrans-MT-TF-Base | 38.90 |
| CodeTrans-MT-TF-Large | 39.25 |
| State of the art | 38.17 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "protected String renderUri ( URI uri ) { return uri . toASCIIString ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_comment_generation_java_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code comment generation java
================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code comment generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
109
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 750,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.10333279520273209,
0.06834542006254196,
-0.0009456884581595659,
0.11138525605201721,
0.048245370388031006,
0.022077471017837524,
0.011182071641087532,
0.10962633043527603,
-0.012961866334080696,
0.05390144884586334,
0.058436665683984756,
-0.05961168557405472,
0.06164182350039482,
0.1947186291217804,
0.018949756398797035,
-0.16594746708869934,
-0.028778495267033577,
0.030326787382364273,
-0.06493597477674484,
0.11117542535066605,
0.0761711597442627,
-0.08030033111572266,
0.0764077678322792,
-0.05112574249505997,
-0.11937139183282852,
0.04925642907619476,
-0.024735212326049805,
-0.029179394245147705,
0.09161848574876785,
0.06269721686840057,
0.1120266392827034,
-0.020466206595301628,
0.06735508143901825,
-0.18882326781749725,
0.0018331374740228057,
0.0319070927798748,
0.061555344611406326,
0.0378417894244194,
0.03946254774928093,
0.08024489879608154,
0.1377808302640915,
-0.012017403729259968,
0.04144454747438431,
0.06058891862630844,
-0.06667779386043549,
-0.07762786746025085,
-0.055467694997787476,
0.0579313188791275,
0.08971060812473297,
0.10340442508459091,
-0.007372293155640364,
-0.0005640094168484211,
-0.07750362902879715,
0.08252917230129242,
0.11467965692281723,
-0.21442453563213348,
-0.020271968096494675,
0.12018804997205734,
0.08291065692901611,
0.05693487450480461,
-0.08617015928030014,
-0.03206498548388481,
0.10764312744140625,
0.03669687733054161,
0.0807361826300621,
-0.09650862962007523,
-0.03768535330891609,
-0.002335974248126149,
-0.049989208579063416,
-0.05286422371864319,
0.16876907646656036,
0.04189538583159447,
-0.050055574625730515,
-0.11078301072120667,
-0.05374494567513466,
-0.18922218680381775,
0.04662502184510231,
0.010355468839406967,
0.008241446688771248,
-0.005528260953724384,
0.009402696043252945,
-0.018332768231630325,
-0.08416248857975006,
-0.12578555941581726,
0.028659258037805557,
-0.004438887815922499,
0.05417685583233833,
0.039112452417612076,
-0.018286945298314095,
0.08731130510568619,
0.011760330758988857,
-0.05414576828479767,
-0.02087317779660225,
0.008821144700050354,
-0.11200568079948425,
0.020453857257962227,
0.007290815003216267,
-0.05243564769625664,
-0.011730010621249676,
0.0653083473443985,
-0.0987173393368721,
0.07872844487428665,
0.10366348177194595,
0.016975970938801765,
0.0038922957610338926,
0.21535524725914001,
0.037903353571891785,
-0.16380952298641205,
0.026128146797418594,
0.02969132363796234,
-0.00612848624587059,
0.013035848736763,
-0.05373673513531685,
-0.05485929176211357,
0.006454435642808676,
0.07262063771486282,
-0.13498111069202423,
0.020616166293621063,
-0.04912133514881134,
-0.00938254315406084,
0.09147681295871735,
-0.11917158216238022,
0.030136508867144585,
0.008331394754350185,
-0.06802838295698166,
-0.04438753426074982,
0.08774711191654205,
-0.13142286241054535,
-0.11615905910730362,
0.031145809218287468,
-0.04937523603439331,
-0.03759413957595825,
-0.12484697252511978,
-0.11970919370651245,
-0.005763823166489601,
-0.04299933463335037,
-0.0024033619556576014,
-0.09835471957921982,
-0.10259246081113815,
-0.020519912242889404,
0.036284130066633224,
-0.003040978452190757,
-0.02560577541589737,
-0.0524241141974926,
0.0068297856487333775,
-0.003951001446694136,
-0.030590426176786423,
0.028749793767929077,
-0.037292372435331345,
0.09948953241109848,
0.06401101499795914,
0.05565820261836052,
0.005631160922348499,
0.026338333263993263,
-0.09345008432865143,
0.07963477075099945,
-0.11541958153247833,
0.0576142780482769,
-0.016813311725854874,
0.06569090485572815,
-0.10631712526082993,
-0.08907213807106018,
0.009095258079469204,
0.048837535083293915,
0.0722261443734169,
0.034272536635398865,
-0.13083086907863617,
0.027258586138486862,
0.1398191899061203,
-0.10992320626974106,
-0.13503055274486542,
0.10869935154914856,
-0.0036463597789406776,
0.04455847293138504,
0.060416486114263535,
0.1370263695716858,
0.15088814496994019,
-0.09160330891609192,
-0.02463906817138195,
0.07925023138523102,
0.04249795526266098,
-0.08287210017442703,
0.05650128796696663,
0.013982084579765797,
0.002976964693516493,
0.029405023902654648,
0.07208019495010376,
0.05801998823881149,
-0.001049655955284834,
-0.03379175812005997,
-0.0392649807035923,
-0.0918118879199028,
-0.06068740412592888,
0.0006706280983053148,
0.017117654904723167,
-0.051255982369184494,
-0.058192331343889236,
0.01167514082044363,
0.1651785969734192,
-0.10455510765314102,
0.021970296278595924,
-0.06255923956632614,
-0.040044933557510376,
-0.07792859524488449,
0.029055843129754066,
-0.11084452271461487,
0.03351617977023125,
0.0592096745967865,
-0.027024490758776665,
0.03881226107478142,
0.08745688945055008,
0.010873078368604183,
0.018000656738877296,
-0.059352826327085495,
-0.04305431246757507,
-0.03729438781738281,
-0.06858676671981812,
-0.10957524925470352,
-0.017051633447408676,
-0.08977755904197693,
-0.022088710218667984,
-0.061994921416044235,
-0.1777031421661377,
0.00196588272228837,
-0.013252267614006996,
0.022181712090969086,
0.028193704783916473,
-0.03333452716469765,
0.041580792516469955,
0.04933667555451393,
-0.037894539535045624,
-0.08941658586263657,
0.022089174017310143,
0.02649184688925743,
-0.08278580009937286,
-0.032174233347177505,
-0.08552616089582443,
-0.03869988024234772,
0.07191117107868195,
0.09526617079973221,
-0.10766401141881943,
-0.013800705783069134,
-0.029603321105241776,
-0.05018395930528641,
-0.06196524202823639,
-0.05894584208726883,
0.15297025442123413,
0.008495021611452103,
0.16616609692573547,
-0.1454927623271942,
-0.05936947092413902,
-0.01860441081225872,
0.01679825410246849,
0.03795329108834267,
0.16019459068775177,
0.019001739099621773,
-0.10575734078884125,
0.04371015727519989,
-0.03196084126830101,
-0.05217360705137253,
0.1713133454322815,
-0.009746518917381763,
-0.07358204573392868,
0.0045265625230968,
0.10166262090206146,
-0.01203641016036272,
0.1771949678659439,
-0.06441715359687805,
0.0008828800055198371,
-0.00752782030031085,
0.019916947931051254,
0.0379645898938179,
-0.12141767144203186,
0.01934044621884823,
0.03664952144026756,
-0.07678715139627457,
-0.03730485588312149,
-0.0139126842841506,
-0.04270685464143753,
0.04602489620447159,
0.014845154248178005,
0.025927620008587837,
-0.0232082549482584,
-0.028492972254753113,
-0.09754439443349838,
0.1862928569316864,
-0.07274790108203888,
-0.22220055758953094,
-0.16978564858436584,
0.12379492074251175,
-0.013763193041086197,
-0.019185757264494896,
0.02845507487654686,
-0.0954270139336586,
-0.0645638257265091,
-0.10605549812316895,
0.10903438180685043,
-0.1148320659995079,
-0.004145360551774502,
-0.04228353872895241,
0.06577929854393005,
0.04576421156525612,
-0.1668144315481186,
0.029470982030034065,
-0.018873296678066254,
0.0050376541912555695,
-0.02301804907619953,
-0.06014145910739899,
0.08767624199390411,
0.10295755416154861,
-0.07383446395397186,
0.02682337909936905,
-0.0059017036110162735,
0.16011810302734375,
-0.05446610227227211,
0.050076767802238464,
0.17668139934539795,
0.029876256361603737,
0.02094060555100441,
0.05775623768568039,
0.00806126743555069,
-0.09810420125722885,
0.06965639442205429,
0.05262880027294159,
-0.03942453861236572,
-0.21734826266765594,
-0.01479440089315176,
-0.07885687053203583,
0.07420085370540619,
0.12018343806266785,
0.04965483024716377,
-0.15745531022548676,
0.016432208940386772,
-0.002902895212173462,
0.15282288193702698,
-0.02616744488477707,
0.05356995016336441,
0.014058131724596024,
0.00687683280557394,
0.0016564605757594109,
-0.09930931776762009,
0.019190531224012375,
0.06984607875347137,
0.0982823446393013,
0.21360713243484497,
-0.10139891505241394,
0.1741691678762436,
0.008988873101770878,
0.11413770914077759,
0.046361085027456284,
0.09913592785596848,
-0.13796204328536987,
0.012549495324492455,
0.006329206749796867,
-0.01719159632921219,
-0.07532669603824615,
0.045725267380476,
-0.03757977485656738,
0.0773359090089798,
-0.06872887909412384,
0.005097732413560152,
0.012851965613663197,
0.19212113320827484,
0.065489262342453,
-0.16604015231132507,
-0.127957284450531,
0.015736045315861702,
-0.08997121453285217,
-0.11340396106243134,
0.07338578253984451,
0.23875953257083893,
-0.05074185132980347,
0.010006068274378777,
-0.010905244387686253,
0.13457614183425903,
-0.10670772939920425,
-0.022976722568273544,
0.026612797752022743,
0.06895704567432404,
-0.0006664046668447554,
0.11755328625440598,
-0.26519057154655457,
0.09137510508298874,
0.018740074709057808,
0.09470801055431366,
-0.01309682335704565,
0.04739423096179962,
-0.05128353089094162,
-0.004290448036044836,
0.08172275871038437,
0.012757785618305206,
-0.056755054742097855,
-0.20610734820365906,
-0.040838465094566345,
0.02797938510775566,
0.03678644075989723,
-0.008700951002538204,
0.07386166602373123,
-0.018281619995832443,
0.03636922687292099,
-0.02956855297088623,
-0.13282650709152222,
-0.06475120782852173,
-0.12884421646595,
-0.04555825516581535,
0.0009346152655780315,
-0.04063975438475609,
-0.019663212820887566,
0.04123879596590996,
0.0642397478222847,
0.23395729064941406,
-0.16388212144374847,
-0.08556681126356125,
-0.09442593902349472,
0.06889703124761581,
0.14128687977790833,
-0.08717766404151917,
0.02739257737994194,
0.023257065564393997,
0.0379478894174099,
-0.03814557567238808,
-0.06513190269470215,
0.03715535253286362,
-0.05129767209291458,
-0.08091593533754349,
-0.029339471831917763,
0.10420379042625427,
-0.013760856352746487,
0.04233028367161751,
0.0043602073565125465,
-0.08111255615949631,
-0.029238374903798103,
-0.13341934978961945,
-0.08155737072229385,
0.011551427654922009,
0.03674900531768799,
-0.014464883133769035,
-0.11407861858606339,
0.08242060244083405,
0.01091908197849989,
-0.08854662626981735,
0.05985280126333237,
0.1554964929819107,
-0.07263710349798203,
0.027405472472310066,
0.10036341845989227,
-0.05703074485063553,
-0.1685742288827896,
-0.02739207074046135,
0.03503122180700302,
0.08234784752130508,
-0.04846698045730591,
-0.13888055086135864,
0.05061066150665283,
0.020648600533604622,
0.017608024179935455,
0.02271217852830887,
-0.26807111501693726,
-0.13048040866851807,
-0.00420879852026701,
0.0812300369143486,
0.07485232502222061,
-0.09619033336639404,
-0.043214354664087296,
-0.0604131855070591,
-0.06660735607147217,
0.06516462564468384,
0.07147041708230972,
0.11458156257867813,
-0.04695005342364311,
0.02289736457169056,
0.043630652129650116,
-0.03266996890306473,
0.054636672139167786,
-0.020260430872440338,
0.10210494697093964,
-0.019849948585033417,
0.03160820156335831,
0.03822663053870201,
-0.05961857736110687,
0.1947362720966339,
-0.16753852367401123,
0.10137666016817093,
-0.1839456707239151,
-0.04944174736738205,
-0.029112059623003006,
-0.00034206578857265413,
-0.03571154922246933,
-0.05574386566877365,
-0.11538968980312347,
0.035964418202638626,
0.047439251095056534,
-0.027210041880607605,
0.03864661231637001,
-0.02651042863726616,
-0.0468120276927948,
0.0702924132347107,
0.08673036843538284,
-0.021565215662121773,
-0.11612516641616821,
0.042993269860744476,
0.021091364324092865,
0.10513246804475784,
-0.2135111391544342,
0.018826927989721298,
0.11492995172739029,
0.01924554817378521,
0.0981360673904419,
0.004495385568588972,
-0.08035161346197128,
0.02572692558169365,
0.06637220829725266,
-0.06802040338516235,
-0.08610668778419495,
-0.013160875998437405,
-0.05547573044896126,
-0.09037083387374878,
0.02512751705944538,
0.08476406335830688,
-0.06920764595270157,
-0.007505182176828384,
-0.006755246315151453,
0.015593000687658787,
-0.06709259748458862,
0.19178687036037445,
0.0195227712392807,
0.08096428960561752,
-0.05975651368498802,
0.08501327782869339,
0.09200000017881393,
-0.12803763151168823,
0.024901511147618294,
0.16444797813892365,
-0.09035273641347885,
-0.022008372470736504,
0.06999394297599792,
0.1099645122885704,
-0.022783782333135605,
-0.04806206375360489,
-0.08959156274795532,
-0.07670996338129044,
0.022528955712914467,
0.04472341388463974,
0.06436042487621307,
0.08471567928791046,
-0.02534819208085537,
0.0007490639109164476,
-0.1201402023434639,
0.0984516367316246,
0.07676256448030472,
0.04330703616142273,
-0.12129773944616318,
0.12050002068281174,
0.04403122141957283,
0.07283248752355576,
0.000011813640412583482,
0.031788840889930725,
-0.10896920412778854,
0.03691820055246353,
-0.016148926690220833,
0.036562491208314896,
-0.0008540523122064769,
0.04720285162329674,
-0.04701721668243408,
0.029807234182953835,
-0.02866983972489834,
0.04847466200590134,
-0.03494071215391159,
-0.02628221921622753,
-0.03452959656715393,
0.030775679275393486,
-0.058857012540102005,
-0.02408806048333645,
0.006702598184347153,
-0.07986391335725784,
0.09211253374814987,
-0.06663721799850464,
-0.008553115651011467,
-0.0018968121148645878,
0.011835071258246899,
0.06413587927818298,
0.023729952052235603,
0.046307265758514404,
-0.001533690607175231,
0.0058363392017781734,
0.038526348769664764,
0.01912231370806694,
-0.009543873369693756,
-0.007479609921574593,
0.07424063980579376,
-0.14946307241916656,
-0.0862467810511589,
-0.07881145924329758,
-0.06495750695466995,
-0.0637018159031868,
0.0777023434638977,
0.09131966531276703,
0.07476784288883209,
0.08656520396471024,
-0.03457708656787872,
0.00003787954483414069,
-0.1671580970287323,
-0.044722650200128555,
0.05421457067131996,
-0.016665378585457802,
-0.09202800691127777,
-0.03466605022549629,
0.0615643747150898,
-0.04086949676275253,
0.11198381334543228,
0.0024521159939467907,
0.06232403591275215,
-0.011644688434898853,
-0.05931193381547928,
-0.03191874176263809,
0.002885041991248727,
0.17863012850284576,
-0.09872040897607803,
0.005505906417965889,
-0.008720876649022102,
0.017819279804825783,
0.03409962356090546,
0.18134833872318268,
0.09179789572954178,
0.12250220030546188,
0.03707864508032799,
0.07765444368124008,
-0.05234699696302414,
-0.031758081167936325,
-0.12341929227113724,
0.07696381211280823,
-0.027356235310435295,
0.047331683337688446,
-0.04291205480694771,
0.11933628469705582,
0.11218486726284027,
-0.1403203159570694,
0.10254427790641785,
0.017151108011603355,
-0.09988369792699814,
-0.04540589079260826,
-0.1024160385131836,
-0.04707223176956177,
-0.10313212871551514,
0.009934433735907078,
-0.10512647777795792,
0.01447982620447874,
0.05374514311552048,
0.03545226529240608,
-0.02641129679977894,
0.1462336927652359,
-0.0013895308366045356,
-0.05353656783699989,
0.04179411008954048,
0.049792177975177765,
0.023117925971746445,
0.0884159505367279,
0.022545402869582176,
0.06803335249423981,
-0.07967832684516907,
0.060057397931814194,
0.029432449489831924,
-0.008197366259992123,
0.006658375728875399,
0.03439032658934593,
-0.000716584618203342,
-0.05474500730633736,
-0.00008536968380212784,
0.07485765218734741,
0.1671004444360733,
0.05349121242761612,
-0.04448241740465164,
-0.047332968562841415,
0.20303577184677124,
-0.05321867763996124,
-0.045340921729803085,
-0.11968202143907547,
0.17062939703464508,
0.049734022468328476,
0.01066216267645359,
0.017911676317453384,
-0.06634590774774551,
-0.026096494868397713,
0.24761523306369781,
0.04740268364548683,
-0.03407016769051552,
-0.03458955138921738,
-0.0197018813341856,
-0.011052917689085007,
-0.03779252618551254,
0.1601617932319641,
0.012045093812048435,
0.23264658451080322,
0.01088682934641838,
-0.004045944660902023,
-0.041231513023376465,
-0.04928240552544594,
-0.005783068481832743,
0.19211295247077942,
-0.036376744508743286,
0.027734968811273575,
-0.09455617517232895,
-0.017331726849079132,
0.014733885414898396,
-0.1260790228843689,
0.12075251340866089,
-0.1286340206861496,
-0.0698118656873703,
0.006515385117381811,
0.057799115777015686,
-0.03840314596891403,
0.0374339334666729,
-0.02081141620874405,
0.07215801626443863,
0.04602702334523201,
-0.03066118247807026,
-0.10409409552812576,
-0.159189373254776,
0.03446139767765999,
-0.021319247782230377,
0.1307346224784851,
0.013009409420192242,
0.07068091630935669,
0.0833563506603241,
0.01194353774189949,
-0.08205022662878036,
0.09721606224775314,
0.02965344674885273,
-0.004434662871062756,
0.0518115796148777,
0.1317128688097,
-0.04100470989942551,
0.16819609701633453,
0.00041289994260296226,
-0.026280025020241737,
-0.02730378322303295,
-0.04452258720993996,
-0.004988800268620253,
-0.15600302815437317,
0.0034068783279508352,
-0.04646133631467819,
0.1351887732744217,
0.19243064522743225,
-0.04310735687613487,
-0.01828438974916935,
-0.057272933423519135,
0.08098039776086807,
-0.016771510243415833,
0.07244183868169785,
0.0058593908324837685,
-0.15590491890907288,
0.006046828348189592,
-0.006973033770918846,
-0.0027453687507659197,
-0.17693041265010834,
-0.04450079798698425,
-0.04221067950129509,
-0.03794170543551445,
-0.08740545809268951,
0.14602893590927124,
0.0560532882809639,
0.045686133205890656,
-0.03521029278635979,
-0.1581193506717682,
-0.00579764973372221,
0.05515317618846893,
-0.13144372403621674,
-0.12176232039928436
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus go dataset.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go", skip_special_tokens=True),
    device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/go/small_model.ipynb).
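If no GPU is available, the same pipeline can be built with `device=-1` to run on CPU; everything else stays as in the snippet above (a sketch, not a separately recommended configuration):
```python
# CPU-only variant of the pipeline above; device=-1 selects the CPU
pipeline_cpu = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go", skip_special_tokens=True),
    device=-1,
)
pipeline_cpu([tokenized_code])
```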
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score; a minimal scoring sketch follows the results table):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
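
The sketch below shows how corpus-level BLEU can be computed with `sacrebleu`; the hypothesis/reference pair is made up, and the exact BLEU configuration used to produce the table above is not stated here:
```python
# pip install sacrebleu
import sacrebleu

hypotheses = ["returns true if the progress needs a snapshot abort"]        # model output (hypothetical)
references = [["returns true if the snapshot of the progress should abort"]]  # one reference stream

score = sacrebleu.corpus_bleu(hypotheses, references)
print(score.score)
```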
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_go
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus go dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
43,
111
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12972386181354523,
-0.011705702170729637,
-0.0008059580577537417,
0.05480985343456268,
0.15767131745815277,
0.02029873989522457,
0.11163502931594849,
0.03673721104860306,
0.010427008382976055,
-0.03600525110960007,
0.07933077961206436,
0.1134614571928978,
0.027360007166862488,
0.1601782888174057,
-0.02471579611301422,
-0.2098979949951172,
-0.009880627505481243,
0.05988004431128502,
-0.17780934274196625,
0.12097484618425369,
0.11748672276735306,
-0.04554397612810135,
0.09586193412542343,
0.012121782638132572,
-0.19246675074100494,
0.03598737716674805,
-0.004541174042969942,
-0.07786460220813751,
0.1432587057352066,
0.10010135918855667,
0.11450454592704773,
0.025738710537552834,
0.003534542629495263,
-0.18089888989925385,
0.04003550857305527,
-0.03848846256732941,
-0.00019531631551217288,
0.052995264530181885,
0.03647983819246292,
-0.05212412774562836,
0.19772373139858246,
0.010853441432118416,
0.005991050973534584,
0.06467925757169724,
-0.11651443690061569,
-0.06446222960948944,
-0.014414971694350243,
-0.0041517214849591255,
0.054225124418735504,
0.07965051382780075,
0.012845358811318874,
0.09112279862165451,
-0.1445215493440628,
0.12468301504850388,
0.10524245351552963,
-0.18041178584098816,
-0.012770118191838264,
0.1054760068655014,
0.08984427154064178,
-0.07421957701444626,
-0.037522681057453156,
0.015750808641314507,
0.07107500731945038,
0.01393736619502306,
0.0058560678735375404,
-0.10260310024023056,
-0.1157795712351799,
0.06177712604403496,
-0.07630299776792526,
-0.05975545197725296,
0.27181923389434814,
0.0033493302762508392,
-0.038269415497779846,
-0.06634755432605743,
-0.054443176835775375,
0.007624492049217224,
0.010637682862579823,
0.006392246577888727,
-0.00613047368824482,
-0.013296295888721943,
-0.018747499212622643,
-0.03283567726612091,
-0.11689314246177673,
-0.11861462146043777,
-0.022534973919391632,
0.11775632202625275,
-0.011598298326134682,
0.0344931036233902,
-0.15860261023044586,
0.098038449883461,
0.0588914230465889,
-0.06863703578710556,
0.019767312332987785,
-0.06683074682950974,
-0.05556870624423027,
-0.007619372569024563,
-0.06870295852422714,
-0.13625791668891907,
0.09425708651542664,
0.08650490641593933,
-0.053979046642780304,
0.048045020550489426,
0.031531985849142075,
0.07525519281625748,
0.06018465384840965,
0.19578249752521515,
-0.008456184528768063,
-0.07362684607505798,
0.03505593538284302,
-0.036309659481048584,
-0.04326443746685982,
-0.011831016279757023,
-0.07089342176914215,
-0.040275074541568756,
0.019205790013074875,
0.11248360574245453,
-0.08229131251573563,
0.08660333603620529,
-0.07144226133823395,
-0.0479452945291996,
0.01886702887713909,
-0.13304567337036133,
-0.039401594549417496,
0.01888578198850155,
-0.06019456312060356,
-0.03996233269572258,
0.12276379019021988,
-0.07224301993846893,
-0.11027302592992783,
-0.007599046919494867,
-0.06801874190568924,
-0.0069906944409012794,
-0.09002158790826797,
-0.07248701900243759,
0.0029392754659056664,
0.035379767417907715,
0.06583035737276077,
-0.12890039384365082,
-0.16565555334091187,
0.009700184687972069,
0.0885823518037796,
0.004303992725908756,
0.03770274668931961,
-0.08212803304195404,
-0.0021568124648183584,
-0.03810962662100792,
-0.03245208039879799,
0.008388319984078407,
-0.07577919214963913,
0.08355654031038284,
0.09344889968633652,
0.04156527295708656,
-0.07499179244041443,
0.04141826927661896,
-0.11679860949516296,
0.0581614151597023,
-0.18291005492210388,
0.07900179177522659,
-0.05570234730839729,
0.09592749923467636,
-0.08721961081027985,
-0.055610571056604385,
0.04126323387026787,
0.06625770777463913,
0.05979589745402336,
0.13505344092845917,
-0.08240441977977753,
-0.06803063303232193,
0.13747720420360565,
-0.12462066859006882,
-0.21187376976013184,
0.0782386064529419,
-0.06001299247145653,
0.17558661103248596,
0.05275506526231766,
0.17431773245334625,
0.18280363082885742,
-0.08627034723758698,
0.06893294304609299,
0.07764659076929092,
-0.0687057226896286,
-0.08453971892595291,
0.06168388947844505,
0.05533445253968239,
-0.12452489882707596,
0.05790534242987633,
-0.006542484275996685,
0.13139194250106812,
-0.051823779940605164,
-0.04752291738986969,
0.0007990702288225293,
-0.07325614243745804,
0.068101666867733,
-0.022812526673078537,
0.08445623517036438,
-0.010008146055042744,
-0.007615670561790466,
0.019209273159503937,
0.10046391189098358,
-0.10479315370321274,
-0.005642381496727467,
-0.1307850480079651,
0.07801391184329987,
-0.10784116387367249,
0.03172215819358826,
-0.21606479585170746,
-0.03128275275230408,
-0.0025852357503026724,
0.03764552250504494,
0.07404171675443649,
0.051404453814029694,
0.011964826844632626,
0.021312423050403595,
-0.012011422775685787,
-0.0031940892804414034,
0.0021610399708151817,
-0.017228281125426292,
-0.020393095910549164,
-0.09370335191488266,
-0.03499769791960716,
-0.05847203731536865,
0.02685607597231865,
-0.17000283300876617,
0.0006723730475641787,
0.046455200761556625,
0.06320646405220032,
0.013684632256627083,
0.02911175601184368,
0.02744455263018608,
0.058669887483119965,
-0.04217188432812691,
-0.01275530643761158,
0.06806235760450363,
0.03198957070708275,
-0.12636131048202515,
0.04495648294687271,
-0.06658601015806198,
0.07748355716466904,
0.11784954369068146,
-0.145955428481102,
-0.07215556502342224,
-0.06947391480207443,
-0.041736118495464325,
-0.022604400292038918,
0.019101904705166817,
-0.030385702848434448,
0.2213573008775711,
0.007395792752504349,
0.16313676536083221,
-0.08848626166582108,
-0.017526600509881973,
-0.02424602210521698,
-0.022252164781093597,
0.03366130590438843,
0.12617698311805725,
0.10381276160478592,
-0.21123746037483215,
0.042540811002254486,
0.0710887759923935,
-0.010706806555390358,
0.22635000944137573,
-0.04447339475154877,
-0.026584314182400703,
-0.034091707319021225,
0.07530935108661652,
-0.025526225566864014,
0.17692162096500397,
-0.20804725587368011,
-0.014755998738110065,
0.013677030801773071,
-0.011966153047978878,
0.12274335324764252,
-0.13120368123054504,
0.0036586676724255085,
0.030080052092671394,
-0.03525694087147713,
-0.11603476107120514,
0.038675278425216675,
0.004253702238202095,
0.03722592815756798,
-0.003314680652692914,
-0.015313178300857544,
0.0416572168469429,
-0.030654912814497948,
-0.12805108726024628,
0.2305881232023239,
-0.07028822600841522,
-0.2440977394580841,
-0.20168277621269226,
0.054000698029994965,
-0.04691414162516594,
0.004676144570112228,
0.0634446069598198,
-0.055769648402929306,
-0.03940645605325699,
-0.01968333125114441,
0.1660236418247223,
-0.02311559021472931,
-0.03326239809393883,
0.003245368367061019,
0.06294164061546326,
-0.007041992153972387,
-0.19100943207740784,
-0.016547948122024536,
-0.006359577178955078,
0.06597273796796799,
0.02927619218826294,
-0.14443501830101013,
0.11173561960458755,
0.10107576102018356,
-0.037680093199014664,
0.045893747359514236,
-0.04574831947684288,
0.2256115823984146,
-0.07346920669078827,
-0.07335282117128372,
0.19677549600601196,
-0.07995235174894333,
0.018891675397753716,
0.012503206729888916,
0.00723772868514061,
-0.10757918655872345,
0.03430573269724846,
-0.0420549176633358,
-0.07533567398786545,
-0.23896189033985138,
-0.11344828456640244,
-0.0891512855887413,
0.11438644677400589,
0.027414914220571518,
0.024689622223377228,
-0.06834996491670609,
0.06996941566467285,
0.06498974561691284,
0.09966712445020676,
0.0006116132717579603,
0.05203081667423248,
0.07579614967107773,
-0.012982008047401905,
0.0030690189450979233,
-0.10693804919719696,
-0.05661596730351448,
0.05609608814120293,
0.07862458378076553,
0.20816518366336823,
0.0036447462625801563,
0.1410573571920395,
0.05615190789103508,
0.016968678683042526,
0.03896177560091019,
0.20352938771247864,
-0.08170526474714279,
0.022958675399422646,
-0.01594078354537487,
-0.03229193761944771,
-0.15041936933994293,
0.03730107098817825,
-0.004110573325306177,
0.0037036179564893246,
-0.13759878277778625,
-0.04809682443737984,
0.0702473521232605,
0.06350530683994293,
-0.01656515710055828,
-0.2560632824897766,
-0.13315045833587646,
0.01611601933836937,
-0.04896062612533569,
-0.06723809242248535,
0.06113288924098015,
0.11717499047517776,
-0.12211889028549194,
0.007403738796710968,
-0.049689922481775284,
0.156194269657135,
-0.07866943627595901,
0.026652006432414055,
-0.06932755559682846,
-0.03602694720029831,
0.003807245986536145,
0.16462256014347076,
-0.19469887018203735,
0.22570453584194183,
-0.005653216503560543,
0.016571735963225365,
-0.08004945516586304,
0.029644377529621124,
0.021158386021852493,
0.06791461259126663,
0.11569850146770477,
-0.022481529042124748,
-0.04182478040456772,
-0.1626690924167633,
0.030495740473270416,
0.082771435379982,
0.08413926512002945,
-0.0316675640642643,
0.06709609925746918,
-0.022091660648584366,
0.03506132960319519,
-0.007748501840978861,
-0.10301070660352707,
-0.08925972133874893,
-0.09942901134490967,
0.012573366984724998,
-0.034376360476017,
0.06269581615924835,
-0.030987992882728577,
0.008374931290745735,
0.05921320989727974,
0.19180840253829956,
-0.08809977024793625,
-0.06970173865556717,
-0.11943904310464859,
0.02527763321995735,
0.11219604313373566,
-0.09152932465076447,
0.027680644765496254,
0.0046603926457464695,
0.021592389792203903,
-0.009198227897286415,
-0.14753423631191254,
0.0674084722995758,
-0.06762617081403732,
-0.019758475944399834,
-0.030191179364919662,
0.1115184873342514,
-0.012210668995976448,
-0.014106945134699345,
0.04089309275150299,
-0.08018199354410172,
-0.05644053965806961,
-0.1543789952993393,
-0.08939095586538315,
-0.059463921934366226,
0.026754871010780334,
0.07720654457807541,
-0.13030944764614105,
0.006471259519457817,
-0.009813074953854084,
-0.03475326672196388,
0.21061298251152039,
0.11804096400737762,
-0.028915369883179665,
0.024439379572868347,
0.14445284008979797,
-0.08383840322494507,
-0.2519531846046448,
-0.032530684024095535,
-0.02520158141851425,
0.08192821592092514,
0.008400386199355125,
-0.1438835710287094,
0.09249579906463623,
-0.035347919911146164,
0.034988369792699814,
0.028804145753383636,
-0.2296605408191681,
-0.11533249169588089,
0.13776397705078125,
0.1236821636557579,
0.118313729763031,
-0.09705523401498795,
-0.060245949774980545,
-0.08116339147090912,
-0.18856611847877502,
0.15406928956508636,
-0.0822342038154602,
0.10539856553077698,
0.006066019646823406,
0.0657060369849205,
0.021875791251659393,
-0.059329528361558914,
0.12315791100263596,
0.0009364417637698352,
0.07997794449329376,
-0.02721910923719406,
-0.11441028118133545,
0.1463351845741272,
-0.029088998213410378,
0.12956427037715912,
-0.0897977203130722,
0.08177072554826736,
-0.20006458461284637,
-0.05687020719051361,
-0.04246200621128082,
0.049954552203416824,
-0.00008488682215102017,
-0.06773470342159271,
-0.06446957588195801,
0.022147739306092262,
0.021686963737010956,
-0.001737515558488667,
0.07672185450792313,
-0.051670465618371964,
-0.016715873032808304,
0.11262015998363495,
0.1412864476442337,
-0.03478400409221649,
-0.05656952038407326,
0.022343480959534645,
0.015199496410787106,
0.103703074157238,
-0.19945941865444183,
0.07738416641950607,
0.13262073695659637,
0.024043239653110504,
0.10517188906669617,
0.0949825718998909,
-0.03872399032115936,
0.04595870152115822,
0.10527314990758896,
-0.13664570450782776,
-0.06749202311038971,
-0.07195714116096497,
-0.10370977967977524,
0.006268088705837727,
0.10725612938404083,
0.16102248430252075,
-0.0371827632188797,
0.009853613562881947,
-0.008551876991987228,
-0.02593080885708332,
-0.13336950540542603,
0.14005346596240997,
0.03040425293147564,
0.07420487701892853,
-0.07875879853963852,
0.06812800467014313,
0.0472322441637516,
-0.14262647926807404,
-0.031404122710227966,
0.08636368811130524,
-0.1268494576215744,
-0.07701308280229568,
-0.03682770952582359,
0.2727223038673401,
-0.14417748153209686,
-0.09591358155012131,
-0.15155534446239471,
-0.06921181827783585,
0.0019962976220995188,
0.23106227815151215,
0.10105011612176895,
0.08777470886707306,
-0.06196974962949753,
0.007959702052175999,
-0.08243581652641296,
0.051692720502614975,
0.10698913782835007,
-0.002395372139289975,
-0.10890853404998779,
0.08953224867582321,
-0.004788936581462622,
0.15700386464595795,
-0.06953740119934082,
-0.042466286569833755,
-0.1841660588979721,
0.08541200309991837,
-0.0947207659482956,
0.051455702632665634,
-0.06628809124231339,
0.0354904904961586,
0.00804993137717247,
-0.002802641363814473,
-0.032661501318216324,
0.04119260609149933,
-0.08190659433603287,
0.023905251175165176,
-0.001507982611656189,
0.06905972212553024,
-0.09483644366264343,
-0.0064124963246285915,
0.08694148808717728,
-0.06655368953943253,
0.10929792374372482,
0.035697076469659805,
-0.07544862478971481,
0.1052204892039299,
-0.16576725244522095,
-0.023770375177264214,
0.038091469556093216,
0.02354247309267521,
0.045296963304281235,
-0.04912565276026726,
0.03655547648668289,
0.020141907036304474,
0.030586061999201775,
-0.004655424039810896,
0.12309718132019043,
-0.12458427250385284,
-0.09058619290590286,
-0.02693640999495983,
-0.11917522549629211,
-0.04885660111904144,
0.021619442850351334,
0.033291686326265335,
0.09182766079902649,
0.0727752149105072,
-0.01953340880572796,
0.037673868238925934,
-0.07761594653129578,
-0.009660560637712479,
0.060349296778440475,
-0.07643971592187881,
-0.01743500679731369,
-0.10522130131721497,
0.03755327686667442,
-0.04812493175268173,
0.2215392142534256,
-0.015390874817967415,
0.139052614569664,
-0.015154850669205189,
-0.019061589613556862,
0.01858014613389969,
0.041328590363264084,
0.2284693717956543,
-0.03200164809823036,
0.05582078546285629,
-0.0699833407998085,
0.07130353897809982,
0.018851041793823242,
0.04036261886358261,
0.11363037675619125,
0.07995222508907318,
-0.02673446387052536,
0.10149593651294708,
0.018117709085345268,
0.022407079115509987,
-0.07775559276342392,
-0.12056349217891693,
0.08574000746011734,
0.039940983057022095,
-0.0366651713848114,
0.08783100545406342,
0.1131899356842041,
-0.0790272131562233,
0.10361219942569733,
0.00014653745165560395,
-0.09400904923677444,
-0.025759262964129448,
-0.023868883028626442,
-0.03851756080985069,
-0.14039246737957,
0.010971611365675926,
-0.1183321475982666,
-0.07629547268152237,
0.08690355718135834,
0.027758700773119926,
-0.057849474251270294,
0.2226591408252716,
-0.04449433833360672,
-0.06963539123535156,
0.047094110399484634,
-0.012751810252666473,
0.022667132318019867,
0.001985523384064436,
0.050298698246479034,
-0.015517348423600197,
0.010427986271679401,
0.01253654807806015,
0.04614269733428955,
-0.06457251310348511,
0.020779745653271675,
-0.04458062723278999,
-0.01949511654675007,
-0.04870733246207237,
0.040148090571165085,
-0.007847179658710957,
0.03954256325960159,
-0.004784026648849249,
-0.026732226833701134,
-0.007670840248465538,
0.21571876108646393,
-0.057328835129737854,
-0.10822033137083054,
-0.14197969436645508,
0.21497730910778046,
0.03792312741279602,
0.05644093453884125,
0.0001894171437015757,
-0.06900191307067871,
-0.046918582171201706,
0.2918294072151184,
0.21243056654930115,
-0.06927221268415451,
0.006541701965034008,
0.017892448231577873,
0.021072231233119965,
0.022573275491595268,
0.1281493753194809,
0.027603328227996826,
0.2693276107311249,
-0.020488707348704338,
-0.0686853751540184,
-0.04856473580002785,
-0.04298783093690872,
0.03282267600297928,
0.13482484221458435,
0.04256062209606171,
-0.023199617862701416,
-0.06371450424194336,
0.1079336479306221,
-0.14571917057037354,
-0.13061371445655823,
0.03724478930234909,
-0.13390086591243744,
-0.0783560648560524,
-0.06891849637031555,
0.06672828644514084,
-0.03157508373260498,
0.06449300795793533,
-0.04488911107182503,
-0.02307788096368313,
0.03596293553709984,
0.03720308095216751,
-0.1549084335565567,
-0.060128629207611084,
0.041680723428726196,
-0.041671812534332275,
0.12455214560031891,
-0.02843291498720646,
0.10946059226989746,
0.1025736853480339,
0.04392858222126961,
-0.024444904178380966,
0.051241591572761536,
0.059375859797000885,
-0.012633954174816608,
0.060304418206214905,
0.041785530745983124,
-0.03620486706495285,
0.11473040282726288,
-0.04373788461089134,
-0.13633930683135986,
0.04199931398034096,
0.048484522849321365,
-0.012891124933958054,
-0.10182803124189377,
-0.018964098766446114,
-0.10992587357759476,
0.09824704378843307,
0.14773240685462952,
-0.038654353469610214,
0.01150642428547144,
-0.07540108263492584,
0.14137671887874603,
0.008737224154174328,
-0.029392650350928307,
-0.08130553364753723,
-0.1442796140909195,
-0.02970166690647602,
0.04045885428786278,
-0.02472653053700924,
-0.2258729785680771,
-0.016265252605080605,
-0.04415855556726456,
0.014779319986701012,
-0.029776422306895256,
0.11700587719678879,
0.12468968331813812,
0.027435004711151123,
-0.025859760120511055,
-0.1493140310049057,
-0.025526568293571472,
0.06560952961444855,
-0.1335020363330841,
-0.15945906937122345
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/go/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
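For illustration, a minimal sketch of that schedule, assuming the constant-then-decay form from the T5 setup with a hypothetical 10,000-step warmup (the warmup constant is not stated in this card):
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    """Constant during warmup, then decays as 1/sqrt(step)."""
    return 1.0 / math.sqrt(max(step, warmup_steps))

# Learning rate at a few points of the 340,000-step pre-training run.
for step in (1, 10_000, 100_000, 340_000):
    print(step, inverse_sqrt_lr(step))
```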
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 340,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1540256142616272,
-0.032214198261499405,
-0.0001879863702924922,
0.12594641745090485,
0.11956120282411575,
0.0323236882686615,
0.06293468922376633,
0.06172460317611694,
-0.037407319992780685,
0.027376526966691017,
0.05089357867836952,
0.005650153383612633,
0.029152462258934975,
0.19737985730171204,
0.009095990099012852,
-0.10472337156534195,
-0.0274683628231287,
0.0493512824177742,
-0.06253276765346527,
0.13202255964279175,
0.08099769055843353,
-0.06660246104001999,
0.05812747776508331,
-0.06589803099632263,
-0.23742572963237762,
0.05766869708895683,
0.012659881263971329,
-0.05930839106440544,
0.09998320788145065,
0.058818891644477844,
0.1253940910100937,
-0.01105958130210638,
0.018441474065184593,
-0.12907327711582184,
0.012200181372463703,
0.004345435183495283,
0.023959102109074593,
0.014845972880721092,
0.03102404810488224,
0.036496590822935104,
0.16893038153648376,
0.00881879311054945,
0.03819600120186806,
0.06604119390249252,
-0.07428204268217087,
-0.11619344353675842,
-0.0009527758811600506,
0.02897518128156662,
0.034245215356349945,
0.11727835983037949,
-0.01765284314751625,
0.12401584535837173,
-0.14395476877689362,
0.1344253420829773,
0.08536320924758911,
-0.22775614261627197,
-0.01495254784822464,
0.11901668459177017,
0.08496467024087906,
0.08516401052474976,
-0.05004993453621864,
-0.052256882190704346,
0.10676081478595734,
0.0456048883497715,
0.02775752730667591,
-0.09029782563447952,
-0.06742274761199951,
0.010228736326098442,
-0.08146676421165466,
-0.06784268468618393,
0.2160416692495346,
0.0051714093424379826,
-0.08152154088020325,
-0.056401949375867844,
-0.03009343147277832,
-0.1439492106437683,
0.036384448409080505,
0.035843104124069214,
0.0018421841086819768,
-0.038691576570272446,
-0.0004472459258977324,
0.026711778715252876,
-0.07562191784381866,
-0.15290872752666473,
0.023136992007493973,
0.11061970889568329,
0.051881130784749985,
0.027488378807902336,
-0.09690597653388977,
0.10014117509126663,
0.03960074856877327,
-0.0537089966237545,
-0.02462160401046276,
-0.009680726565420628,
-0.10241131484508514,
0.029357576742768288,
-0.0607675239443779,
-0.16956886649131775,
0.013826100155711174,
0.032974936068058014,
-0.05617612972855568,
0.04946665093302727,
0.043292827904224396,
0.03962485492229462,
0.007270392961800098,
0.21862807869911194,
0.06373993307352066,
-0.12608759105205536,
0.0600590854883194,
0.04415522515773773,
-0.020025208592414856,
-0.005903212819248438,
-0.07696311920881271,
-0.09729692339897156,
0.09885665029287338,
0.10077682137489319,
-0.12971168756484985,
0.040195904672145844,
-0.07137428969144821,
-0.036181624978780746,
0.00928573403507471,
-0.15221719443798065,
0.002111664740368724,
0.03475695103406906,
-0.07385953515768051,
-0.05378144979476929,
0.10753122717142105,
-0.17733359336853027,
-0.14507097005844116,
-0.03602195158600807,
-0.06971680372953415,
-0.04014819860458374,
-0.16212350130081177,
-0.15998421609401703,
-0.011168375611305237,
-0.034678488969802856,
0.026712536811828613,
-0.10207010060548782,
-0.13637661933898926,
-0.03179735317826271,
0.02149682678282261,
0.014274937100708485,
-0.005434074904769659,
-0.08202168345451355,
-0.011066177859902382,
-0.021056486293673515,
-0.03559160605072975,
0.0032556166406720877,
-0.04686468839645386,
0.13178543746471405,
0.11152306199073792,
0.0516790896654129,
-0.03992877155542374,
0.05805797874927521,
-0.07876783609390259,
0.05635877698659897,
-0.11237282305955887,
0.09438851475715637,
-0.04974934458732605,
0.07794440537691116,
-0.02785794623196125,
-0.11130376160144806,
0.0709906592965126,
0.06921089440584183,
0.07756124436855316,
0.04751995578408241,
-0.11147208511829376,
-0.04229053482413292,
0.182246595621109,
-0.1258271187543869,
-0.14660412073135376,
0.10496773570775986,
-0.025451458990573883,
0.08402156084775925,
0.09304347634315491,
0.13388724625110626,
0.15352872014045715,
-0.04808979108929634,
0.008110983297228813,
0.044056959450244904,
0.03300093859434128,
-0.1452654004096985,
0.07892858237028122,
0.061821918934583664,
-0.0914095938205719,
0.05451708287000656,
-0.006999646779149771,
0.11511711031198502,
-0.013351279310882092,
-0.03081393800675869,
-0.04825429618358612,
-0.09053882956504822,
0.005981681868433952,
0.01896318607032299,
0.07538487762212753,
-0.08513117581605911,
-0.07820874452590942,
0.08634523302316666,
0.16549289226531982,
-0.12991198897361755,
0.0005762719083577394,
-0.08976816385984421,
0.06692077219486237,
-0.08030223101377487,
0.028651993721723557,
-0.16673225164413452,
0.02017677016556263,
0.06481222063302994,
-0.006419200915843248,
0.0764298290014267,
0.1246156394481659,
0.02265269309282303,
0.04286520183086395,
-0.012040642090141773,
-0.022097187116742134,
-0.11671161651611328,
-0.06275615096092224,
-0.07271528244018555,
-0.06818988174200058,
-0.08184905350208282,
-0.05531884729862213,
0.002977109747007489,
-0.20475627481937408,
0.014062625356018543,
0.008322142995893955,
-0.009854909032583237,
0.018485503271222115,
-0.013315940275788307,
0.021282590925693512,
0.07855092734098434,
-0.06024744361639023,
-0.03280080109834671,
0.04218919575214386,
0.024502677842974663,
-0.06524690240621567,
-0.0710848793387413,
-0.09417815506458282,
0.011107600294053555,
0.12476867437362671,
0.03439231589436531,
-0.09762220084667206,
0.018266893923282623,
-0.017932338640093803,
-0.04364347830414772,
0.022069621831178665,
-0.07000169157981873,
0.16233834624290466,
-0.013987069018185139,
0.19626174867153168,
-0.15309683978557587,
-0.036351658403873444,
-0.028203457593917847,
0.027691015973687172,
0.06236860901117325,
0.14630666375160217,
-0.008055274374783039,
-0.08688557893037796,
0.06464085727930069,
0.016459355130791664,
-0.11042861640453339,
0.2365443855524063,
-0.04738235101103783,
-0.09480147063732147,
0.034073356539011,
0.10311512649059296,
-0.004468541592359543,
0.1792811006307602,
-0.20760224759578705,
-0.029041685163974762,
0.0061819241382181644,
-0.0037731400225311518,
0.07004709541797638,
-0.13042742013931274,
0.007471146527677774,
0.013875605538487434,
-0.07119223475456238,
-0.08796179294586182,
-0.003039504401385784,
-0.015803134068846703,
0.04663388803601265,
-0.007744114380329847,
-0.03308694437146187,
0.017202986404299736,
-0.03278139606118202,
-0.12153766304254532,
0.2218315750360489,
-0.08489307761192322,
-0.20641237497329712,
-0.1995900720357895,
0.11346897482872009,
-0.06604303419589996,
-0.012692281976342201,
0.03689255937933922,
-0.08880725502967834,
-0.04228338226675987,
-0.05125149339437485,
0.18093295395374298,
-0.06942970305681229,
-0.005007337778806686,
-0.027389848604798317,
0.07418784499168396,
0.017432672902941704,
-0.20197685062885284,
0.03504105657339096,
-0.022847434505820274,
-0.015959370881319046,
0.015083367936313152,
-0.10777828097343445,
0.09912275522947311,
0.16654406487941742,
-0.07876572012901306,
0.019675040617585182,
-0.005846160929650068,
0.19951412081718445,
-0.048347774893045425,
-0.058114588260650635,
0.1427946537733078,
-0.0166912954300642,
-0.011960971169173717,
0.012045997194945812,
-0.014096644707024097,
-0.1031331717967987,
0.06656214594841003,
-0.01690533757209778,
-0.0326850451529026,
-0.27135998010635376,
-0.021264169365167618,
-0.07594167441129684,
0.04560016468167305,
0.04242460057139397,
0.03920074179768562,
-0.0926763191819191,
0.03048892505466938,
0.052359290421009064,
0.13697932660579681,
-0.010513074696063995,
0.045924652367830276,
0.060787513852119446,
0.0014715471770614386,
0.01513107679784298,
-0.10226675122976303,
0.011094614863395691,
0.07518361508846283,
0.09438539296388626,
0.26538509130477905,
-0.10074041783809662,
0.18580591678619385,
0.03722909092903137,
0.042869728058576584,
0.04549876227974892,
0.13933056592941284,
-0.12027692049741745,
0.03157025948166847,
0.014645694755017757,
-0.007030973210930824,
-0.1160222664475441,
0.02140911854803562,
-0.03290186822414398,
0.08659633994102478,
-0.12121521681547165,
-0.048901066184043884,
0.006067054811865091,
0.1371094137430191,
0.051921818405389786,
-0.23245972394943237,
-0.1430121213197708,
0.011555714532732964,
-0.07575438171625137,
-0.09504635632038116,
0.06342975795269012,
0.2233044058084488,
-0.06876775622367859,
-0.023844890296459198,
-0.005619549658149481,
0.13376101851463318,
-0.028200650587677956,
-0.03020665794610977,
-0.03740479797124863,
0.056869231164455414,
0.013851633295416832,
0.12969201803207397,
-0.29685869812965393,
0.13719432055950165,
-0.009059712290763855,
0.06699760258197784,
-0.034276556223630905,
0.04213954508304596,
-0.028691081330180168,
0.07588712126016617,
0.04107305780053139,
-0.00870584324002266,
0.03823278844356537,
-0.17520160973072052,
0.003997563850134611,
0.03888499364256859,
0.023572247475385666,
0.06174008920788765,
0.0692291185259819,
-0.000511963211465627,
0.055586077272892,
-0.013992885127663612,
-0.13834315538406372,
-0.06799693405628204,
-0.06438442319631577,
-0.027673929929733276,
-0.029430292546749115,
-0.021409913897514343,
-0.03974486142396927,
-0.0192234106361866,
0.06794055551290512,
0.1942620724439621,
-0.09243215620517731,
-0.08174409717321396,
-0.07681751996278763,
0.061704184859991074,
0.0904538705945015,
-0.09284968674182892,
0.04037024453282356,
-0.0029752282425761223,
0.02518133446574211,
-0.009317039512097836,
-0.08119694143533707,
0.06639143079519272,
-0.04028889909386635,
-0.06561556458473206,
-0.0075471485033631325,
0.07292697578668594,
0.004850344266742468,
0.040929511189460754,
0.008461621589958668,
-0.09537320584058762,
-0.04163419455289841,
-0.11803345382213593,
-0.11470459401607513,
-0.052187345921993256,
0.003549699205905199,
0.05625728890299797,
-0.1464327573776245,
-0.06256072223186493,
-0.005905210040509701,
-0.03548998758196831,
0.14173626899719238,
0.16366395354270935,
-0.060025766491889954,
0.013998922891914845,
0.1247708722949028,
-0.055493030697107315,
-0.20693616569042206,
0.039798904210329056,
0.051315583288669586,
0.12674549221992493,
-0.052033036947250366,
-0.16191056370735168,
0.04758608713746071,
0.0013274479424580932,
0.03701958805322647,
0.07529159635305405,
-0.29218441247940063,
-0.13423489034175873,
0.09187664836645126,
0.16364607214927673,
0.13884030282497406,
-0.1316526234149933,
-0.03607324883341789,
-0.06246640905737877,
-0.11515425145626068,
0.0762774720788002,
-0.05251266807317734,
0.13497719168663025,
-0.06809398531913757,
0.02291274257004261,
0.03303566575050354,
-0.04282321035861969,
0.07268360257148743,
0.02188560552895069,
0.10296232998371124,
-0.03807807341217995,
0.015728430822491646,
0.1384744793176651,
-0.03273722529411316,
0.17253535985946655,
-0.14871878921985626,
0.0928877517580986,
-0.23205359280109406,
-0.059849195182323456,
-0.07397395372390747,
0.015353829599916935,
-0.03436286002397537,
-0.03789125755429268,
-0.08135304600000381,
0.02776585891842842,
-0.008838454261422157,
-0.008977404795587063,
0.017858419567346573,
-0.043931007385253906,
-0.019689826294779778,
0.08826664835214615,
0.12061933428049088,
-0.005031541455537081,
-0.07251352816820145,
0.06296496093273163,
0.044661153107881546,
0.11640679091215134,
-0.18782275915145874,
0.022391296923160553,
0.11678048223257065,
0.020372958853840828,
0.11239144951105118,
0.04460097476840019,
-0.10371769964694977,
0.04926580190658569,
0.09341058135032654,
-0.061935532838106155,
-0.06619229167699814,
-0.028887512162327766,
-0.1115439385175705,
-0.07238198071718216,
0.05328289046883583,
0.0995761975646019,
-0.04491525515913963,
-0.009753013029694557,
-0.026808220893144608,
-0.028737150132656097,
-0.12088701874017715,
0.19305358827114105,
0.07726682722568512,
0.0804680809378624,
-0.06222658231854439,
0.05367036908864975,
0.07079773396253586,
-0.08382610231637955,
0.01595047302544117,
0.16861708462238312,
-0.09835480898618698,
-0.045248936861753464,
0.0673011764883995,
0.2151004672050476,
-0.04596107453107834,
-0.0631813108921051,
-0.14015725255012512,
-0.07925689965486526,
0.02731267921626568,
0.17006199061870575,
0.11064007133245468,
0.07788173109292984,
-0.027223920449614525,
0.00596409710124135,
-0.10736013203859329,
0.08568964898586273,
0.06963680684566498,
0.04029490426182747,
-0.1077895313501358,
0.1313437670469284,
0.04775827005505562,
0.1161198616027832,
-0.03067963570356369,
-0.009242075495421886,
-0.1504083275794983,
0.07443811744451523,
-0.09602056443691254,
0.030223486945033073,
-0.004296524450182915,
0.05204836651682854,
-0.028373386710882187,
-0.004120446275919676,
-0.03468878194689751,
0.06366674602031708,
-0.08450164645910263,
0.004503341391682625,
0.011028957553207874,
0.0418100506067276,
-0.056246403604745865,
-0.01801135763525963,
0.023202287033200264,
-0.09372702986001968,
0.12632843852043152,
-0.02341543324291706,
-0.031094374135136604,
0.08593974262475967,
-0.05309731885790825,
0.03771091252565384,
0.023343287408351898,
0.0540865957736969,
0.011657023802399635,
0.015925034880638123,
0.0800933986902237,
0.0396331287920475,
0.05883911997079849,
0.03623043745756149,
0.12796227633953094,
-0.12566600739955902,
-0.07665083557367325,
-0.056948576122522354,
-0.10730139166116714,
-0.05659601092338562,
0.1006523072719574,
0.031676217913627625,
0.10247287154197693,
0.09975536912679672,
-0.03854237496852875,
0.009453488513827324,
-0.13182826340198517,
-0.06031137704849243,
0.027126789093017578,
-0.022248148918151855,
-0.09423209726810455,
-0.05745648220181465,
0.05065683275461197,
-0.02675943449139595,
0.12152769416570663,
0.008101014420390129,
0.04279070347547531,
-0.019117679446935654,
-0.04141770675778389,
0.0012348816962912679,
0.012258092872798443,
0.21948759257793427,
-0.07798060029745102,
0.05110081657767296,
0.00626275734975934,
0.019954124465584755,
0.01688639260828495,
0.11897285282611847,
0.14083422720432281,
0.1528463065624237,
-0.03968426212668419,
0.11213504523038864,
0.008347532711923122,
0.0009981077164411545,
-0.0831117033958435,
-0.0025877179577946663,
0.008668582886457443,
0.05426590144634247,
-0.040536489337682724,
0.18950578570365906,
0.09291171282529831,
-0.11363133043050766,
0.10104702413082123,
0.02105806954205036,
-0.1329137086868286,
-0.03857262432575226,
0.029156064614653587,
-0.03645152971148491,
-0.14765042066574097,
0.02977205254137516,
-0.11772111803293228,
-0.04182472079992294,
0.036523714661598206,
0.05016341805458069,
-0.07996851950883865,
0.19107772409915924,
0.011817886494100094,
-0.0539548322558403,
0.053585417568683624,
-0.006590516772121191,
0.0209769569337368,
0.028599603101611137,
0.032086946070194244,
0.029297299683094025,
-0.04070666432380676,
0.043426308780908585,
0.026095617562532425,
-0.04945363476872444,
0.0014106739545240998,
-0.006421365309506655,
-0.000009101416253542993,
-0.022016795352101326,
0.03143308311700821,
0.0646945983171463,
0.1693812906742096,
0.030112607404589653,
-0.06889133155345917,
-0.02422025054693222,
0.14995644986629486,
-0.033071763813495636,
-0.10096436738967896,
-0.12275253236293793,
0.15867382287979126,
0.038449469953775406,
0.006560538429766893,
0.015755847096443176,
-0.09075740724802017,
-0.043422047048807144,
0.22703473269939423,
0.07754544913768768,
-0.03876867517828941,
-0.01866275444626808,
0.006038240622729063,
0.0007552957395091653,
-0.03389504551887512,
0.2064768522977829,
0.026778027415275574,
0.23554065823554993,
0.01788056455552578,
-0.030519362539052963,
-0.07643108069896698,
-0.03754303604364395,
0.015112725086510181,
0.11892116814851761,
-0.025020353496074677,
-0.04109348729252815,
-0.08633428066968918,
0.007450758945196867,
-0.006666332017630339,
-0.07691844552755356,
0.10453885048627853,
-0.14311319589614868,
-0.0948033556342125,
-0.04670325666666031,
0.0394432507455349,
-0.05199122801423073,
0.022578982636332512,
-0.028536539524793625,
0.03653869405388832,
0.05910472199320793,
-0.03627839684486389,
-0.12090139091014862,
-0.1609574258327484,
0.08464391529560089,
-0.054078903049230576,
0.1328168660402298,
-0.0197641309350729,
0.16358332335948944,
0.09448640048503876,
0.038422759622335434,
-0.04763440415263176,
0.11644714325666428,
0.032040610909461975,
0.03420040011405945,
0.05756818503141403,
0.11154663562774658,
-0.05273595079779625,
0.13625174760818481,
-0.049823980778455734,
-0.02066202275454998,
-0.01330261304974556,
-0.06711895018815994,
-0.022025909274816513,
-0.16394168138504028,
-0.013330278918147087,
-0.10708855837583542,
0.0988871157169342,
0.1961042433977127,
-0.04048774763941765,
-0.032762352377176285,
-0.09060381352901459,
0.10285738110542297,
-0.0026292181573808193,
0.06062573194503784,
-0.03405969962477684,
-0.18622826039791107,
-0.00030110159423202276,
-0.0018299914663657546,
0.0043863398022949696,
-0.2866016924381256,
-0.008790088817477226,
-0.04864583536982536,
-0.023577004671096802,
-0.09378628432750702,
0.16103602945804596,
0.07520197331905365,
0.04092162847518921,
-0.040234945714473724,
-0.13260455429553986,
-0.039277125149965286,
0.06595174968242645,
-0.15885905921459198,
-0.1488313376903534
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/go/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
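For reference, a hedged sketch of an equivalent optimizer setup with the `transformers` PyTorch API; this is an illustration rather than the original TPU training code. With `relative_step=True`, Adafactor applies its own inverse-square-root-style schedule:
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask_finetune"
)
# relative_step=True / warmup_init=True let Adafactor manage its own
# inverse-square-root-style learning rate, so lr is left as None.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)
```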
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
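A rough sketch of how the reported fine-tuning schedule could be expressed with `transformers` `TrainingArguments` (illustrative only: the original run used a TPU Pod, and splitting the 256 batch into 32 x 8 gradient accumulation on a single device is an assumption):
```python
from transformers import TrainingArguments

# Mirrors the reported schedule: 2000 steps, effective batch size 256.
# Sequence length 512 would be enforced at tokenization time.
args = TrainingArguments(
    output_dir="codetrans_go_finetune",
    max_steps=2000,
    per_device_train_batch_size=32,
    gradient_accumulation_steps=8,  # 32 * 8 = 256 effective examples per step
)
```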
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.70 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_go_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the go function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
88,
107
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1258632242679596,
0.05267159640789032,
-0.0008433153270743787,
0.09481534361839294,
0.03763256594538689,
0.029692726209759712,
0.052412137389183044,
0.09670394659042358,
-0.019575342535972595,
0.06786224246025085,
0.050830643624067307,
-0.07841450721025467,
0.058543648570775986,
0.19150269031524658,
0.018528016284108162,
-0.1232183575630188,
-0.025341257452964783,
0.046050626784563065,
-0.07628627866506577,
0.107062928378582,
0.07034357637166977,
-0.08811192959547043,
0.0693044438958168,
-0.03471162170171738,
-0.13650061190128326,
0.032723523676395416,
-0.026224929839372635,
-0.01062187273055315,
0.10063770413398743,
0.07019755989313126,
0.12193035334348679,
-0.014668040908873081,
0.05883945897221565,
-0.18412046134471893,
0.004600078798830509,
0.018477991223335266,
0.06381873041391373,
0.04258674755692482,
0.044765654951334,
0.08520428836345673,
0.10970284789800644,
-0.007884913124144077,
0.03536755591630936,
0.05124645307660103,
-0.06428711861371994,
-0.04057648405432701,
-0.08204372972249985,
0.08260439336299896,
0.0767662525177002,
0.09964385628700256,
-0.0037714915815740824,
0.03899044916033745,
-0.07705364376306534,
0.08893048763275146,
0.11767547577619553,
-0.22806352376937866,
-0.023617830127477646,
0.10934585332870483,
0.0990336686372757,
0.03256478160619736,
-0.0756261870265007,
-0.03860890492796898,
0.10372262448072433,
0.04012662544846535,
0.03945799916982651,
-0.0845118910074234,
-0.018917357549071312,
-0.011477450840175152,
-0.05049584433436394,
-0.043210018426179886,
0.1723494678735733,
0.04079290106892586,
-0.060601502656936646,
-0.10022469609975815,
-0.0415055938065052,
-0.19779907166957855,
0.03950837254524231,
0.012577647343277931,
0.0002962102589663118,
-0.0150861581787467,
0.0024073293898254633,
-0.009165318682789803,
-0.09699750691652298,
-0.12033740431070328,
0.015843311324715614,
0.028784919530153275,
0.05453238636255264,
0.03504171967506409,
-0.0424860380589962,
0.08308117091655731,
0.025742657482624054,
-0.04195340722799301,
-0.019005853682756424,
0.020521804690361023,
-0.12390772253274918,
-0.0009570811525918543,
-0.02003701776266098,
-0.07577507942914963,
-0.004709569737315178,
0.0663929209113121,
-0.10966091603040695,
0.08094283938407898,
0.08950957655906677,
0.02367754653096199,
0.018977424129843712,
0.20610447227954865,
0.044628653675317764,
-0.1493232548236847,
0.026391321793198586,
0.021367201581597328,
0.004939729813486338,
0.004796852357685566,
-0.047608889639377594,
-0.04196959733963013,
0.027843013405799866,
0.06054327264428139,
-0.12221713364124298,
0.022956645116209984,
-0.06135163828730583,
-0.015511282719671726,
0.09112262725830078,
-0.12587344646453857,
0.03531094267964363,
0.02324536070227623,
-0.04685060679912567,
-0.04110868647694588,
0.0902235358953476,
-0.13144069910049438,
-0.1133827343583107,
0.03815245255827904,
-0.043070171028375626,
-0.03419181704521179,
-0.11604493111371994,
-0.10275347530841827,
0.0022671115584671497,
-0.009219382889568806,
-0.006458901334553957,
-0.0845469981431961,
-0.09078848361968994,
-0.02184193953871727,
0.0463927760720253,
0.00007812811236362904,
-0.03481360152363777,
-0.03146624192595482,
0.006215497385710478,
-0.007307909429073334,
-0.022627195343375206,
0.019794559106230736,
-0.024879299104213715,
0.09533592313528061,
0.07697290182113647,
0.03943051025271416,
-0.020631704479455948,
0.025369741022586823,
-0.07648969441652298,
0.0837218165397644,
-0.10928592830896378,
0.05346040800213814,
-0.005534496624022722,
0.053397275507450104,
-0.08765605092048645,
-0.06989714503288269,
-0.004773137625306845,
0.05611380562186241,
0.0789414718747139,
0.029776381328701973,
-0.12852777540683746,
0.02185518853366375,
0.1506490409374237,
-0.11901719868183136,
-0.1448238044977188,
0.10350799560546875,
-0.010906267911195755,
0.03698477894067764,
0.06136361137032509,
0.11402996629476547,
0.14620241522789001,
-0.08646445721387863,
-0.03221094235777855,
0.06438503414392471,
0.04682697728276253,
-0.08073607832193375,
0.05431324988603592,
0.026027290150523186,
-0.011290219612419605,
0.023148661479353905,
0.0594317689538002,
0.05621035769581795,
-0.004953390918672085,
-0.04066609963774681,
-0.03194260969758034,
-0.09541864693164825,
-0.06550242751836777,
-0.012726415880024433,
0.03006543032824993,
-0.054888445883989334,
-0.05202163755893707,
0.005514012183994055,
0.16637833416461945,
-0.09368697553873062,
0.03263961896300316,
-0.08293084055185318,
-0.037400584667921066,
-0.07961221784353256,
0.02658647857606411,
-0.13524112105369568,
0.02214248664677143,
0.057274412363767624,
-0.04522417485713959,
0.05297660082578659,
0.08219194412231445,
-0.0009562612976878881,
0.025366276502609253,
-0.06197325885295868,
-0.027839601039886475,
-0.03911607712507248,
-0.07332374900579453,
-0.1086781695485115,
-0.04377193748950958,
-0.0897403135895729,
-0.030121758580207825,
-0.03272540122270584,
-0.17646872997283936,
-0.0012569986283779144,
0.013666615821421146,
0.021689441055059433,
0.0156873632222414,
-0.04409606382250786,
0.022252412512898445,
0.05648021772503853,
-0.05147657170891762,
-0.06832875311374664,
0.02295098826289177,
0.05528343468904495,
-0.09899510443210602,
-0.03933372348546982,
-0.09046754986047745,
-0.07784371823072433,
0.08257222920656204,
0.10600333660840988,
-0.13237668573856354,
-0.009410936385393143,
-0.02873552218079567,
-0.05090387538075447,
-0.048961810767650604,
-0.05662000924348831,
0.16443035006523132,
0.01787339709699154,
0.15455469489097595,
-0.1352207362651825,
-0.06724650412797928,
-0.02643345668911934,
0.016724631190299988,
0.02905498445034027,
0.1421971619129181,
0.03436288610100746,
-0.11846080422401428,
0.03170941397547722,
-0.044249627739191055,
-0.0603691004216671,
0.16277937591075897,
-0.017302481457591057,
-0.06130930408835411,
-0.0060619330033659935,
0.12168823927640915,
0.004242944996803999,
0.20776331424713135,
-0.0699700340628624,
0.0020242289174348116,
-0.012598216533660889,
0.010754763148725033,
0.04908331483602524,
-0.13084910809993744,
0.029446657747030258,
0.03073233738541603,
-0.06183687224984169,
-0.0335262157022953,
-0.02826022170484066,
-0.036790888756513596,
0.04399176687002182,
0.01967141404747963,
0.04224805906414986,
-0.009135127067565918,
-0.03616606071591377,
-0.11405816674232483,
0.17981240153312683,
-0.06465431302785873,
-0.20874139666557312,
-0.16357268393039703,
0.09480661153793335,
-0.03955584019422531,
-0.01617981307208538,
0.030658256262540817,
-0.08006350696086884,
-0.048707976937294006,
-0.0914362445473671,
0.12192486226558685,
-0.1001761183142662,
0.003525781212374568,
-0.00022142712259665132,
0.059748098254203796,
0.062002621591091156,
-0.15925829112529755,
0.03206038475036621,
-0.025889785960316658,
0.021091334521770477,
-0.007391439285129309,
-0.05322455242276192,
0.08127862215042114,
0.11520782113075256,
-0.058261971920728683,
0.018865272402763367,
0.0023341539781540632,
0.16345690190792084,
-0.06254011392593384,
0.04565488174557686,
0.16862313449382782,
-0.0007879838231019676,
0.027779661118984222,
0.0520441010594368,
0.014987699687480927,
-0.09604576975107193,
0.0562441311776638,
0.04239898920059204,
-0.043529532849788666,
-0.21204672753810883,
-0.029778392985463142,
-0.08547943085432053,
0.05488777905702591,
0.10580933094024658,
0.051445282995700836,
-0.15104541182518005,
0.02591775916516781,
-0.00832541473209858,
0.160191610455513,
-0.02785501442849636,
0.055705949664115906,
0.0012855600798502564,
0.01593233086168766,
0.002005633432418108,
-0.1061990037560463,
0.00831871572881937,
0.07724668830633163,
0.10669069737195969,
0.20351818203926086,
-0.08586785942316055,
0.15759433805942535,
0.015043037943542004,
0.0984482690691948,
0.04557683691382408,
0.1087878867983818,
-0.1311432272195816,
0.007511183153837919,
0.009537671692669392,
-0.017056703567504883,
-0.059480566531419754,
0.047213874757289886,
-0.039271119982004166,
0.06864911317825317,
-0.06106865778565407,
0.0061726318672299385,
0.017048420384526253,
0.19502602517604828,
0.07427825033664703,
-0.161990225315094,
-0.14142082631587982,
0.00923637393862009,
-0.07321732491254807,
-0.10697705298662186,
0.061484143137931824,
0.21586699783802032,
-0.05305825173854828,
0.02791197970509529,
-0.01373195368796587,
0.1310252547264099,
-0.09544051438570023,
-0.01944963075220585,
0.03472970798611641,
0.05810336023569107,
0.007778710685670376,
0.11038707941770554,
-0.24066980183124542,
0.08589199930429459,
0.014010191895067692,
0.08460980653762817,
-0.03127435967326164,
0.05847522243857384,
-0.044107891619205475,
-0.006129154469817877,
0.073470838367939,
0.014185969717800617,
-0.04949823394417763,
-0.18310075998306274,
-0.04170528054237366,
0.020984452217817307,
0.05034476891160011,
0.0028119771741330624,
0.08875104784965515,
-0.007041803561151028,
0.04825736954808235,
-0.028079858049750328,
-0.11510825157165527,
-0.059119850397109985,
-0.13096417486667633,
-0.0255681611597538,
0.0076820384711027145,
-0.07506278902292252,
-0.025587068870663643,
0.03889644891023636,
0.040638528764247894,
0.2540448307991028,
-0.1577322632074356,
-0.062125008553266525,
-0.09508121013641357,
0.06270492076873779,
0.1295587569475174,
-0.08743306249380112,
0.012125657871365547,
0.013548347167670727,
0.06624579429626465,
-0.051644787192344666,
-0.06878960132598877,
0.03083035722374916,
-0.059698887169361115,
-0.09183172136545181,
-0.04093369096517563,
0.11489029973745346,
-0.008882480673491955,
0.04211292788386345,
0.003535793861374259,
-0.08351122587919235,
-0.02959204837679863,
-0.1346648633480072,
-0.07022924721240997,
-0.030164800584316254,
0.029737623408436775,
-0.01337111834436655,
-0.13702769577503204,
0.07368596643209457,
0.007906997576355934,
-0.0957985669374466,
0.07167059183120728,
0.18056832253932953,
-0.07232499122619629,
0.03458641096949577,
0.07622506469488144,
-0.05335216596722603,
-0.19617918133735657,
-0.03140422701835632,
0.04963603988289833,
0.08789247274398804,
-0.025539323687553406,
-0.14283430576324463,
0.07317661494016647,
-0.00028710553306154907,
0.010078562423586845,
0.022521421313285828,
-0.2315693199634552,
-0.12761253118515015,
0.005714651197195053,
0.07471546530723572,
0.05366038903594017,
-0.09952808171510696,
-0.050857435911893845,
-0.06523182988166809,
-0.04117949679493904,
0.06262659281492233,
0.06770120561122894,
0.10950260609388351,
-0.03045392967760563,
0.02663368172943592,
0.039001621305942535,
-0.03584303706884384,
0.06505254656076431,
-0.014126627705991268,
0.09600112587213516,
-0.016580138355493546,
0.003734394209459424,
0.06555015593767166,
-0.061336856335401535,
0.17896737158298492,
-0.15890845656394958,
0.09665320068597794,
-0.15809467434883118,
-0.03605024516582489,
-0.030104180797934532,
0.0020534174982458353,
-0.043745506554841995,
-0.03178303316235542,
-0.11828166246414185,
0.04069721698760986,
0.0485948771238327,
-0.03264697268605232,
0.039161887019872665,
-0.012034161016345024,
-0.05042067542672157,
0.06796707957983017,
0.07925336807966232,
-0.004026638809591532,
-0.11835384368896484,
0.0343439057469368,
0.015363937243819237,
0.09126738458871841,
-0.17091140151023865,
0.02944396808743477,
0.10372553020715714,
0.019652534276247025,
0.0883757695555687,
0.016980629414319992,
-0.09547006338834763,
0.022981567308306694,
0.07763569802045822,
-0.07474654912948608,
-0.06152312830090523,
-0.01529654860496521,
-0.026862116530537605,
-0.0914686843752861,
0.0455707423388958,
0.09048335999250412,
-0.04462691396474838,
-0.0022217517253011465,
-0.005651490762829781,
0.010969751514494419,
-0.08084623515605927,
0.16868238151073456,
0.005207060370594263,
0.08472899347543716,
-0.06239712983369827,
0.07060170918703079,
0.09744171053171158,
-0.09908303618431091,
0.02911144308745861,
0.1459353119134903,
-0.08425331115722656,
-0.0196896530687809,
0.08321525156497955,
0.13253706693649292,
-0.018867356702685356,
-0.05571238696575165,
-0.10050371289253235,
-0.08889231830835342,
0.019399922341108322,
0.06307581812143326,
0.06899888813495636,
0.09111341834068298,
-0.020874017849564552,
-0.0007177648949436843,
-0.1251470297574997,
0.09360389411449432,
0.07433392107486725,
0.04815712198615074,
-0.12764324247837067,
0.12728500366210938,
0.037649281322956085,
0.08263423293828964,
-0.0006545347278006375,
0.030109036713838577,
-0.12243998795747757,
0.034464724361896515,
-0.026337170973420143,
0.02966686338186264,
-0.009749261662364006,
0.041414398699998856,
-0.04001938924193382,
0.0368322990834713,
-0.03800026327371597,
0.04456619545817375,
-0.04144176468253136,
-0.022570917382836342,
-0.04370669275522232,
0.01922663114964962,
-0.05636431649327278,
-0.01502912025898695,
0.013313088566064835,
-0.09566673636436462,
0.09076616168022156,
-0.05442797392606735,
-0.007851531729102135,
-0.004073019605129957,
0.027866628021001816,
0.047620296478271484,
0.0050254520028829575,
0.05659732222557068,
-0.01221408974379301,
-0.01358070783317089,
0.019279155880212784,
0.032874319702386856,
-0.006356494966894388,
0.0012073135003447533,
0.09862454235553741,
-0.13321925699710846,
-0.08422495424747467,
-0.09124834835529327,
-0.07909856736660004,
-0.05799366533756256,
0.07247722148895264,
0.0890907347202301,
0.08399763703346252,
0.08128274977207184,
-0.033553171902894974,
0.005233678035438061,
-0.16765888035297394,
-0.03938912972807884,
0.05350740626454353,
-0.00039767942507751286,
-0.12167526036500931,
-0.039797332137823105,
0.06512412428855896,
-0.031137580052018166,
0.13214413821697235,
-0.036753423511981964,
0.03255723416805267,
-0.00940768513828516,
-0.05933674797415733,
-0.050511207431554794,
0.005767017137259245,
0.17473295331001282,
-0.1052529439330101,
0.0051237186416983604,
-0.004901978652924299,
0.007601122837513685,
0.019596794620156288,
0.15272915363311768,
0.12618091702461243,
0.11997253447771072,
0.03808155655860901,
0.08459131419658661,
-0.03913727030158043,
-0.03450740501284599,
-0.10077618062496185,
0.07027456909418106,
-0.04393460601568222,
0.03018343634903431,
-0.03219069540500641,
0.14315474033355713,
0.08063483983278275,
-0.14017286896705627,
0.10590015351772308,
-0.0012730596354231238,
-0.09679071605205536,
-0.029370279982686043,
-0.08327268809080124,
-0.043519362807273865,
-0.0928436890244484,
0.004815689288079739,
-0.10513710975646973,
-0.012940444983541965,
0.052428651601076126,
0.03030533343553543,
-0.026464330032467842,
0.17059241235256195,
-0.048996489495038986,
-0.04364310950040817,
0.026035765185952187,
0.048990555107593536,
0.02464725822210312,
0.09293410181999207,
0.024740010499954224,
0.06400210410356522,
-0.048956386744976044,
0.07474400103092194,
0.04037081077694893,
0.0024358462542295456,
0.024186886847019196,
0.043852467089891434,
-0.010920674540102482,
-0.04336719214916229,
-0.023219885304570198,
0.09202089160680771,
0.13202954828739166,
0.031780801713466644,
-0.0347302220761776,
-0.05509330704808235,
0.16169984638690948,
-0.056562405079603195,
-0.05931331589818001,
-0.12445519119501114,
0.1679198443889618,
0.031815290451049805,
0.000751937972381711,
0.014858387410640717,
-0.07807409018278122,
-0.02076624147593975,
0.2542401850223541,
0.06049337610602379,
-0.05948079749941826,
-0.023274218663573265,
0.0007880500052124262,
-0.009314223192632198,
-0.038707248866558075,
0.13608363270759583,
0.0045743160881102085,
0.25438109040260315,
0.018870074301958084,
-0.017753170803189278,
-0.04879312589764595,
-0.04294969141483307,
0.003331327112391591,
0.2031099647283554,
-0.03376505896449089,
0.03000479005277157,
-0.11218743771314621,
-0.013314913958311081,
0.027606215327978134,
-0.15684843063354492,
0.13324709236621857,
-0.14351525902748108,
-0.0800204649567604,
0.021692171692848206,
0.06945549696683884,
-0.05851779505610466,
0.042876407504081726,
-0.023449573665857315,
0.06885997951030731,
0.037463583052158356,
-0.03055761381983757,
-0.09586386382579803,
-0.13526783883571625,
0.04942606762051582,
-0.01267449464648962,
0.13441777229309082,
0.014611613936722279,
0.09443474560976028,
0.08536328375339508,
0.011989499442279339,
-0.07775171846151352,
0.08446397632360458,
0.025059999898076057,
-0.020678378641605377,
0.044991299510002136,
0.12476222962141037,
-0.05001666024327278,
0.15404173731803894,
0.012370944023132324,
-0.02627786435186863,
-0.025868581607937813,
-0.02718539535999298,
-0.011915713548660278,
-0.153049036860466,
0.0009854402160272002,
-0.0680563896894455,
0.14624209702014923,
0.1959424465894699,
-0.04438295215368271,
-0.01973707601428032,
-0.04703110456466675,
0.09480766952037811,
-0.011312801390886307,
0.08994722366333008,
0.0023820113856345415,
-0.18741394579410553,
0.02487756311893463,
-0.04164612665772438,
0.009109544567763805,
-0.2039228081703186,
-0.06651464104652405,
-0.028087005019187927,
-0.034541357308626175,
-0.09457504749298096,
0.1340385377407074,
0.0670241117477417,
0.032902851700782776,
-0.048162464052438736,
-0.11328588426113129,
-0.015019621700048447,
0.042690519243478775,
-0.11659780889749527,
-0.12524275481700897
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation go
Pretrained model on programming language go using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized go code functions: it works best with tokenized go functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the go function/method.
## Intended uses & limitations
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline around the pretrained CodeTrans checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_go_transfer_learning_finetune", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# A whitespace-tokenized go function, the input format the model handles best.
tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/go/small_model.ipynb).
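If you want more control over decoding than the pipeline offers, the same checkpoint can also be driven through `generate()` directly. The following is a minimal sketch rather than part of the original release; the decoding settings (`max_length`, `num_beams`) are illustrative assumptions.
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# Sketch only: load the checkpoint and decode with generate() instead of the pipeline.
checkpoint = "SEBIS/code_trans_t5_small_code_documentation_generation_go_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelWithLMHead.from_pretrained(checkpoint)

tokenized_code = "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"
inputs = tokenizer(tokenized_code, return_tensors="pt")

# max_length and num_beams are illustrative choices, not values from the paper.
output_ids = model.generate(inputs["input_ids"], max_length=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```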
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
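For readers who want to approximate this optimizer setup outside the original TPU code, the Transformers library ships an `Adafactor` implementation whose relative-step mode follows exactly this inverse square root schedule. The snippet below is a hedged sketch of that configuration, not the authors' training script; `t5-small` is used only as a stand-in model.
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

# Stand-in model; the actual pre-training used the CodeTrans T5 checkpoints on TPUs.
model = AutoModelWithLMHead.from_pretrained("t5-small")

# In relative-step mode Adafactor decays the learning rate as 1/sqrt(step),
# i.e. the inverse square root schedule described above.
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
    lr=None,  # must be None when relative_step=True
)
```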
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
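As a rough illustration of how corpus-level BLEU numbers like the ones above can be computed, the sketch below uses the `sacrebleu` package; the two strings are invented placeholders, and a real evaluation would iterate over the CodeSearchNet test split for the target language.
```python
import sacrebleu

# Invented placeholder data: one generated docstring and one reference docstring.
hypotheses = ["returns true if the snapshot should be aborted"]
references = [["needSnapshotAbort returns true when the pending snapshot is matched"]]

# corpus_bleu takes the hypotheses plus a list of reference streams.
bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU: {bleu.score:.2f}")
```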
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "func ( pr * Progress ) needSnapshotAbort ( ) bool { return pr . State == ProgressStateSnapshot && pr . Match >= pr . PendingSnapshot }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_go_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation go
====================================================
Pretrained model on programming language go using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized go code functions: it works best with tokenized go functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the go function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the go function or be fine-tuned on other go code tasks. It can be used on unparsed and untokenized go code. However, if the go code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate go function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing go code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
87,
107
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate go function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing go code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12602825462818146,
0.02247067354619503,
-0.0008078367100097239,
0.09323185682296753,
0.03922701254487038,
0.02328694425523281,
0.06274149566888809,
0.1003599613904953,
-0.013304390013217926,
0.06730496138334274,
0.03152818605303764,
-0.056799232959747314,
0.06142314523458481,
0.2021450251340866,
0.019979843869805336,
-0.12265942990779877,
-0.018138617277145386,
0.049293361604213715,
-0.09557870775461197,
0.10897516459226608,
0.07283641397953033,
-0.09364169836044312,
0.07308594137430191,
-0.03294193372130394,
-0.1331590712070465,
0.01899779960513115,
-0.023577148094773293,
-0.010496353730559349,
0.10343847423791885,
0.06922628730535507,
0.123275026679039,
-0.013250963762402534,
0.059770699590444565,
-0.18518118560314178,
0.005411727353930473,
0.020631195977330208,
0.06322886794805527,
0.047654639929533005,
0.042652491480112076,
0.08724533766508102,
0.09298167377710342,
-0.014311504550278187,
0.036665111780166626,
0.05506790801882744,
-0.06675957888364792,
-0.054469455033540726,
-0.08476142585277557,
0.08762399107217789,
0.08106871694326401,
0.10011497884988785,
-0.007966236211359501,
0.049392152577638626,
-0.08810305595397949,
0.08361855149269104,
0.12528929114341736,
-0.23598957061767578,
-0.021651018410921097,
0.11748350411653519,
0.10894204676151276,
0.03378383070230484,
-0.07616789638996124,
-0.03353762626647949,
0.10806240141391754,
0.029993819072842598,
0.0456361323595047,
-0.0737060159444809,
0.024160204455256462,
-0.008341200649738312,
-0.06105687469244003,
-0.039115242660045624,
0.16351580619812012,
0.0460897833108902,
-0.06331521272659302,
-0.10614211857318878,
-0.035976558923721313,
-0.18969954550266266,
0.039354078471660614,
-0.0002482036070432514,
-0.007145538926124573,
-0.012561467476189137,
-0.006572704762220383,
-0.0012052170932292938,
-0.08948598802089691,
-0.11968554556369781,
0.02565114013850689,
0.035331349819898605,
0.06060799956321716,
0.03005451150238514,
-0.05095479637384415,
0.08471263200044632,
0.05531499907374382,
-0.046815935522317886,
-0.02095775492489338,
0.014471151866018772,
-0.12724430859088898,
-0.006808990612626076,
-0.01766045019030571,
-0.06332346796989441,
-0.011741697788238525,
0.08281419426202774,
-0.08165128529071808,
0.07443226873874664,
0.08928076922893524,
0.02463844045996666,
0.011618751101195812,
0.2131710648536682,
0.04259037971496582,
-0.1427726298570633,
0.027909742668271065,
0.021930931136012077,
0.005679276771843433,
0.007156688719987869,
-0.05724874883890152,
-0.04537440463900566,
0.03265191242098808,
0.0640665590763092,
-0.12117008119821548,
0.03363046422600746,
-0.06847191601991653,
-0.028651056811213493,
0.09120927006006241,
-0.12664847075939178,
0.028572045266628265,
0.02233313024044037,
-0.054729562252759933,
-0.03747795894742012,
0.09850665926933289,
-0.13413123786449432,
-0.1079304963350296,
0.029765482991933823,
-0.04555483162403107,
-0.03700317069888115,
-0.11557698994874954,
-0.10769113898277283,
0.0015679788775742054,
-0.027000930160284042,
-0.0013813385739922523,
-0.09281481057405472,
-0.08060236275196075,
-0.016553392633795738,
0.04480705037713051,
0.0035887989215552807,
-0.035382118076086044,
-0.03830936178565025,
0.002082767430692911,
-0.010690970346331596,
-0.019272664561867714,
0.005600691772997379,
-0.025744007900357246,
0.0985196903347969,
0.0698699802160263,
0.04841814562678337,
-0.011735460720956326,
0.028433136641979218,
-0.0763254389166832,
0.08346639573574066,
-0.13620580732822418,
0.06899035722017288,
-0.009586517699062824,
0.04944237321615219,
-0.09409692883491516,
-0.06816241145133972,
0.004119703080505133,
0.050798241049051285,
0.08490053564310074,
0.035242944955825806,
-0.12797757983207703,
0.01578821986913681,
0.1499890685081482,
-0.1194315254688263,
-0.15014277398586273,
0.11257852613925934,
-0.015979254618287086,
0.0602373369038105,
0.06614837795495987,
0.11596912145614624,
0.1504794806241989,
-0.10249778628349304,
-0.043060097843408585,
0.05793721601366997,
0.04306913912296295,
-0.07220474630594254,
0.05158962309360504,
0.033134106546640396,
-0.01605214737355709,
0.011442665010690689,
0.05932598188519478,
0.0569753535091877,
-0.009426605887711048,
-0.04609546437859535,
-0.026368940249085426,
-0.0957392230629921,
-0.0491911843419075,
-0.014360959641635418,
0.027848869562149048,
-0.04509690776467323,
-0.06445973366498947,
0.012994534336030483,
0.16382576525211334,
-0.10107045620679855,
0.0343540720641613,
-0.09796871244907379,
-0.02286413684487343,
-0.07387245446443558,
0.027722705155611038,
-0.12873084843158722,
0.01605844683945179,
0.058210693299770355,
-0.04361530393362045,
0.05577386915683746,
0.08213993161916733,
-0.00009387218597112224,
0.02414979413151741,
-0.057074472308158875,
-0.032924260944128036,
-0.04702714458107948,
-0.07108666747808456,
-0.11428778618574142,
-0.04473186656832695,
-0.0993107408285141,
-0.030022289603948593,
-0.03700089454650879,
-0.17893588542938232,
0.0004884478985331953,
0.013966923579573631,
0.02346808835864067,
0.023057861253619194,
-0.0450916588306427,
0.01674182526767254,
0.048726920038461685,
-0.05118219181895256,
-0.07154972851276398,
0.023384174332022667,
0.04855217784643173,
-0.10229271650314331,
-0.02996978349983692,
-0.09966012090444565,
-0.0919489860534668,
0.09184654802083969,
0.09267957508563995,
-0.13043944537639618,
-0.016706759110093117,
-0.03062080219388008,
-0.05292641744017601,
-0.04659971222281456,
-0.06090152636170387,
0.158468559384346,
0.01845649443566799,
0.1446046084165573,
-0.13683755695819855,
-0.06719653308391571,
-0.0266671534627676,
0.02187609300017357,
0.0383298397064209,
0.1424214243888855,
0.02232687547802925,
-0.10084415972232819,
0.030359463766217232,
-0.02543458715081215,
-0.04690957069396973,
0.15266157686710358,
-0.023975616320967674,
-0.06202174723148346,
-0.005609581712633371,
0.11331485211849213,
0.013515611179172993,
0.21633830666542053,
-0.06585168838500977,
0.0023672939278185368,
-0.009238618426024914,
0.01222159806638956,
0.0430041141808033,
-0.1299872100353241,
0.02487914450466633,
0.03199280425906181,
-0.06732142716646194,
-0.03263194113969803,
-0.02449820563197136,
-0.04133233800530434,
0.03971134498715401,
0.022599229589104652,
0.03763328492641449,
-0.003053045365959406,
-0.03341430425643921,
-0.1116766631603241,
0.1753740757703781,
-0.0584266372025013,
-0.20134280622005463,
-0.166122704744339,
0.09381850808858871,
-0.018665114417672157,
-0.013873886317014694,
0.02879492938518524,
-0.0890830010175705,
-0.05330682545900345,
-0.08132565021514893,
0.1419665813446045,
-0.08830412477254868,
0.0039983103051781654,
-0.0011548680486157537,
0.05558900535106659,
0.057527631521224976,
-0.16399028897285461,
0.032962873578071594,
-0.026344187557697296,
0.014658893458545208,
-0.007397095672786236,
-0.0660620853304863,
0.07530634850263596,
0.12023687362670898,
-0.057572584599256516,
0.019159071147441864,
-0.004058901220560074,
0.16525638103485107,
-0.0721987634897232,
0.05270928144454956,
0.17010366916656494,
-0.008907299488782883,
0.02514268271625042,
0.05540488287806511,
0.006536025088280439,
-0.0997978001832962,
0.06282825022935867,
0.0437873937189579,
-0.04395993426442146,
-0.22271011769771576,
-0.03058326430618763,
-0.07748395204544067,
0.06084252521395683,
0.10181988775730133,
0.040323656052351,
-0.13948442041873932,
0.034869484603405,
-0.01026715338230133,
0.16137440502643585,
-0.017644532024860382,
0.05694935470819473,
0.005576668307185173,
0.023373926058411598,
0.008231217041611671,
-0.10134362429380417,
0.0119645856320858,
0.06793032586574554,
0.10001468658447266,
0.20967867970466614,
-0.08889990299940109,
0.16653332114219666,
0.01349049061536789,
0.12103903293609619,
0.04544040188193321,
0.1128535121679306,
-0.1192503273487091,
0.012087161652743816,
0.007241992745548487,
-0.021195754408836365,
-0.05803725868463516,
0.04694458842277527,
-0.04583987966179848,
0.07368256151676178,
-0.06904005259275436,
0.02544357068836689,
0.017177632078528404,
0.19600149989128113,
0.08861876279115677,
-0.1706719547510147,
-0.1501513570547104,
0.001877465401776135,
-0.06934472918510437,
-0.09314785897731781,
0.06100831553339958,
0.21031856536865234,
-0.05593034625053406,
0.020782679319381714,
-0.020538173615932465,
0.1311558037996292,
-0.09410030394792557,
-0.01765182428061962,
0.03809646517038345,
0.06261923164129257,
0.0054336427710950375,
0.10753002762794495,
-0.2588728964328766,
0.08337906002998352,
0.011906065046787262,
0.08546989411115646,
-0.02802795171737671,
0.0520881749689579,
-0.03813885524868965,
-0.006040253676474094,
0.07519086450338364,
0.014650694094598293,
-0.034111980348825455,
-0.19944609701633453,
-0.05646538734436035,
0.020968737080693245,
0.05385018512606621,
-0.009515267796814442,
0.08753761649131775,
-0.012775528244674206,
0.04725547879934311,
-0.020641649141907692,
-0.09905269742012024,
-0.06469525396823883,
-0.11611375212669373,
-0.03727417066693306,
-0.006926927249878645,
-0.0447569414973259,
-0.02535686455667019,
0.03161230310797691,
0.02716371789574623,
0.2492990493774414,
-0.14717452228069305,
-0.06010166183114052,
-0.09063628315925598,
0.05606710538268089,
0.1227051168680191,
-0.09236785769462585,
0.01599801518023014,
0.022250141948461533,
0.057160884141922,
-0.046764928847551346,
-0.07795339822769165,
0.0434202216565609,
-0.060298118740320206,
-0.0963032990694046,
-0.03701286390423775,
0.10552980750799179,
0.011306574568152428,
0.047583870589733124,
0.013912302441895008,
-0.08896961808204651,
-0.01881476677954197,
-0.12806877493858337,
-0.06921988725662231,
-0.026625461876392365,
0.034786757081747055,
-0.00510905496776104,
-0.13455262780189514,
0.06854721158742905,
-0.015842929482460022,
-0.08514486253261566,
0.05742337182164192,
0.1739269495010376,
-0.07418560981750488,
0.03225947543978691,
0.07585639506578445,
-0.05101311206817627,
-0.20209693908691406,
-0.025546658784151077,
0.047506626695394516,
0.08740311115980148,
-0.025258051231503487,
-0.13927695155143738,
0.08365081995725632,
-0.009281900711357594,
0.011360974982380867,
0.02307160571217537,
-0.22450223565101624,
-0.1350535899400711,
0.012428970076143742,
0.07069510966539383,
0.0562346875667572,
-0.09125290811061859,
-0.04793922230601311,
-0.05382516235113144,
-0.04769675061106682,
0.08076325058937073,
0.07590115815401077,
0.10501732677221298,
-0.02637636847794056,
0.02075509913265705,
0.04215829819440842,
-0.029723281040787697,
0.05632569640874863,
-0.001657958491705358,
0.09904375672340393,
-0.023380659520626068,
0.0038751200772821903,
0.06088284030556679,
-0.06453156471252441,
0.1705963909626007,
-0.1478930413722992,
0.09350361675024033,
-0.161324605345726,
-0.03375905379652977,
-0.0323854424059391,
-0.0013551925076171756,
-0.04176829382777214,
-0.039780739694833755,
-0.133564293384552,
0.049769118428230286,
0.04962728172540665,
-0.0268149022012949,
0.049594633281230927,
-0.0018088786164298654,
-0.04860733821988106,
0.056457627564668655,
0.08827078342437744,
-0.005115465726703405,
-0.11970265209674835,
0.03465704619884491,
0.01670869253575802,
0.10441615432500839,
-0.16196085512638092,
0.03326363116502762,
0.10553810745477676,
0.013604643754661083,
0.09128723293542862,
0.020160332322120667,
-0.1057293489575386,
0.028076471760869026,
0.06704490631818771,
-0.07015373557806015,
-0.0646195337176323,
-0.014429224655032158,
-0.035793956369161606,
-0.088369220495224,
0.05125768855214119,
0.09194165468215942,
-0.046598438173532486,
-0.0012658198829740286,
-0.003471229923889041,
0.009785743430256844,
-0.0831509381532669,
0.17461177706718445,
0.013541074469685555,
0.08629842102527618,
-0.05650612711906433,
0.07195209711790085,
0.09353651851415634,
-0.10718129575252533,
0.03411046415567398,
0.12733864784240723,
-0.09109503030776978,
-0.013178970664739609,
0.09549494087696075,
0.12304035574197769,
-0.022496836259961128,
-0.050340309739112854,
-0.09974684566259384,
-0.09031477570533752,
0.017762821167707443,
0.08019432425498962,
0.0766148790717125,
0.09571872651576996,
-0.016877856105566025,
0.007734066806733608,
-0.1314062625169754,
0.0947953462600708,
0.08069789409637451,
0.04917176812887192,
-0.1311362087726593,
0.13932685554027557,
0.03481883183121681,
0.08823459595441818,
-0.006589178461581469,
0.026774520054459572,
-0.13104648888111115,
0.03424001857638359,
-0.043851807713508606,
0.03818516805768013,
-0.014217753894627094,
0.03734641522169113,
-0.05123518779873848,
0.0347265899181366,
-0.025558631867170334,
0.04272734746336937,
-0.038876332342624664,
-0.02190845087170601,
-0.03278055414557457,
0.01475368719547987,
-0.05950241908431053,
-0.017810773104429245,
0.01217591017484665,
-0.09823254495859146,
0.09902217984199524,
-0.05248861759901047,
-0.005995297338813543,
-0.0073650735430419445,
0.03389545902609825,
0.04717965051531792,
0.004891697783023119,
0.05182299762964249,
-0.006462287623435259,
-0.02466832660138607,
0.015753723680973053,
0.020437022671103477,
-0.007700623944401741,
0.003103427356109023,
0.10515894740819931,
-0.1259811818599701,
-0.07622458040714264,
-0.09705714881420135,
-0.06368741393089294,
-0.0592118464410305,
0.07757172733545303,
0.07269946485757828,
0.07968713343143463,
0.08725965768098831,
-0.03400086238980293,
0.007944967597723007,
-0.17361778020858765,
-0.04288633540272713,
0.05171181261539459,
0.0026547228917479515,
-0.11406528204679489,
-0.043054644018411636,
0.06782873719930649,
-0.0329168327152729,
0.10812342911958694,
-0.03818506747484207,
0.028164463117718697,
-0.011027981527149677,
-0.05302151292562485,
-0.06388920545578003,
0.007663598749786615,
0.17635712027549744,
-0.10246029496192932,
0.014845028519630432,
-0.001836586045101285,
0.008537589572370052,
0.021156081929802895,
0.1509254425764084,
0.13504533469676971,
0.1278609037399292,
0.01814081147313118,
0.08659966289997101,
-0.03659572824835777,
-0.028791069984436035,
-0.11323139071464539,
0.06783187389373779,
-0.053529903292655945,
0.02334141917526722,
-0.0235840305685997,
0.14058957993984222,
0.06749080866575241,
-0.14424219727516174,
0.10716665536165237,
-0.0024682481307536364,
-0.09910627454519272,
-0.0350334495306015,
-0.09786686301231384,
-0.035483039915561676,
-0.10448393225669861,
0.002986723557114601,
-0.10041497647762299,
-0.022981561720371246,
0.05754963308572769,
0.032718632370233536,
-0.029560057446360588,
0.17115646600723267,
-0.0658218041062355,
-0.04108821973204613,
0.029559167101979256,
0.047620922327041626,
0.010600144043564796,
0.08783534914255142,
0.023519769310951233,
0.061495400965213776,
-0.0407094843685627,
0.07359340786933899,
0.03274475038051605,
0.0019125081598758698,
0.030606847256422043,
0.04653693363070488,
-0.01387808658182621,
-0.04095817729830742,
-0.01628568395972252,
0.09440949559211731,
0.1315026730298996,
0.03249165415763855,
-0.027664165943861008,
-0.05518865957856178,
0.16073571145534515,
-0.053826767951250076,
-0.05899668112397194,
-0.12602579593658447,
0.16979558765888214,
0.014464239589869976,
0.0012236754409968853,
0.015324529260396957,
-0.0774795189499855,
-0.01739071123301983,
0.26492831110954285,
0.05872516706585884,
-0.05752291902899742,
-0.023991351947188377,
-0.0004551312595140189,
-0.008328856900334358,
-0.0426379032433033,
0.1476021558046341,
0.009872682392597198,
0.24322815239429474,
0.020333681255578995,
-0.022525597363710403,
-0.04614757373929024,
-0.05194641277194023,
0.009072146378457546,
0.17812691628932953,
-0.03974657878279686,
0.023767411708831787,
-0.10646246373653412,
-0.008219905197620392,
0.012208732776343822,
-0.15732595324516296,
0.13457034528255463,
-0.1392490267753601,
-0.07506828755140305,
0.008311589248478413,
0.06204977259039879,
-0.054318420588970184,
0.04600461572408676,
-0.022991392761468887,
0.07065770030021667,
0.05060172080993652,
-0.027295494452118874,
-0.0903494656085968,
-0.12914150953292847,
0.05472087115049362,
-0.009793069213628769,
0.12744347751140594,
0.01298854872584343,
0.09656523168087006,
0.08304319530725479,
0.01967364177107811,
-0.07553552836179733,
0.06875330954790115,
0.027717936784029007,
-0.01205891277641058,
0.04084930196404457,
0.11756265163421631,
-0.05045302212238312,
0.15208309888839722,
0.018205445259809494,
-0.03581325709819794,
-0.024466918781399727,
-0.04055231809616089,
-0.015513729304075241,
-0.1525709629058838,
-0.0025615859776735306,
-0.06715855747461319,
0.14690789580345154,
0.19503551721572876,
-0.05254928395152092,
-0.018656771630048752,
-0.04660070687532425,
0.08834297955036163,
-0.006641619838774204,
0.07519306242465973,
-0.00011747280950658023,
-0.18345779180526733,
0.01679888181388378,
-0.049499813467264175,
0.008489539846777916,
-0.19497178494930267,
-0.06423943489789963,
-0.027842268347740173,
-0.0413382314145565,
-0.09835468977689743,
0.13937698304653168,
0.06880127638578415,
0.037222497165203094,
-0.049208786338567734,
-0.0978504866361618,
-0.019516676664352417,
0.04972448572516441,
-0.12466569244861603,
-0.12558583915233612
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus java dataset.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline

# Build a summarization pipeline around the pretrained CodeTrans checkpoint.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java", skip_special_tokens=True),
    device=0  # first GPU; use device=-1 to run on CPU
)

# A whitespace-tokenized java function, the input format the model handles best.
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/java/small_model.ipynb).
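Since the pipeline accepts a list, several tokenized java functions can be documented in one call. Continuing from the pipeline built above, here is a small sketch; the second snippet is an invented example, not taken from the dataset.
```python
# The second function is a made-up example for illustration.
batch = [
    "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }",
    "public static int max ( int a , int b ) { return a > b ? a : b ; }",
]
for result in pipeline(batch):
    print(result["summary_text"])
```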
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_java
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus java dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.0875348374247551,
0.013697436079382896,
-0.0003455282421782613,
0.06394645571708679,
0.12497641891241074,
-0.003056386485695839,
0.07015896588563919,
0.0622900165617466,
0.008274346590042114,
-0.0482400581240654,
0.08769276738166809,
0.1497582495212555,
0.02762714959681034,
0.1312154233455658,
-0.03489409759640694,
-0.20416918396949768,
-0.0008053651545196772,
0.06260603666305542,
-0.1364232748746872,
0.12527789175510406,
0.13377933204174042,
-0.05561942979693413,
0.1017858013510704,
-0.009229477494955063,
-0.22977487742900848,
0.06711380928754807,
-0.024753574281930923,
-0.08570939302444458,
0.1276760995388031,
0.08113759756088257,
0.10691884160041809,
0.037618692964315414,
0.002452101558446884,
-0.22514183819293976,
0.0342220775783062,
-0.03248739242553711,
0.015348796732723713,
0.05422268807888031,
0.04324764385819435,
-0.03354261815547943,
0.19429150223731995,
-0.003324622754007578,
0.008941693231463432,
0.05411025509238243,
-0.10770773887634277,
-0.09157682210206985,
-0.017219066619873047,
-0.012689750641584396,
0.09100747853517532,
0.07052944600582123,
0.021522730588912964,
0.12098428606987,
-0.13321104645729065,
0.13232064247131348,
0.09553149342536926,
-0.15953649580478668,
-0.022312473505735397,
0.12433511018753052,
0.09789133816957474,
-0.04756445810198784,
-0.058550458401441574,
0.006124221254140139,
0.07550051063299179,
0.024089157581329346,
0.044558025896549225,
-0.14467084407806396,
-0.21428675949573517,
0.07047666609287262,
-0.05522088706493378,
-0.054558660835027695,
0.28957539796829224,
-0.001740014529787004,
-0.03579780086874962,
-0.04674946889281273,
-0.02384253218770027,
0.037143558263778687,
0.001583861536346376,
-0.01219947636127472,
0.013403519056737423,
-0.006495804525911808,
0.0001787974761100486,
-0.020048044621944427,
-0.1046537458896637,
-0.12809570133686066,
0.012318591587245464,
0.0632692351937294,
-0.008053308352828026,
0.028492100536823273,
-0.1689462959766388,
0.09569334238767624,
0.08020750433206558,
-0.09261907637119293,
0.02069508098065853,
-0.06649632006883621,
-0.01483930740505457,
-0.015727058053016663,
-0.04498734325170517,
-0.16703221201896667,
0.090079665184021,
0.0329529233276844,
-0.06404729187488556,
0.0523817278444767,
0.008363490924239159,
0.07629864662885666,
0.05736444145441055,
0.17173059284687042,
-0.005652987863868475,
-0.07473986595869064,
0.048203229904174805,
-0.026784855872392654,
-0.06084217131137848,
0.012999911792576313,
-0.07063441723585129,
-0.03990170732140541,
0.013609337620437145,
0.12503035366535187,
-0.10864166170358658,
0.07088617980480194,
-0.06570059806108475,
-0.034918107092380524,
0.013724357821047306,
-0.13763315975666046,
-0.028480611741542816,
0.003789276583120227,
-0.06344790011644363,
-0.04812151566147804,
0.1105416938662529,
-0.056825656443834305,
-0.11266995966434479,
-0.03793490305542946,
-0.07759636640548706,
-0.002286880975589156,
-0.10701865702867508,
-0.07657937705516815,
0.017111066728830338,
0.04218428209424019,
0.06876492500305176,
-0.11328937113285065,
-0.18311932682991028,
-0.00378196919336915,
0.0859009325504303,
-0.009058515541255474,
0.043713636696338654,
-0.09632625430822372,
-0.02627558447420597,
-0.03624427691102028,
-0.023713121190667152,
0.06400034576654434,
-0.06666994839906693,
0.07964087277650833,
0.08765355497598648,
0.05573558062314987,
-0.06110719218850136,
0.05496655032038689,
-0.1368434876203537,
0.06887786090373993,
-0.17572492361068726,
0.09287060797214508,
-0.04748747497797012,
0.12308106571435928,
-0.10658205300569534,
-0.056166231632232666,
0.04451017454266548,
0.06461703032255173,
0.051738351583480835,
0.1248263269662857,
-0.14438219368457794,
-0.03378137946128845,
0.1368870586156845,
-0.10836703330278397,
-0.21830464899539948,
0.06135636940598488,
-0.07434312254190445,
0.21410374343395233,
0.048365335911512375,
0.19819733500480652,
0.14343023300170898,
-0.02868485078215599,
0.07107708603143692,
0.09276288002729416,
-0.04263358190655708,
-0.08118170499801636,
0.061137605458498,
0.06848236173391342,
-0.13355247676372528,
0.06377530097961426,
-0.028956551104784012,
0.10540744662284851,
-0.03231040760874748,
-0.04164525493979454,
-0.010478834621608257,
-0.06200810521841049,
0.016047755256295204,
-0.004955985117703676,
0.08650654554367065,
-0.006605913396924734,
0.01244751363992691,
0.0616019144654274,
-0.15014556050300598
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
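As a crude illustration of the whitespace-separated form the examples below use (the exact CodeTrans preprocessing pipeline is not documented in this card, so treat this as an assumption), one could pad punctuation with spaces:
```python
import re

def crude_java_tokenize(code: str) -> str:
    # Pad every non-word, non-space character with spaces, then collapse
    # whitespace, yielding the space-separated style of the example inputs.
    return " ".join(re.sub(r"([^\w\s])", r" \1 ", code).split())

print(crude_java_tokenize("public static int one(){return 1;}"))
# public static int one ( ) { return 1 ; }
```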
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/java/small_model.ipynb).
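Note that `AutoModelWithLMHead` is deprecated in recent `transformers` releases; the following is a minimal sketch of the same generation step using the maintained `AutoModelForSeq2SeqLM` class (the `max_length` value is an illustrative assumption, not a documented setting):
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
inputs = tokenizer(tokenized_code, return_tensors="pt")
summary_ids = model.generate(**inputs, max_length=64)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```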
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 400,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
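For intuition, a minimal sketch of such a schedule is shown below; the warmup length and peak rate are illustrative assumptions, since the card does not state them:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, peak_lr: float = 0.01) -> float:
    """Linear warmup to peak_lr, then decay proportional to 1/sqrt(step)."""
    step = max(step, 1)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps       # linear warmup
    return peak_lr * (warmup_steps / step) ** 0.5  # inverse square root decay
```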
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score); a hedged scoring sketch follows the table below:
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
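One hedged way to reproduce a BLEU-style score for generated documentation is shown below with `sacrebleu`; this is an assumption, since the card does not name its exact BLEU implementation, and the strings are purely hypothetical:
```python
import sacrebleu

# Hypothetical prediction/reference pair, for illustration only.
predictions = ["casts the given class to a function ."]
references = [["casts the target class into a function ."]]
print(sacrebleu.corpus_bleu(predictions, references).score)
```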
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 400,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
144
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 400,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/java/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
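For scale, 4000 steps at batch size 256 works out to roughly 4000 × 256 ≈ 1.02M examples processed during fine-tuning (assuming the batch size counts sequences).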
## Evaluation results
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_java_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, the different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.00877821072936058,
-0.031078293919563293,
0.14202584326267242,
-0.0001494184834882617,
0.21914690732955933,
0.012736269272863865,
0.0042257229797542095,
-0.04048293083906174,
-0.045993998646736145,
-0.017290471121668816,
0.21927350759506226,
-0.03722753748297691,
0.023875033482909203,
-0.09568469226360321,
-0.018376823514699936,
0.029283661395311356,
-0.13083180785179138,
0.12258531898260117,
-0.134768545627594,
-0.07703875005245209,
0.017353294417262077,
0.07347135245800018,
-0.04538654908537865,
0.03890419006347656,
-0.0158577561378479,
0.06102273613214493,
0.06324028968811035,
-0.02765088714659214,
-0.09477822482585907,
-0.14878258109092712,
0.04225916042923927,
-0.018264111131429672,
0.13611085712909698,
0.01938829943537712,
0.06995417177677155,
0.08016937226057053,
0.004252010956406593,
-0.08687181770801544,
0.09837999194860458,
0.032358333468437195,
-0.008119483478367329,
0.049843642860651016,
0.13126033544540405,
-0.03920656442642212,
0.15398100018501282,
0.011497285217046738,
-0.021400706842541695,
-0.02534077689051628,
-0.037546202540397644,
-0.0068327621556818485,
-0.14845409989356995,
-0.0029854595195502043,
-0.05594494193792343,
0.13620147109031677,
0.19699299335479736,
-0.04399648681282997,
-0.02452371083199978,
-0.05283648520708084,
0.08943644911050797,
-0.016433918848633766,
0.08958917111158371,
0.009293772280216217,
-0.1628875881433487,
0.0223713256418705,
-0.005622542928904295,
0.01657838746905327,
-0.18861231207847595,
-0.05570080876350403,
-0.027013041079044342,
-0.028421973809599876,
-0.09329912811517715,
0.1427670270204544,
0.06224353238940239,
0.03389676287770271,
-0.03800959885120392,
-0.1559898555278778,
-0.003680647350847721,
0.044662002474069595,
-0.12245145440101624,
-0.12267591059207916
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation java
Pretrained model on programming language java using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized java code functions: it works best with tokenized java functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the java function/method.
## Intended uses & limitations
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_java_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/java/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
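For illustration, the inverse square root schedule mentioned above can be written as a function of the training step. The warmup length and base rate below are assumptions for the sketch, not hyperparameters reported for this model:
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000, base_lr: float = 0.01) -> float:
    # Constant rate during warmup, then decay proportional to 1/sqrt(step),
    # the schedule commonly paired with AdaFactor in T5-style training.
    # warmup_steps and base_lr are illustrative assumptions.
    return base_lr * math.sqrt(warmup_steps) / math.sqrt(max(step, warmup_steps))
```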
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
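The table reports corpus-level BLEU. The exact evaluation script is not included in this card; a minimal sketch of smoothed corpus BLEU over tokenized documentation strings, using invented reference/hypothesis pairs, might look like:
```python
from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# Invented example pair; the real evaluation uses the CodeSearchNet test split.
references = [[["returns", "the", "cast", "function", "for", "the", "target", "class"]]]
hypotheses = [["returns", "a", "cast", "function", "for", "the", "given", "class"]]
score = corpus_bleu(references, hypotheses, smoothing_function=SmoothingFunction().method2)
print(f"BLEU: {100 * score:.2f}")
```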
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static < T , U > Function < T , U > castFunction ( Class < U > target ) { return new CastToClass < T , U > ( target ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_java_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation java
======================================================
Pretrained model on programming language java using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized java code functions: it works best with tokenized java functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the java function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the java function or be fine-tuned on other java code tasks. It can be used on unparsed and untokenized java code. However, if the java code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate java function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing java code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate java function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing java code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
… (768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus javascript dataset.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/javascript/small_model.ipynb).
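The tokenizer used to pre-split the training functions is not shipped with this card. As a rough approximation only, a regex split like the one below reproduces the space-separated style of `tokenized_code` above:
```python
import re

def rough_tokenize_js(source: str) -> str:
    # Split identifiers, numbers, single-quoted strings, and punctuation,
    # then re-join with spaces. An illustrative approximation, not the
    # preprocessing actually used to build the CodeTrans training data.
    tokens = re.findall(r"[A-Za-z_$][A-Za-z0-9_$]*|\d+|'[^']*'|\S", source)
    return " ".join(tokens)

print(rough_tokenize_js("function add(a, b) { return a + b; }"))
# function add ( a , b ) { return a + b ; }
```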
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_javascript
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus javascript dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
… (768-dimensional embedding vector omitted) …
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/javascript/small_model.ipynb).
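The snippet above runs on the first GPU (`device=0`). Following the standard Transformers pipeline convention, passing `device=-1` should select the CPU instead:

```python
# CPU variant of the same pipeline; reuses the imports from the snippet above.
# device=-1 is the Transformers convention for running a pipeline without a GPU.
pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask", skip_special_tokens=True),
    device=-1
)
```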
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
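The inverse square root schedule holds the learning rate at a constant peak during warm-up and then decays it in proportion to 1/√step. A minimal sketch of that schedule (the warm-up length and peak rate below are illustrative assumptions, not the values used to train CodeTrans):

```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10000, peak_lr: float = 0.01) -> float:
    """Constant learning rate during warm-up, then 1/sqrt(step) decay."""
    # Illustrative defaults only; not the CodeTrans training values.
    if step < warmup_steps:
        return peak_lr
    return peak_lr * (warmup_steps / step) ** 0.5
```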
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
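BLEU measures n-gram overlap between the generated documentation and a reference docstring. A minimal sketch of how such a score can be computed with the third-party `sacrebleu` package (the prediction and reference strings are made-up examples, not taken from the test set):

```python
import sacrebleu  # pip install sacrebleu

# Hypothetical model output and reference, for illustration only.
predictions = ["returns true if the code runs in a standard browser environment"]
references = [["return true if running in a standard browser environment"]]

score = sacrebleu.corpus_bleu(predictions, references)
print(f"BLEU: {score.score:.2f}")
```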
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14530567824840546,
-0.035613398998975754,
-0.0004570320015773177,
0.1272222101688385,
0.12550851702690125,
0.027078358456492424,
0.051693394780159,
0.06341953575611115,
-0.025936199352145195,
0.02549123764038086,
0.055332183837890625,
-0.0005543120787478983,
0.03064402937889099,
0.19599118828773499,
0.013862576335668564,
-0.0897655338048935,
-0.04016843065619469,
0.04519400745630264,
-0.0397581122815609,
0.13639365136623383,
0.07542537897825241,
-0.06751739233732224,
0.057481538504362106,
-0.05913752317428589,
-0.2495286613702774,
0.06274472177028656,
0.017852799966931343,
-0.06014922261238098,
0.08956455439329147,
0.04899289086461067,
0.12283135950565338,
-0.0030401358380913734,
0.015075789764523506,
-0.14639030396938324,
0.013287074863910675,
0.009883793070912361,
0.027229605242609978,
0.013387637212872505,
0.038319181650877,
0.018719609826803207,
0.14652879536151886,
-0.0010568026918917894,
0.030010618269443512,
0.07670417428016663,
-0.07710422575473785,
-0.10232501477003098,
-0.0016017904272302985,
0.0003788963076658547,
0.04963293671607971,
0.11383342742919922,
-0.02067379094660282,
0.12020724266767502,
-0.1538153439760208,
0.13028636574745178,
0.09850657731294632,
-0.22581355273723602,
-0.01827659271657467,
0.09892284870147705,
0.07881256192922592,
0.06955193728208542,
-0.053147461265325546,
-0.05388499051332474,
0.10799101740121841,
0.04412497207522392,
0.03657733276486397,
-0.09753330051898956,
-0.08262401074171066,
0.01040539238601923,
-0.07049258053302765,
-0.07174155116081238,
0.2271026074886322,
0.02173944003880024,
-0.08067379146814346,
-0.04834990203380585,
-0.02905166894197464,
-0.12071377784013748,
0.027502011507749557,
0.04513193666934967,
-0.0012235002359375358,
-0.03396220877766609,
-0.03049558214843273,
0.028729699552059174,
-0.07312126457691193,
-0.14742818474769592,
0.023906908929347992,
0.10202697664499283,
0.044673189520835876,
0.020375099033117294,
-0.1113119125366211,
0.10644996166229248,
0.05127515271306038,
-0.0691772848367691,
-0.026313459500670433,
-0.013118745759129524,
-0.09375032037496567,
0.02206760086119175,
-0.04863274469971657,
-0.1656855195760727,
0.030004195868968964,
0.03669053316116333,
-0.030896693468093872,
0.05559186637401581,
0.02619258500635624,
0.03663695231080055,
0.007055159658193588,
0.22698965668678284,
0.08350590616464615,
-0.14440220594406128,
0.0631265714764595,
0.06301206350326538,
-0.0233891773968935,
-0.0016406257636845112,
-0.07885733991861343,
-0.09526701271533966,
0.0996643528342247,
0.10888035595417023,
-0.11720512807369232,
0.0372101254761219,
-0.07218720763921738,
-0.02678856998682022,
0.0140251899138093,
-0.15592491626739502,
0.006304364185780287,
0.04282607138156891,
-0.08037827908992767,
-0.045383866876363754,
0.09930368512868881,
-0.1759442836046219,
-0.14460350573062897,
-0.06495559215545654,
-0.07211874425411224,
-0.02564813569188118,
-0.16500891745090485,
-0.15316130220890045,
-0.007666103541851044,
-0.02352501079440117,
0.028026437386870384,
-0.11150646209716797,
-0.13043586909770966,
-0.03700239583849907,
0.011166507378220558,
0.02355172298848629,
0.004300989210605621,
-0.10290103405714035,
-0.019739001989364624,
-0.01758410409092903,
-0.03758706524968147,
0.0007682309951633215,
-0.044351134449243546,
0.12901057302951813,
0.12025216966867447,
0.04543406516313553,
-0.01322450302541256,
0.05620332434773445,
-0.09604772925376892,
0.06345890462398529,
-0.10699857771396637,
0.10688953101634979,
-0.03124929778277874,
0.08200088143348694,
-0.037057358771562576,
-0.1072901263833046,
0.054855894297361374,
0.058420632034540176,
0.07778402417898178,
0.06728092581033707,
-0.11709646880626678,
-0.050671495497226715,
0.18766306340694427,
-0.10300132632255554,
-0.1517144739627838,
0.10305880755186081,
-0.03370354324579239,
0.07227303832769394,
0.10121416300535202,
0.1404462307691574,
0.15347714722156525,
-0.04018520563840866,
-0.00019766301556956023,
0.05705132707953453,
0.03823559731245041,
-0.12882882356643677,
0.079201340675354,
0.046264391392469406,
-0.09082339704036713,
0.05618179216980934,
-0.01169400941580534,
0.13432472944259644,
-0.02087443880736828,
-0.0223560594022274,
-0.04868432134389877,
-0.08449707925319672,
0.02529558166861534,
0.024940289556980133,
0.060102082788944244,
-0.08264432847499847,
-0.08385512977838516,
0.08232059329748154,
0.1606212854385376,
-0.1338653713464737,
-0.00175026198849082,
-0.0934349074959755,
0.05840373784303665,
-0.08966560661792755,
0.029170837253332138,
-0.16492989659309387,
0.005283742677420378,
0.06661513447761536,
-0.013831284828484058,
0.06755290180444717,
0.10075931996107101,
0.026804232969880104,
0.03322979435324669,
0.002492505358532071,
-0.03344830870628357,
-0.12653890252113342,
-0.06801458448171616,
-0.05667875334620476,
-0.06269459426403046,
-0.0909029170870781,
-0.06113181635737419,
0.005327442195266485,
-0.18843615055084229,
0.013486649841070175,
-0.006173593457788229,
-0.013455187901854515,
0.026049254462122917,
-0.010806298814713955,
0.0248425155878067,
0.07229582220315933,
-0.06437189877033234,
-0.038621172308921814,
0.03798395395278931,
0.020812353119254112,
-0.058205705136060715,
-0.08440892398357391,
-0.09295742958784103,
-0.0014207415515556931,
0.12160952389240265,
0.020855290815234184,
-0.08175943791866302,
0.02097984589636326,
-0.01836097612977028,
-0.03345099836587906,
0.016523364931344986,
-0.07656331360340118,
0.1649351269006729,
-0.009572203271090984,
0.20141534507274628,
-0.15762650966644287,
-0.036711882799863815,
-0.027309106662869453,
0.021128984168171883,
0.059949714690446854,
0.13578611612319946,
-0.0060744197107851505,
-0.0770905613899231,
0.05375064164400101,
0.03833543136715889,
-0.09337269514799118,
0.2495276778936386,
-0.053241197019815445,
-0.08755112439393997,
0.03526591882109642,
0.10503751039505005,
-0.017506591975688934,
0.15564152598381042,
-0.22324426472187042,
-0.04529310762882233,
-0.00001924019306898117,
-0.00867299735546112,
0.06361686438322067,
-0.1244402751326561,
0.008777311071753502,
0.01104346290230751,
-0.07160010188817978,
-0.09345090389251709,
-0.0028168493881821632,
-0.008630386553704739,
0.041374266147613525,
-0.0041554393246769905,
-0.041606683284044266,
0.01944568008184433,
-0.038710206747055054,
-0.12561878561973572,
0.22092275321483612,
-0.0827297642827034,
-0.1906881481409073,
-0.1848941445350647,
0.12515412271022797,
-0.067591093480587,
-0.016632402315735817,
0.03324608504772186,
-0.09209240972995758,
-0.029098067432641983,
-0.056162603199481964,
0.18375560641288757,
-0.06817331910133362,
-0.007036933675408363,
-0.017212552949786186,
0.06893236935138702,
0.010010319761931896,
-0.2053356170654297,
0.02712295390665531,
-0.0173823069781065,
-0.03272930532693863,
-0.005618907045572996,
-0.11433747410774231,
0.11350534111261368,
0.17586474120616913,
-0.07136096060276031,
0.025600343942642212,
-0.007103194948285818,
0.19694046676158905,
-0.049375269562006,
-0.051922839134931564,
0.15091705322265625,
-0.005380598362535238,
-0.01059628650546074,
0.006446606945246458,
-0.018145134672522545,
-0.08986465632915497,
0.07011710107326508,
-0.017884690314531326,
-0.033157434314489365,
-0.2642355263233185,
-0.016303816810250282,
-0.07864967733621597,
0.04306606575846672,
0.04215999320149422,
0.027182355523109436,
-0.10886704921722412,
0.02751619927585125,
0.046039268374443054,
0.13501806557178497,
-0.0006256564520299435,
0.051018986850976944,
0.04737253114581108,
0.015783829614520073,
0.015577933751046658,
-0.09941333532333374,
-0.0013031016569584608,
0.057496577501297,
0.08126001805067062,
0.26896876096725464,
-0.09803911298513412,
0.18498685956001282,
0.043630558997392654,
0.04351380839943886,
0.034175898879766464,
0.12834502756595612,
-0.10648095607757568,
0.032940927892923355,
0.016560252755880356,
-0.007042802404612303,
-0.1175801232457161,
0.00789775513112545,
-0.03921760618686676,
0.0951504185795784,
-0.13610216975212097,
-0.03511330112814903,
0.0231708362698555,
0.129868283867836,
0.0625951737165451,
-0.2413288950920105,
-0.1324990689754486,
0.015203563496470451,
-0.08709537237882614,
-0.09534015506505966,
0.07118451595306396,
0.21707028150558472,
-0.07521439343690872,
-0.02376442588865757,
-0.005151789169758558,
0.12947620451450348,
-0.019450809806585312,
-0.024161219596862793,
-0.02531745471060276,
0.061325330287218094,
0.014301596209406853,
0.127879336476326,
-0.2966037094593048,
0.12724781036376953,
-0.002441132441163063,
0.08361659944057465,
-0.030205918475985527,
0.0446341373026371,
-0.018788620829582214,
0.06957337260246277,
0.04144911468029022,
-0.014403987675905228,
0.04988352954387665,
-0.17121967673301697,
0.02183566801249981,
0.0479268953204155,
0.033000968396663666,
0.05192546173930168,
0.07768536359071732,
-0.0014997043181210756,
0.04737558215856552,
-0.01792249269783497,
-0.1342271864414215,
-0.09555745869874954,
-0.061130501329898834,
-0.03954942151904106,
-0.03941327705979347,
-0.015853147953748703,
-0.03507818654179573,
-0.0044605927541852,
0.07057908922433853,
0.17602254450321198,
-0.08814854919910431,
-0.08224130421876907,
-0.061755020171403885,
0.06527354568243027,
0.0856417715549469,
-0.10002700984477997,
0.037472259253263474,
-0.0074630663730204105,
0.02638072706758976,
-0.0067626070231199265,
-0.09250591695308685,
0.05609418451786041,
-0.04347869008779526,
-0.05908413603901863,
-0.0070469072088599205,
0.08556442707777023,
0.0012935104314237833,
0.035303790122270584,
0.020720867440104485,
-0.10079822689294815,
-0.03948749974370003,
-0.12024351954460144,
-0.10707423090934753,
-0.0567922480404377,
0.001991381635889411,
0.05367683991789818,
-0.1280960589647293,
-0.07412201911211014,
-0.01714245416224003,
-0.008422685787081718,
0.13696563243865967,
0.15599341690540314,
-0.059636190533638,
-0.006030064541846514,
0.13016530871391296,
-0.054277267307043076,
-0.19437114894390106,
0.048583488911390305,
0.04590099677443504,
0.11780886352062225,
-0.05302150547504425,
-0.1692478358745575,
0.044399015605449677,
-0.01374632678925991,
0.03842766955494881,
0.06694445013999939,
-0.32723695039749146,
-0.1217203289270401,
0.08825869858264923,
0.14535726606845856,
0.13842833042144775,
-0.12600459158420563,
-0.03443807363510132,
-0.06201036274433136,
-0.1256076842546463,
0.07621194422245026,
-0.0849493220448494,
0.12562114000320435,
-0.06800079345703125,
0.022103682160377502,
0.03696497157216072,
-0.044029369950294495,
0.061557870358228683,
0.044627025723457336,
0.11629089713096619,
-0.03013443388044834,
-0.0002175399276893586,
0.1391075849533081,
-0.04155660420656204,
0.1760212630033493,
-0.14385780692100525,
0.09518475085496902,
-0.22384048998355865,
-0.053390856832265854,
-0.08745290338993073,
0.02176339365541935,
-0.02957659400999546,
-0.032756540924310684,
-0.08232270926237106,
0.015625402331352234,
-0.009528277441859245,
0.007666089106351137,
0.029334362596273422,
-0.037585943937301636,
-0.02171146497130394,
0.09083855152130127,
0.1361697018146515,
-0.012966468930244446,
-0.06406769901514053,
0.06602292507886887,
0.045082803815603256,
0.11045464873313904,
-0.19657815992832184,
0.02757825329899788,
0.11197324842214584,
0.027473004534840584,
0.11641345918178558,
0.04845928028225899,
-0.10505440086126328,
0.06318353861570358,
0.08819784969091415,
-0.05926830694079399,
-0.06932730972766876,
-0.041224800050258636,
-0.1218452900648117,
-0.08855341374874115,
0.050576481968164444,
0.09733131527900696,
-0.029587948694825172,
-0.014642936177551746,
-0.03443039208650589,
-0.03033284842967987,
-0.11496054381132126,
0.17673920094966888,
0.08043809980154037,
0.07683045417070389,
-0.0623161680996418,
0.0526406429708004,
0.06267258524894714,
-0.07792731374502182,
0.007300940807908773,
0.16602496802806854,
-0.10020086914300919,
-0.04742501676082611,
0.059602927416563034,
0.23678188025951385,
-0.03382965922355652,
-0.05389794334769249,
-0.14119358360767365,
-0.07424557209014893,
0.01916862651705742,
0.16102366149425507,
0.10291394591331482,
0.08159182965755463,
-0.019855206832289696,
0.006257002241909504,
-0.11352968961000443,
0.09290332347154617,
0.07780050486326218,
0.030140485614538193,
-0.09712281078100204,
0.1435365527868271,
0.04629950597882271,
0.12635380029678345,
-0.026967955753207207,
-0.014625700190663338,
-0.13816212117671967,
0.0766296535730362,
-0.09257732331752777,
0.029034648090600967,
-0.014693235047161579,
0.05042191967368126,
-0.03205909952521324,
0.0027157871518284082,
-0.044959355145692825,
0.062007639557123184,
-0.08307959139347076,
0.0002328080590814352,
0.016905527561903,
0.05366785079240799,
-0.06097758933901787,
-0.012019923888146877,
0.021886218339204788,
-0.09800488501787186,
0.12430314719676971,
-0.02533533424139023,
-0.02280883863568306,
0.09723026305437088,
-0.07018716633319855,
0.022882945835590363,
0.01676305942237377,
0.04691029340028763,
0.0070592970587313175,
0.03397303819656372,
0.09070039540529251,
0.040764231234788895,
0.0607609748840332,
0.024043511599302292,
0.10887559503316879,
-0.126724973320961,
-0.06901109218597412,
-0.048210568726062775,
-0.10629300773143768,
-0.061773620545864105,
0.11076335608959198,
0.030061984434723854,
0.0988808199763298,
0.10742193460464478,
-0.0398448146879673,
0.019265010952949524,
-0.1525191366672516,
-0.06426214426755905,
0.027177421376109123,
-0.018053287640213966,
-0.09651652723550797,
-0.05967593193054199,
0.05297134816646576,
-0.03332986310124397,
0.10608278959989548,
0.02306545339524746,
0.06226738542318344,
-0.025316061452031136,
-0.03722851723432541,
0.008972332812845707,
0.01195534598082304,
0.1890369951725006,
-0.08166927099227905,
0.041337769478559494,
0.0038366299122571945,
0.01856742426753044,
0.03183498606085777,
0.11977236717939377,
0.15066416561603546,
0.1488094925880432,
-0.034725844860076904,
0.12015064805746078,
-0.0011228244984522462,
-0.003940022550523281,
-0.08308729529380798,
-0.0010732031660154462,
0.020873665809631348,
0.057612936943769455,
-0.04126825928688049,
0.1744508445262909,
0.10848082602024078,
-0.10621076077222824,
0.0893140584230423,
0.02807452157139778,
-0.1283622533082962,
-0.04829810932278633,
0.027715090662240982,
-0.039021000266075134,
-0.14946044981479645,
0.037893250584602356,
-0.11458184570074081,
-0.03458631783723831,
0.035772763192653656,
0.03739234432578087,
-0.0826941654086113,
0.1908542811870575,
0.04131782799959183,
-0.07110848277807236,
0.06887752562761307,
-0.016587432473897934,
0.022944970056414604,
0.04001804068684578,
0.034098073840141296,
0.035221848636865616,
-0.05773380398750305,
0.04631572961807251,
0.02878732793033123,
-0.04983971267938614,
-0.002548539312556386,
-0.026113728061318398,
-0.01391975674778223,
-0.024860519915819168,
0.03942330554127693,
0.06362061947584152,
0.1630433350801468,
0.03821516036987305,
-0.06928732991218567,
-0.021342817693948746,
0.1494193822145462,
-0.02449030987918377,
-0.08626606315374374,
-0.12450941652059555,
0.13051627576351166,
0.04889664053916931,
0.005762631073594093,
0.01112616341561079,
-0.09501335769891739,
-0.03473978862166405,
0.21337930858135223,
0.06568717956542969,
-0.02531595155596733,
-0.01904333010315895,
-0.01045999862253666,
-0.0015912104863673449,
-0.029021747410297394,
0.20918162167072296,
0.018178075551986694,
0.2172914445400238,
0.01884905807673931,
-0.029044758528470993,
-0.07278949022293091,
-0.03933876007795334,
0.02520136907696724,
0.12571246922016144,
-0.03043302707374096,
-0.03484662249684334,
-0.08332708477973938,
0.013280291110277176,
-0.01047145202755928,
-0.0571744404733181,
0.09882315248250961,
-0.13474461436271667,
-0.08016694337129593,
-0.03851545229554176,
0.03755230829119682,
-0.03865724429488182,
0.039356280118227005,
-0.03419769927859306,
0.02985965646803379,
0.06516730785369873,
-0.03818252310156822,
-0.12164963781833649,
-0.16029195487499237,
0.08141393959522247,
-0.06745739281177521,
0.1383875012397766,
-0.01852468028664589,
0.16096031665802002,
0.09437981247901917,
0.037078607827425,
-0.05107122287154198,
0.1124982163310051,
0.037553757429122925,
0.07372014224529266,
0.06675729900598526,
0.10861514508724213,
-0.04420626536011696,
0.14582812786102295,
-0.043591391295194626,
-0.010440222918987274,
-0.01000269502401352,
-0.08101607114076614,
-0.027328096330165863,
-0.18563568592071533,
-0.01814214326441288,
-0.0961008295416832,
0.0932774543762207,
0.1831558495759964,
-0.047083500772714615,
-0.026777073740959167,
-0.08094215393066406,
0.10030403733253479,
0.0019559634383767843,
0.07917946577072144,
-0.04800504818558693,
-0.1680319458246231,
-0.0005028598243370652,
0.006002616602927446,
0.0006486248457804322,
-0.2723923921585083,
0.0020421065855771303,
-0.052183374762535095,
-0.02792983129620552,
-0.09449432045221329,
0.16379965841770172,
0.07232332229614258,
0.04107009619474411,
-0.03787769377231598,
-0.1380825638771057,
-0.02921026386320591,
0.07634817808866501,
-0.16148626804351807,
-0.14667561650276184
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
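The example inputs in this card separate identifiers and punctuation with spaces. If you want to feed raw code in a similar form, a rough pre-tokenizer in that spirit might look like the sketch below (a heuristic approximation, not the preprocessing actually used to build the CodeTrans training data):

```python
import re

def rough_tokenize_js(code: str) -> str:
    # Put spaces around common punctuation so the input resembles the
    # space-separated style of this card's example input, then collapse
    # runs of whitespace. This is only a heuristic approximation.
    spaced = re.sub(r"([(){}\[\];,.])", r" \1 ", code)
    return re.sub(r"\s+", " ", spaced).strip()

print(rough_tokenize_js("function add(a,b){return a+b;}"))
# function add ( a , b ) { return a+b ; }
```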
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/javascript/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 32,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
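Transformers ships an Adafactor implementation, so a comparable optimizer setup can be sketched as follows (the model name and flag choices here are illustrative assumptions rather than the exact CodeTrans training configuration):

```python
from transformers import Adafactor, AutoModelWithLMHead

model = AutoModelWithLMHead.from_pretrained("t5-small")

# With lr=None and relative_step=True, Adafactor derives its own
# time-dependent step size; warmup_init ramps it up from a small value.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True
)
```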
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_javascript_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 32,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 32,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 32,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 32,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1149074137210846,
0.054762475192546844,
-0.0014456305652856827,
0.10912257432937622,
0.050929296761751175,
0.02249201573431492,
0.041471607983112335,
0.10511977225542068,
0.003251769347116351,
0.06794537603855133,
0.05375247076153755,
-0.07037774473428726,
0.051231399178504944,
0.1815849095582962,
0.019680580124258995,
-0.1239471435546875,
-0.05496350675821304,
0.027465123683214188,
-0.05707119032740593,
0.11115209758281708,
0.0636720284819603,
-0.083124540746212,
0.07237006723880768,
-0.043129127472639084,
-0.13019728660583496,
0.035874851047992706,
-0.012067955918610096,
-0.012645927257835865,
0.09459875524044037,
0.06014354154467583,
0.10983547568321228,
-0.017055710777640343,
0.057063452899456024,
-0.19242997467517853,
0.006041952408850193,
0.02705131098628044,
0.06620026379823685,
0.036142051219940186,
0.04743322730064392,
0.075080506503582,
0.11492852121591568,
-0.025348259136080742,
0.021676957607269287,
0.06859204173088074,
-0.06047382950782776,
-0.03168365731835365,
-0.07597098499536514,
0.04624857008457184,
0.09293106943368912,
0.09876976162195206,
-0.006975578609853983,
0.037452347576618195,
-0.0883355662226677,
0.08660279959440231,
0.10906888544559479,
-0.2292066365480423,
-0.0240799430757761,
0.08409653604030609,
0.08944665640592575,
0.02129119634628296,
-0.07908684760332108,
-0.03155830129981041,
0.10637117922306061,
0.048236992210149765,
0.04951582849025726,
-0.10150644928216934,
-0.005469940137118101,
-0.012898692861199379,
-0.04591255635023117,
-0.04891378432512283,
0.15404970943927765,
0.06755319237709045,
-0.06288737058639526,
-0.10575667023658752,
-0.04707779735326767,
-0.14847885072231293,
0.03408268839120865,
0.031502898782491684,
-0.0055160862393677235,
-0.014762635342776775,
-0.01273211371153593,
-0.011739492416381836,
-0.09042800962924957,
-0.11433527618646622,
0.03534473478794098,
0.01703364960849285,
0.052469413727521896,
0.03542264923453331,
-0.0406859815120697,
0.08743835985660553,
0.03915427252650261,
-0.05919121950864792,
-0.014995146542787552,
0.006702841259539127,
-0.10350017994642258,
0.005881158635020256,
0.007825994864106178,
-0.04877876862883568,
0.011441757902503014,
0.053148526698350906,
-0.08888831734657288,
0.07999715954065323,
0.08274975419044495,
0.017968639731407166,
0.00844134297221899,
0.20630314946174622,
0.07550074905157089,
-0.1674097180366516,
0.026435237377882004,
0.053793083876371384,
0.0036679317709058523,
0.011822734959423542,
-0.055697448551654816,
-0.04812803864479065,
0.007644948083907366,
0.07394275814294815,
-0.1123650074005127,
0.013795626349747181,
-0.05983010679483414,
-0.008057820610702038,
0.10365685820579529,
-0.12300010770559311,
0.037470344454050064,
0.027252687141299248,
-0.06726690381765366,
-0.03407752513885498,
0.08389579504728317,
-0.13731743395328522,
-0.11687543243169785,
-0.002665903652086854,
-0.04475715383887291,
-0.02222139574587345,
-0.11869115382432938,
-0.10466450452804565,
0.005671155173331499,
-0.02461104467511177,
0.003273606998845935,
-0.10506792366504669,
-0.07766605913639069,
-0.037269677966833115,
0.032579436898231506,
0.0038886803667992353,
-0.021330425515770912,
-0.0584428496658802,
0.0021057354751974344,
0.0020317984744906425,
-0.032422419637441635,
0.014555300585925579,
-0.03190306946635246,
0.10520526021718979,
0.09279227256774902,
0.03737899661064148,
0.014132306911051273,
0.023135187104344368,
-0.08886005729436874,
0.08939152956008911,
-0.10018313676118851,
0.07252871990203857,
0.00654988270252943,
0.06223422661423683,
-0.09536636620759964,
-0.07972128689289093,
0.0021473250817507505,
0.05413970723748207,
0.07021380215883255,
0.06310155242681503,
-0.11625275760889053,
0.011224828660488129,
0.16285952925682068,
-0.08533214777708054,
-0.14776864647865295,
0.11265506595373154,
-0.0162971131503582,
0.026746034622192383,
0.07655923813581467,
0.12397294491529465,
0.16478203237056732,
-0.09866967052221298,
-0.05088597163558006,
0.08590516448020935,
0.0536300390958786,
-0.07917597144842148,
0.05711381137371063,
0.011559167876839638,
-0.00921677891165018,
0.015346487984061241,
0.06669281423091888,
0.08252448588609695,
-0.011717152781784534,
-0.02544151432812214,
-0.03208751231431961,
-0.09150127321481705,
-0.03535918891429901,
0.0004974615294486284,
0.005731969140470028,
-0.06397125124931335,
-0.07239179313182831,
-0.0029737381264567375,
0.16564245522022247,
-0.09764456748962402,
0.01846575364470482,
-0.08083246648311615,
-0.04326406121253967,
-0.07998982071876526,
0.024433627724647522,
-0.12379488348960876,
0.0033237431198358536,
0.056737590581178665,
-0.01776040717959404,
0.054839637130498886,
0.08204025030136108,
0.011838562786579132,
0.025499524548649788,
-0.046148236840963364,
-0.04822517931461334,
-0.048727042973041534,
-0.08256686478853226,
-0.10010123252868652,
-0.03406621888279915,
-0.10305903106927872,
-0.0394810289144516,
-0.04714181646704674,
-0.16607441008090973,
0.0002818978682626039,
-0.019341347739100456,
0.010073177516460419,
0.026998644694685936,
-0.031109515577554703,
0.04003405570983887,
0.05043242499232292,
-0.04861271381378174,
-0.0814305767416954,
0.02608843706548214,
0.03572798892855644,
-0.08054585009813309,
-0.05100943148136139,
-0.09438344836235046,
-0.07747574895620346,
0.08116813004016876,
0.09225159883499146,
-0.10514239221811295,
-0.015668269246816635,
-0.028470003977417946,
-0.03922759369015694,
-0.04566477611660957,
-0.06407996267080307,
0.17020757496356964,
0.021590879186987877,
0.16293084621429443,
-0.1380545049905777,
-0.05716320127248764,
-0.023128559812903404,
0.008567647077143192,
0.03508584946393967,
0.14683102071285248,
0.03360675647854805,
-0.10429630428552628,
0.02275000512599945,
-0.013466516509652138,
-0.032619889825582504,
0.17395246028900146,
-0.02502536028623581,
-0.066902756690979,
0.002831239951774478,
0.120852030813694,
-0.008991495706140995,
0.1718032956123352,
-0.07821747660636902,
-0.016161605715751648,
-0.018467195332050323,
0.0075500779785215855,
0.034317534416913986,
-0.11946644634008408,
0.022218933328986168,
0.029958711937069893,
-0.06792928278446198,
-0.039840083569288254,
-0.020156538113951683,
-0.02938978374004364,
0.04110387712717056,
0.02340279519557953,
0.01156529225409031,
-0.009501803666353226,
-0.04007640480995178,
-0.11223047971725464,
0.17467528581619263,
-0.05423882603645325,
-0.18870489299297333,
-0.1575237512588501,
0.10422874987125397,
-0.02765481173992157,
-0.01624453440308571,
0.02554914727807045,
-0.09840719401836395,
-0.03576924279332161,
-0.09227269887924194,
0.13762922585010529,
-0.09200901538133621,
0.0001674498344073072,
-0.015045713633298874,
0.0652751624584198,
0.05241268500685692,
-0.16505849361419678,
0.020592620596289635,
-0.015296868979930878,
0.007130371872335672,
-0.03337692469358444,
-0.05832228064537048,
0.09106125682592392,
0.12193188816308975,
-0.05782892927527428,
0.022861063480377197,
-0.002299945568665862,
0.16202494502067566,
-0.05845964699983597,
0.05541997030377388,
0.2080165147781372,
0.030293051153421402,
0.02618483453989029,
0.034984227269887924,
0.010119695216417313,
-0.08399243652820587,
0.07073555141687393,
0.0520702488720417,
-0.04100538790225983,
-0.2152518928050995,
-0.019444046542048454,
-0.0784442201256752,
0.06424460560083389,
0.11217445135116577,
0.03693637624382973,
-0.17599089443683624,
0.02117849327623844,
-0.014039762318134308,
0.1566472351551056,
-0.031853239983320236,
0.05096627399325371,
0.00383358565159142,
0.02157287858426571,
-0.005098299589008093,
-0.10630233585834503,
-0.0022332947701215744,
0.0673893466591835,
0.09864947944879532,
0.1993848830461502,
-0.094395212829113,
0.16200394928455353,
0.019660647958517075,
0.09783754497766495,
0.021105244755744934,
0.09178891777992249,
-0.11020484566688538,
0.010590272955596447,
0.01152087189257145,
-0.018093004822731018,
-0.0791814997792244,
0.04204507917165756,
-0.022053971886634827,
0.07692855596542358,
-0.08655279874801636,
0.03791932016611099,
0.025454340502619743,
0.1876179724931717,
0.10000400245189667,
-0.17409895360469818,
-0.14011816680431366,
0.015057383105158806,
-0.09174428880214691,
-0.11008873581886292,
0.08073121309280396,
0.23200805485248566,
-0.05539553984999657,
0.017057841643691063,
-0.01804063469171524,
0.12986131012439728,
-0.09723248332738876,
-0.01349939126521349,
0.050164174288511276,
0.06035282090306282,
0.011607143096625805,
0.11115877330303192,
-0.24588239192962646,
0.07125522941350937,
0.025471026077866554,
0.10230766981840134,
-0.02010834962129593,
0.060938019305467606,
-0.02672315575182438,
-0.01405110489577055,
0.0801737830042839,
0.0014616772532463074,
-0.03265948221087456,
-0.19543397426605225,
-0.04235208034515381,
0.029153240844607353,
0.05586232617497444,
-0.009539701044559479,
0.10326273739337921,
-0.012898397631943226,
0.030512874945998192,
-0.03452093526721001,
-0.14406761527061462,
-0.08203191310167313,
-0.12143324315547943,
-0.05790375545620918,
-0.007976451888680458,
-0.0756528377532959,
-0.01641710288822651,
0.04694531485438347,
0.052661895751953125,
0.23059408366680145,
-0.14435707032680511,
-0.08021368086338043,
-0.07539494335651398,
0.06742916256189346,
0.12309969216585159,
-0.09565402567386627,
0.008803948760032654,
0.013575627468526363,
0.037123292684555054,
-0.04132504388689995,
-0.07642454653978348,
0.030214494094252586,
-0.05682654306292534,
-0.0770886018872261,
-0.03918500244617462,
0.1347479373216629,
-0.014043227769434452,
0.049130961298942566,
0.012038457207381725,
-0.09693939238786697,
-0.024452045559883118,
-0.13390874862670898,
-0.05261562764644623,
-0.04777279868721962,
0.050376199185848236,
-0.01613822765648365,
-0.10739000141620636,
0.08317896723747253,
-0.007210094016045332,
-0.06024521216750145,
0.06583740562200546,
0.16041241586208344,
-0.06912930309772491,
-0.0004430726985447109,
0.08055128902196884,
-0.046268727630376816,
-0.17309868335723877,
-0.022002991288900375,
0.039240654557943344,
0.0860694870352745,
-0.03411728888750076,
-0.15224651992321014,
0.05303552374243736,
-0.022057386115193367,
0.0179408248513937,
0.0491776242852211,
-0.27673420310020447,
-0.11303459852933884,
-0.008063080720603466,
0.06120686233043671,
0.052357058972120285,
-0.09805294126272202,
-0.04626235365867615,
-0.053183663636446,
-0.04642535373568535,
0.057292211800813675,
0.039498064666986465,
0.10865869373083115,
-0.035997044295072556,
0.03554050996899605,
0.04636646807193756,
-0.03444237634539604,
0.0480354018509388,
-0.002456405432894826,
0.11170011013746262,
-0.013102645985782146,
-0.004849743563681841,
0.04850255325436592,
-0.07643847167491913,
0.19008706510066986,
-0.1729581207036972,
0.10129355639219284,
-0.17869651317596436,
-0.04257328435778618,
-0.047212857753038406,
0.004389197565615177,
-0.03776382654905319,
-0.03287962079048157,
-0.10860450565814972,
0.02889648638665676,
0.03196047618985176,
-0.012211713008582592,
0.042578715831041336,
-0.016598200425505638,
-0.05083274096250534,
0.07881265133619308,
0.10042542219161987,
0.001985102891921997,
-0.12450156360864639,
0.033655256032943726,
0.022474536672234535,
0.08936043828725815,
-0.2147589474916458,
0.02556850016117096,
0.10640853643417358,
0.021696146577596664,
0.08879902958869934,
0.010166398249566555,
-0.09072596579790115,
0.0494234673678875,
0.06621236354112625,
-0.06697788834571838,
-0.08268377184867859,
-0.030411604791879654,
-0.061310410499572754,
-0.11086820065975189,
0.0377245768904686,
0.09541009366512299,
-0.034524787217378616,
-0.003946742042899132,
-0.01764543354511261,
0.002846264047548175,
-0.07888016849756241,
0.16698826849460602,
0.02415592595934868,
0.08188533782958984,
-0.051679499447345734,
0.06118985265493393,
0.07713227719068527,
-0.10743660479784012,
0.02603900618851185,
0.1619916409254074,
-0.07853343337774277,
-0.030442869290709496,
0.07053125649690628,
0.1324780434370041,
-0.001745477318763733,
-0.044770725071430206,
-0.1082332655787468,
-0.07643096148967743,
0.018620772287249565,
0.04859054088592529,
0.059564258903265,
0.09076990932226181,
-0.020412152633070946,
-0.002717511495575309,
-0.13225997984409332,
0.09761600196361542,
0.07396449148654938,
0.030514463782310486,
-0.13094763457775116,
0.14037413895130157,
0.03504907712340355,
0.10170061141252518,
0.0018306123092770576,
0.028590504080057144,
-0.09841470420360565,
0.04004485532641411,
-0.04409216716885567,
0.037884220480918884,
-0.015416986308991909,
0.0465208999812603,
-0.052886322140693665,
0.050557296723127365,
-0.03594289347529411,
0.04548163339495659,
-0.0397467240691185,
-0.028249062597751617,
-0.02664285898208618,
0.04133237525820732,
-0.06264197826385498,
-0.01465003751218319,
0.00018405875016469508,
-0.09294778108596802,
0.09532701224088669,
-0.06352549046278,
-0.006586070172488689,
0.005136657506227493,
0.030475130304694176,
0.03933144733309746,
0.011880338191986084,
0.04791209101676941,
-0.010826930403709412,
0.016667896881699562,
0.04328269511461258,
0.017042452469468117,
0.0027739391662180424,
-0.010077089071273804,
0.0783141627907753,
-0.1352618932723999,
-0.06860731542110443,
-0.07337550073862076,
-0.0796164870262146,
-0.06529536098241806,
0.08762361109256744,
0.08444780856370926,
0.06818407773971558,
0.09646032750606537,
-0.03918037936091423,
0.020471828058362007,
-0.19045451283454895,
-0.042346831411123276,
0.05124188959598541,
0.0005473043420352042,
-0.11981012672185898,
-0.04817768931388855,
0.07046173512935638,
-0.03580481931567192,
0.08544984459877014,
-0.007361187599599361,
0.07092739641666412,
-0.016862863674759865,
-0.0653395801782608,
-0.03391185775399208,
0.002107591601088643,
0.12984740734100342,
-0.11726660281419754,
-0.001113612437620759,
0.00118131167255342,
0.004583468660712242,
0.04307841509580612,
0.15588438510894775,
0.13015352189540863,
0.12646737694740295,
0.05659922957420349,
0.09625314921140671,
-0.04501453787088394,
-0.03428291156888008,
-0.13710612058639526,
0.07048282027244568,
-0.02107100374996662,
0.040680307894945145,
-0.04217653349041939,
0.11474237591028214,
0.09024572372436523,
-0.12704111635684967,
0.09094824641942978,
0.0014736599987372756,
-0.09153563529253006,
-0.04205341637134552,
-0.07563275843858719,
-0.04115762934088707,
-0.09551557898521423,
0.016605976969003677,
-0.08996542543172836,
-0.005369547288864851,
0.06165867671370506,
0.022936347872018814,
-0.03969956561923027,
0.1631307750940323,
-0.015680337324738503,
-0.05691065266728401,
0.05455435439944267,
0.03797641396522522,
0.029483193531632423,
0.10431206971406937,
0.023188656195998192,
0.06584305316209793,
-0.07978609949350357,
0.06894970685243607,
0.036701053380966187,
-0.01050157006829977,
0.02054096944630146,
0.018274592235684395,
-0.020743610337376595,
-0.05133967474102974,
0.005043209530413151,
0.09510951489210129,
0.16283629834651947,
0.04571380838751793,
-0.043566156178712845,
-0.05088585242629051,
0.1624697595834732,
-0.054802048951387405,
-0.03488750755786896,
-0.12196913361549377,
0.1521802693605423,
0.0309029258787632,
0.004077569581568241,
0.0051721627824008465,
-0.07667697221040726,
-0.017129583284258842,
0.22354662418365479,
0.051160722970962524,
-0.0528029203414917,
-0.028868593275547028,
-0.01922268234193325,
-0.00914763007313013,
-0.043001580983400345,
0.1599176973104477,
0.0017047618748620152,
0.22525162994861603,
0.015914222225546837,
-0.03288479894399643,
-0.03717183321714401,
-0.044265519827604294,
0.014123492874205112,
0.19296570122241974,
-0.0444430410861969,
0.043285563588142395,
-0.10101189464330673,
-0.005642673000693321,
0.02586238645017147,
-0.12363296747207642,
0.13738858699798584,
-0.13386613130569458,
-0.06502553075551987,
0.02426842413842678,
0.049188245087862015,
-0.03207239508628845,
0.06682959944009781,
-0.03133753687143326,
0.06029932573437691,
0.050762008875608444,
-0.04291191324591637,
-0.11298777908086777,
-0.13666768372058868,
0.04200715944170952,
-0.020640982314944267,
0.14229698479175568,
0.013532017357647419,
0.08956276625394821,
0.0880293920636177,
0.003724255133420229,
-0.08044207841157913,
0.08150951564311981,
0.03849858418107033,
0.013685283251106739,
0.05552263557910919,
0.1105421707034111,
-0.03931543603539467,
0.18037967383861542,
0.015081167221069336,
-0.01546452846378088,
-0.02848798595368862,
-0.049970388412475586,
-0.02479385957121849,
-0.1838952898979187,
0.0023945604916661978,
-0.05792110785841942,
0.13791555166244507,
0.18845322728157043,
-0.050035104155540466,
-0.020040960982441902,
-0.03702377900481224,
0.08983799070119858,
0.0008313374710269272,
0.09341279417276382,
-0.018804339691996574,
-0.15054598450660706,
0.014267033897340298,
-0.02004503272473812,
0.0072724176570773125,
-0.21095885336399078,
-0.04765160009264946,
-0.03450559079647064,
-0.028262948617339134,
-0.10040968656539917,
0.14683133363723755,
0.04554712772369385,
0.03540446609258652,
-0.04027653485536575,
-0.14598487317562103,
-0.00897331815212965,
0.0598335899412632,
-0.1313486397266388,
-0.12577885389328003
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation javascript
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the javascript function/method.
## Intended uses & limitations
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_javascript_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/javascript/small_model.ipynb).
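The pipeline also accepts a list of inputs, so several functions can be summarized in one call (the second function below is a made-up example, shown only to illustrate batching):

```python
# Reuses `pipeline` and `tokenized_code` from the snippet above.
another_tokenized_code = "function isArray ( val ) { return toString . call ( val ) === '[object Array]' ; }"
summaries = pipeline([tokenized_code, another_tokenized_code])
for summary in summaries:
    print(summary["summary_text"])
```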
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
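For reference, the inverse square root schedule keeps the learning rate constant during warmup and then decays it with the square root of the step count. A minimal sketch of that schedule (the 10,000-step warmup value is an assumption for illustration, not a reported hyperparameter):
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1/sqrt(warmup_steps) until warmup ends, then decays as 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))
```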
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 40,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "function isStandardBrowserEnv ( ) { if ( typeof navigator !== 'undefined' && ( navigator . product === 'ReactNative' || navigator . product === 'NativeScript' || navigator . product === 'NS' ) ) { return false ; } return ( typeof window !== 'undefined' && typeof document !== 'undefined' ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_javascript_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation javascript
============================================================
Pretrained model on programming language javascript using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized javascript code functions: it works best with tokenized javascript functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the javascript function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the javascript function or be fine-tuned on other javascript code tasks. It can be used on unparsed and untokenized javascript code. However, if the javascript code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded at Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 40,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing javascript code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 40,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 40,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
60,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate javascript function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 40,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing javascript code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11693233996629715,
0.04049395024776459,
-0.001214333693496883,
0.09937872737646103,
0.045249875634908676,
0.023041771724820137,
0.04743624106049538,
0.09931275993585587,
-0.005891371984034777,
0.06427446007728577,
0.041526686400175095,
-0.07307141274213791,
0.06291349977254868,
0.19640086591243744,
0.022483306005597115,
-0.11541546881198883,
-0.04456730931997299,
0.040242649614810944,
-0.06338462978601456,
0.11501237004995346,
0.07091269642114639,
-0.08976374566555023,
0.0760326161980629,
-0.03192072734236717,
-0.1356477290391922,
0.03198450803756714,
-0.009667291305959225,
-0.016605285927653313,
0.0970834493637085,
0.06259281188249588,
0.12301933020353317,
-0.012455541640520096,
0.060667358338832855,
-0.1949768215417862,
0.004496258217841387,
0.028975797817111015,
0.06454426795244217,
0.04613649472594261,
0.050484780222177505,
0.06921567022800446,
0.10316108167171478,
-0.023105768486857414,
0.02468966320157051,
0.062086258083581924,
-0.06916000694036484,
-0.04265575483441353,
-0.07983246445655823,
0.055048972368240356,
0.09041370451450348,
0.09807845950126648,
-0.01050812192261219,
0.05450265109539032,
-0.0872596874833107,
0.08759176731109619,
0.12498913705348969,
-0.24389579892158508,
-0.02293393388390541,
0.08964554965496063,
0.09380034357309341,
0.024319998919963837,
-0.07998504489660263,
-0.03610609844326973,
0.10979703068733215,
0.03890958055853844,
0.05388543754816055,
-0.09190398454666138,
-0.0025925817899405956,
-0.007686382159590721,
-0.047373294830322266,
-0.04785590618848801,
0.18478591740131378,
0.058669690042734146,
-0.062317706644535065,
-0.1011451929807663,
-0.04424270614981651,
-0.16684959828853607,
0.03609909117221832,
0.014834660105407238,
-0.004096105229109526,
-0.010065809823572636,
-0.021160971373319626,
0.00017769186524674296,
-0.0942242220044136,
-0.11217659711837769,
0.025371309369802475,
0.010207687504589558,
0.051821283996105194,
0.029660655185580254,
-0.056948866695165634,
0.09518367052078247,
0.05130714178085327,
-0.05874781683087349,
-0.012981549836695194,
0.009545264765620232,
-0.1082020103931427,
-0.010810641571879387,
-0.0037118932232260704,
-0.059358809143304825,
0.004343649372458458,
0.07196256518363953,
-0.08321745693683624,
0.0829898789525032,
0.07709354907274246,
0.02345322072505951,
0.01604234054684639,
0.2223055064678192,
0.07354699820280075,
-0.1726195514202118,
0.029322100803256035,
0.03991413488984108,
0.001158716855570674,
0.016338475048542023,
-0.0527493953704834,
-0.05283832922577858,
0.02807227149605751,
0.0723249539732933,
-0.12149536609649658,
0.027382655069231987,
-0.06800555437803268,
-0.010242546908557415,
0.09403920918703079,
-0.1312306821346283,
0.03261028602719307,
0.02807803265750408,
-0.06375359743833542,
-0.025951117277145386,
0.09575954079627991,
-0.13442319631576538,
-0.11540816724300385,
-0.008189233019948006,
-0.0455191507935524,
-0.025558872148394585,
-0.11627883464097977,
-0.10361884534358978,
-0.0021858462132513523,
-0.005556493531912565,
-0.0026912996545434,
-0.1113089770078659,
-0.08022403717041016,
-0.028692487627267838,
0.034012798219919205,
0.008833209052681923,
-0.019211456179618835,
-0.05176970735192299,
0.0006682644598186016,
-0.005016993265599012,
-0.026299232617020607,
0.00028313661459833384,
-0.027469703927636147,
0.10150337219238281,
0.08491960167884827,
0.041890501976013184,
0.01422873605042696,
0.02095470018684864,
-0.09524092823266983,
0.08648521453142166,
-0.127522274851799,
0.07322090864181519,
0.009900376200675964,
0.05381884425878525,
-0.1070057600736618,
-0.07341023534536362,
-0.005432369653135538,
0.04361456260085106,
0.07842665165662766,
0.061612874269485474,
-0.13288737833499908,
0.012357286177575588,
0.15253213047981262,
-0.09624190628528595,
-0.15118563175201416,
0.11151537299156189,
-0.01773245632648468,
0.042304009199142456,
0.07798860222101212,
0.13140052556991577,
0.14847858250141144,
-0.09946514666080475,
-0.04183681309223175,
0.07936374098062515,
0.05049903318285942,
-0.054980695247650146,
0.051394008100032806,
0.012598126195371151,
-0.01677500270307064,
0.012474039569497108,
0.0590842105448246,
0.08184342831373215,
-0.013749786652624607,
-0.03457370772957802,
-0.03084784559905529,
-0.09541300684213638,
-0.03358513489365578,
-0.005271288566291332,
0.017581157386302948,
-0.05391369387507439,
-0.06606853008270264,
-0.0027943956665694714,
0.1628684550523758,
-0.10314618051052094,
0.0274409931153059,
-0.09398052841424942,
-0.036674078553915024,
-0.08057639002799988,
0.024249661713838577,
-0.12863768637180328,
0.00482647679746151,
0.06121757626533508,
-0.034234195947647095,
0.0495651550590992,
0.08039094507694244,
0.011133627966046333,
0.017733845859766006,
-0.04655022174119949,
-0.04864398017525673,
-0.04765276238322258,
-0.07586207985877991,
-0.10226811468601227,
-0.03911137208342552,
-0.10156724601984024,
-0.03124064952135086,
-0.035468488931655884,
-0.17420431971549988,
-0.003939575515687466,
-0.009581753052771091,
0.024376582354307175,
0.030652055516839027,
-0.03514440730214119,
0.021864937618374825,
0.04077330604195595,
-0.05369329825043678,
-0.07983481884002686,
0.01745288074016571,
0.04539712145924568,
-0.09050041437149048,
-0.04559708759188652,
-0.09258536249399185,
-0.0851517766714096,
0.08517209440469742,
0.07979755848646164,
-0.12377917021512985,
-0.012321515008807182,
-0.03188377991318703,
-0.04057081788778305,
-0.04631736874580383,
-0.06752927601337433,
0.1701577752828598,
0.01698589324951172,
0.16082575917243958,
-0.13722652196884155,
-0.06901813298463821,
-0.025920914486050606,
0.0081630302593112,
0.0345718190073967,
0.1382036805152893,
0.013354112394154072,
-0.1013002023100853,
0.019874276593327522,
-0.004882491659373045,
-0.037118516862392426,
0.16899584233760834,
-0.03010469116270542,
-0.05919242277741432,
-0.003668704768642783,
0.11503831297159195,
-0.00467739999294281,
0.18626342713832855,
-0.0690840631723404,
-0.013746694661676884,
-0.016206275671720505,
0.008713629096746445,
0.03606802597641945,
-0.1197471097111702,
0.029757758602499962,
0.031564995646476746,
-0.06369581073522568,
-0.03425334393978119,
-0.02145710401237011,
-0.03371062129735947,
0.04352154955267906,
0.019627096131443977,
0.025047359988093376,
-0.010263658128678799,
-0.04070195183157921,
-0.11838312447071075,
0.17544060945510864,
-0.0601881742477417,
-0.18043404817581177,
-0.15303237736225128,
0.11587300151586533,
-0.020080886781215668,
-0.013393810950219631,
0.022351736202836037,
-0.09532088786363602,
-0.04312289506196976,
-0.09192005544900894,
0.14323030412197113,
-0.08786530792713165,
0.0024431771598756313,
-0.004937745165079832,
0.055963270366191864,
0.057462017983198166,
-0.16766758263111115,
0.022047551348805428,
-0.014331340789794922,
-0.0012760284589603543,
-0.02797345258295536,
-0.0732540562748909,
0.08687348663806915,
0.12714534997940063,
-0.060605913400650024,
0.0233899112790823,
-0.002166010905057192,
0.17902541160583496,
-0.06668373942375183,
0.06023265793919563,
0.18384124338626862,
0.01706312596797943,
0.023181108757853508,
0.03552623093128204,
0.004167935345321894,
-0.08788175135850906,
0.0693889632821083,
0.0404617004096508,
-0.036334313452243805,
-0.21483509242534637,
-0.025331638753414154,
-0.07586795836687088,
0.06102016195654869,
0.11003606021404266,
0.033698875457048416,
-0.1593569815158844,
0.026431020349264145,
-0.012644835747778416,
0.17011813819408417,
-0.017146917060017586,
0.05875582993030548,
-0.0034229999873787165,
0.02724735252559185,
-0.0014739652397111058,
-0.10505511611700058,
-0.0017156458925455809,
0.060279957950115204,
0.09761625528335571,
0.1961703598499298,
-0.09001751244068146,
0.16917714476585388,
0.01573876664042473,
0.11374563723802567,
0.030354656279087067,
0.09750949591398239,
-0.11201445758342743,
0.01538331899791956,
0.01350446231663227,
-0.02179962955415249,
-0.07844548672437668,
0.03908831998705864,
-0.03731203079223633,
0.08325733989477158,
-0.0863986536860466,
0.023793712258338928,
0.03587307408452034,
0.1837308257818222,
0.09348735213279724,
-0.18290871381759644,
-0.1425234079360962,
0.012059797532856464,
-0.08502869307994843,
-0.10203038901090622,
0.07472658157348633,
0.2171599119901657,
-0.06558482348918915,
0.017048586159944534,
-0.015149805694818497,
0.12737944722175598,
-0.09445816278457642,
-0.015334242954850197,
0.05009038746356964,
0.05728588253259659,
0.007569116540253162,
0.10889177024364471,
-0.25050094723701477,
0.07257283478975296,
0.02200569584965706,
0.10172973573207855,
-0.021302388980984688,
0.05610986053943634,
-0.033262304961681366,
-0.0011236758437007666,
0.07927624136209488,
0.008162014186382294,
-0.02749950811266899,
-0.1962461620569229,
-0.02874842658638954,
0.02890503779053688,
0.05439549311995506,
-0.013183755800127983,
0.0969923660159111,
-0.016809048131108284,
0.03589291125535965,
-0.028704870492219925,
-0.12130098789930344,
-0.09518681466579437,
-0.11432112753391266,
-0.0535070039331913,
-0.0076691703870892525,
-0.052518632262945175,
-0.02232217602431774,
0.04946518316864967,
0.031747158616781235,
0.22976550459861755,
-0.15017084777355194,
-0.06877528876066208,
-0.0752582922577858,
0.05818597227334976,
0.12235669046640396,
-0.09451404958963394,
0.011051289737224579,
0.01843447983264923,
0.05694267153739929,
-0.04043989256024361,
-0.08603508025407791,
0.029813922941684723,
-0.06296710669994354,
-0.08450605720281601,
-0.039862483739852905,
0.1292787343263626,
0.004434904083609581,
0.0443389005959034,
0.023930635303258896,
-0.0950038954615593,
-0.02694978192448616,
-0.12943978607654572,
-0.06135036423802376,
-0.036873623728752136,
0.04851420596241951,
-0.012770483270287514,
-0.11696168035268784,
0.06859079003334045,
-0.014406087808310986,
-0.06027064844965935,
0.056926097720861435,
0.15265139937400818,
-0.07216423749923706,
0.0104901734739542,
0.08086857944726944,
-0.052490826696157455,
-0.17558811604976654,
-0.014857416041195393,
0.04226382449269295,
0.08235783874988556,
-0.024030335247516632,
-0.14646004140377045,
0.06636038422584534,
-0.025640195235610008,
0.016321314498782158,
0.021040357649326324,
-0.26691263914108276,
-0.12026114016771317,
0.00735090347006917,
0.05836029723286629,
0.03758368641138077,
-0.09869255125522614,
-0.04774905741214752,
-0.061250120401382446,
-0.06675542145967484,
0.06761325895786285,
0.029847536236047745,
0.09810227900743484,
-0.027155762538313866,
0.03452133759856224,
0.047647085040807724,
-0.03282076120376587,
0.05224963650107384,
0.011259939521551132,
0.11002692580223083,
-0.016208479180932045,
-0.003075338900089264,
0.0632021427154541,
-0.07122889906167984,
0.1795462965965271,
-0.15088443458080292,
0.10208339244127274,
-0.17161434888839722,
-0.03194314241409302,
-0.04749337583780289,
0.011222758330404758,
-0.03768870234489441,
-0.03704050928354263,
-0.12235534191131592,
0.03882194310426712,
0.049235742539167404,
-0.013979533687233925,
0.0421900749206543,
-0.004545229021459818,
-0.04781044274568558,
0.0664561316370964,
0.10485843569040298,
0.009643801487982273,
-0.11788845807313919,
0.04307170212268829,
0.020351072773337364,
0.09506119787693024,
-0.18555794656276703,
0.035552289336919785,
0.10600548982620239,
0.017429163679480553,
0.09543875604867935,
0.016366546973586082,
-0.09987430274486542,
0.03763948008418083,
0.06361322104930878,
-0.07375586032867432,
-0.0654798224568367,
-0.02643376775085926,
-0.04590027406811714,
-0.0971403494477272,
0.04274652153253555,
0.09344895929098129,
-0.0399046428501606,
-0.004552794154733419,
-0.012798314914107323,
0.0009559581521898508,
-0.08511493355035782,
0.1662089079618454,
0.019020935520529747,
0.0803692638874054,
-0.053366318345069885,
0.07333560287952423,
0.08301719278097153,
-0.10281582176685333,
0.025662295520305634,
0.13813959062099457,
-0.0907275453209877,
-0.02171078696846962,
0.08364015817642212,
0.14823973178863525,
-0.01299984846264124,
-0.047911979258060455,
-0.10120061039924622,
-0.08110307902097702,
0.007521637249737978,
0.05407271534204483,
0.06592988222837448,
0.09782067686319351,
-0.016490068286657333,
0.0029830578714609146,
-0.13359858095645905,
0.0974767729640007,
0.08709808439016342,
0.033950380980968475,
-0.12332753837108612,
0.16058120131492615,
0.029972555115818977,
0.09701143950223923,
-0.0006122720660641789,
0.0266958586871624,
-0.11161988228559494,
0.039776742458343506,
-0.0373486764729023,
0.037027571350336075,
-0.01644924283027649,
0.041908521205186844,
-0.05204775929450989,
0.04688221961259842,
-0.035253070294857025,
0.0442868247628212,
-0.03880152106285095,
-0.022517843171954155,
-0.026555687189102173,
0.03339448943734169,
-0.06232248619198799,
-0.011225278489291668,
0.007281355559825897,
-0.0991334542632103,
0.09775456041097641,
-0.054201316088438034,
-0.0006890288786962628,
0.008306815288960934,
0.01792723871767521,
0.04174234718084335,
0.0014593589585274458,
0.04618208110332489,
-0.010369669646024704,
-0.0006892571691423655,
0.031857311725616455,
0.021163547411561012,
-0.005297068972140551,
-0.01315932348370552,
0.08653862029314041,
-0.13189372420310974,
-0.06987836956977844,
-0.0754980519413948,
-0.06746088713407516,
-0.07162658870220184,
0.08515023440122604,
0.07582356035709381,
0.07538352161645889,
0.09422546625137329,
-0.041375596076250076,
0.01445860881358385,
-0.1923285275697708,
-0.04451466724276543,
0.05617376044392586,
0.0033412121701985598,
-0.11443953216075897,
-0.045363347977399826,
0.0715990960597992,
-0.040152885019779205,
0.09790556132793427,
-0.016619211062788963,
0.05500754341483116,
-0.01567307487130165,
-0.060620833188295364,
-0.04848372936248779,
0.007260698825120926,
0.15327835083007812,
-0.11402006447315216,
0.002800648333504796,
-0.013944270089268684,
0.005648538935929537,
0.03778364136815071,
0.1564098745584488,
0.13215482234954834,
0.12399235367774963,
0.029035184532403946,
0.0940583348274231,
-0.05145015940070152,
-0.041096676141023636,
-0.11423450708389282,
0.06844806671142578,
-0.03434222191572189,
0.036400798708200455,
-0.031161723658442497,
0.1261422485113144,
0.08296220004558563,
-0.1378900706768036,
0.09186231344938278,
-0.006026783958077431,
-0.09346598386764526,
-0.04005170613527298,
-0.08454546332359314,
-0.037923313677310944,
-0.10128113627433777,
0.010008958168327808,
-0.09650436043739319,
-0.007609310559928417,
0.04279937595129013,
0.022193361073732376,
-0.04030811786651611,
0.17315207421779633,
-0.04150146245956421,
-0.060539308935403824,
0.047608766704797745,
0.04255085065960884,
0.020938929170370102,
0.09665828198194504,
0.026064740493893623,
0.0629243478178978,
-0.07126165181398392,
0.06958336383104324,
0.03683948144316673,
-0.0026610686909407377,
0.020454799756407738,
0.031341683119535446,
-0.02146913856267929,
-0.046901874244213104,
-0.007362885866314173,
0.0863950103521347,
0.13175815343856812,
0.04352862387895584,
-0.0316726379096508,
-0.05432426184415817,
0.16417953372001648,
-0.05169130861759186,
-0.03972905874252319,
-0.1300191879272461,
0.1437932401895523,
0.029141955077648163,
0.0032767001539468765,
0.010286707431077957,
-0.08364298939704895,
-0.007242568768560886,
0.24377445876598358,
0.05320363491773605,
-0.05188630893826485,
-0.024709271267056465,
-0.01713551953434944,
-0.00861125998198986,
-0.037903379648923874,
0.1488896608352661,
0.00187780917622149,
0.2230386734008789,
0.01615109108388424,
-0.01913946308195591,
-0.04483000189065933,
-0.04576239734888077,
0.01965978927910328,
0.20067133009433746,
-0.0453670397400856,
0.032576873898506165,
-0.10132461786270142,
-0.007212276104837656,
0.020810425281524658,
-0.1488209366798401,
0.1334720253944397,
-0.1357044130563736,
-0.058457113802433014,
0.019636699929833412,
0.05640875920653343,
-0.043479662388563156,
0.06233275681734085,
-0.033353012055158615,
0.06130841746926308,
0.06073164939880371,
-0.03363201767206192,
-0.09889325499534607,
-0.13149750232696533,
0.04658135026693344,
-0.02245989814400673,
0.13833686709403992,
0.014013283886015415,
0.09715066105127335,
0.08292043209075928,
0.012483182363212109,
-0.07698389887809753,
0.07655367255210876,
0.038345541805028915,
0.01927390694618225,
0.050500717014074326,
0.11601283401250839,
-0.03805498033761978,
0.16977623105049133,
0.017920225858688354,
-0.02663251757621765,
-0.02056558057665825,
-0.043979015201330185,
-0.01816563867032528,
-0.17846715450286865,
-0.002724661258980632,
-0.05864030122756958,
0.1417999267578125,
0.18461501598358154,
-0.05319266766309738,
-0.012379365973174572,
-0.039421312510967255,
0.08960529416799545,
0.005350896622985601,
0.09251285344362259,
-0.010107526555657387,
-0.16137447953224182,
0.014511603862047195,
-0.03411322459578514,
0.006542035844177008,
-0.19032400846481323,
-0.05081437900662422,
-0.036573998630046844,
-0.038033828139305115,
-0.10088654607534409,
0.1472320854663849,
0.05351465567946434,
0.03773292899131775,
-0.041445545852184296,
-0.10831478983163834,
-0.0071485405787825584,
0.06174812838435173,
-0.12529125809669495,
-0.1272432804107666
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus php dataset.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/php/small_model.ipynb).
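The pipeline also accepts a batch of functions in a single call. A small sketch of batched use (the extra PHP getter below is an illustrative example, not taken from the training data):
```python
php_functions = [
    tokenized_code,
    "public function getName ( ) { return $ this -> name ; }",
]
# Each result is a dict with a "summary_text" key holding the generated documentation.
for result in pipeline(php_functions):
    print(result["summary_text"])
```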
## Training data
The supervised training tasks datasets can be downloaded at [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_php
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus php dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded at Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.12262928485870361,
0.04760627821087837,
-0.0012154183350503445,
0.07886946201324463,
0.12940795719623566,
0.004873304627835751,
0.07704035937786102,
0.036619167774915695,
-0.0347420759499073,
-0.03032386675477028,
0.10819561034440994,
0.11034061014652252,
0.008230275474488735,
0.10252544283866882,
-0.01134067215025425,
-0.17104800045490265,
-0.010767240077257156,
0.06730972975492477,
-0.21671870350837708,
0.13199742138385773,
0.11851270496845245,
-0.06558118015527725,
0.08601730316877365,
0.021582957357168198,
-0.16458946466445923,
0.08412481099367142,
-0.026823580265045166,
-0.046219903975725174,
0.11477005481719971,
0.0747961550951004,
0.10903230309486389,
0.014466359280049801,
0.004460393451154232,
-0.233376145362854,
0.039286114275455475,
-0.03056967630982399,
-0.007004658225923777,
0.03763728216290474,
0.06088586524128914,
-0.05420137941837311,
0.09736865758895874,
-0.007025453262031078,
0.018516432493925095,
0.05514291301369667,
-0.08719692379236221,
-0.07074491679668427,
-0.011377259157598019,
0.023999623954296112,
0.06588952243328094,
0.09725863486528397,
0.022855110466480255,
0.08127547800540924,
-0.15506626665592194,
0.10440061241388321,
0.11186080425977707,
-0.1556883007287979,
-0.019516535103321075,
0.08174820244312286,
0.06318126618862152,
-0.08528182655572891,
-0.04502119868993759,
-0.002742310054600239,
0.04305224493145943,
0.008077838458120823,
-0.00821888167411089,
-0.12990133464336395,
-0.11894845217466354,
0.07736934721469879,
-0.06583396345376968,
-0.08417248725891113,
0.30025655031204224,
0.01911121793091297,
-0.011828262358903885,
-0.04981391131877899,
-0.04083075001835823,
0.053098343312740326,
-0.03974553197622299,
0.0005041530821472406,
0.028536846861243248,
0.013614614494144917,
-0.033435191959142685,
-0.02855677902698517,
-0.1170823723077774,
-0.13066856563091278,
0.009108278900384903,
0.05284993350505829,
0.013345818035304546,
0.01639782264828682,
-0.1682516485452652,
0.11027464270591736,
0.11093065142631531,
-0.05371757224202156,
0.027423422783613205,
-0.03585556522011757,
0.006804251577705145,
0.019649123772978783,
-0.08437584340572357,
-0.18006815016269684,
0.13459685444831848,
0.1164764016866684,
-0.08270339667797089,
0.0744962990283966,
0.05236651748418808,
0.05991159752011299,
0.008883330971002579,
0.1828211396932602,
0.00821502786129713,
-0.05623449385166168,
0.05227431654930115,
-0.024988772347569466,
-0.07367564737796783,
0.04209761694073677,
-0.07607656717300415,
-0.03506382182240486,
0.017043642699718475,
0.10747820883989334,
-0.08368456363677979,
0.08886928111314774,
-0.06414017826318741,
-0.021532094106078148,
0.01633272133767605,
-0.14897379279136658,
-0.017526209354400635,
0.037004221230745316,
-0.03213954716920853,
-0.04694126918911934,
0.11347247660160065,
-0.07932165265083313,
-0.15784600377082825,
-0.03357440605759621,
-0.06607287377119064,
-0.008363917469978333,
-0.07249923050403595,
-0.0403604581952095,
0.026641206815838814,
0.07297472655773163,
0.05846766382455826,
-0.11587851494550705,
-0.13543817400932312,
-0.015063595026731491,
0.0923188328742981,
0.022119244560599327,
0.03070998191833496,
-0.10241621732711792,
0.0042912014760077,
-0.021739793941378593,
-0.018138572573661804,
0.06065567210316658,
-0.0797632560133934,
0.06525345891714096,
0.07369360327720642,
0.012300543487071991,
-0.07460524886846542,
0.05211435630917549,
-0.11318839341402054,
0.0709679126739502,
-0.12551377713680267,
0.09131905436515808,
-0.03620119392871857,
0.08725175261497498,
-0.11970026791095734,
-0.05745616555213928,
0.011630876921117306,
0.055927153676748276,
0.06744839996099472,
0.1683690994977951,
-0.07072876393795013,
-0.05456588789820671,
0.18006297945976257,
-0.0820271372795105,
-0.21747203171253204,
0.05060485005378723,
-0.07236688584089279,
0.1694127321243286,
0.05525492876768112,
0.19060862064361572,
0.13479328155517578,
-0.06404291838407516,
0.07582956552505493,
0.10001631081104279,
-0.014829853549599648,
-0.08675727248191833,
0.07200316339731216,
0.0007430611294694245,
-0.11827385425567627,
0.07129202038049698,
-0.0471414253115654,
0.08109548687934875,
-0.011490928940474987,
-0.05575262010097504,
-0.005935200955718756,
-0.062359198927879333,
0.05176370590925217,
0.0030755391344428062,
0.09046275913715363,
0.008509018458425999,
-0.010347381234169006,
0.0864151120185852,
0.09488870203495026,
-0.13032583892345428,
0.028679264709353447,
-0.09653836488723755,
0.07499799132347107,
-0.10940288007259369,
0.035337500274181366,
-0.23279063403606415,
-0.008769621141254902,
-0.015922199934720993,
0.004262278787791729,
0.06710365414619446,
0.009961425326764584,
0.038383688777685165,
-0.03212357312440872,
0.002673052018508315,
-0.00797353032976389,
-0.000022038080714992248,
-0.01335128303617239,
-0.045871760696172714,
-0.09447258710861206,
-0.020684923976659775,
-0.038175828754901886,
0.05873965099453926,
-0.16601502895355225,
-0.010973574593663216,
0.06439600884914398,
0.057128168642520905,
-0.008840201422572136,
0.026113804429769516,
0.01727687567472458,
0.04664018005132675,
-0.02148444950580597,
0.0024178270250558853,
0.07700604200363159,
0.03184773772954941,
-0.0984547957777977,
0.03753891587257385,
-0.024071795865893364,
0.05575600266456604,
0.11477924883365631,
-0.18189719319343567,
-0.04731661081314087,
-0.11792001873254776,
-0.04310713708400726,
-0.01096101850271225,
0.049748655408620834,
-0.023605553433299065,
0.24043118953704834,
0.006393253803253174,
0.19207946956157684,
-0.13113726675510406,
-0.059940095990896225,
-0.026383964344859123,
-0.014203071594238281,
0.041681211441755295,
0.15847896039485931,
0.08450357615947723,
-0.1418822705745697,
0.05680086463689804,
0.11675970256328583,
-0.014658446423709393,
0.1432337462902069,
-0.08221453428268433,
-0.03806329518556595,
-0.007090677972882986,
0.08593928068876266,
-0.03546794503927231,
0.14315484464168549,
-0.2596267759799957,
-0.03086424432694912,
0.01280622836202383,
-0.008229797706007957,
0.11268039792776108,
-0.14003410935401917,
0.02234337478876114,
0.015464953146874905,
-0.05633513629436493,
-0.09999621659517288,
0.02806790918111801,
0.00933816283941269,
0.033488769084215164,
0.02673506736755371,
0.008002368733286858,
0.043828003108501434,
-0.029686471447348595,
-0.13052891194820404,
0.24799737334251404,
-0.08390206843614578,
-0.24599212408065796,
-0.17442481219768524,
0.037579674273729324,
-0.056233324110507965,
-0.020020581781864166,
0.05716891214251518,
-0.08938515931367874,
-0.03863527625799179,
-0.039112068712711334,
0.07971645146608353,
-0.05461287125945091,
-0.03199930116534233,
-0.03621898218989372,
0.06689143925905228,
0.04308051988482475,
-0.18800096213817596,
-0.016093986108899117,
0.003501336555927992,
0.05702546611428261,
-0.02657814882695675,
-0.12681391835212708,
0.11996336281299591,
0.1038338840007782,
-0.031083112582564354,
0.06979965418577194,
-0.023936856538057327,
0.22739292681217194,
-0.06095912307500839,
-0.0910612940788269,
0.15452416241168976,
-0.09576360881328583,
0.02231455035507679,
0.0030331916641443968,
0.020967066287994385,
-0.09950436651706696,
0.014011658728122711,
-0.009704535827040672,
-0.04786228761076927,
-0.2560635209083557,
-0.11983998864889145,
-0.08486350625753403,
0.0287313349545002,
0.05497843027114868,
0.06084080785512924,
-0.09917228668928146,
0.06347830593585968,
0.06414846330881119,
0.09353846311569214,
-0.0075041260570287704,
0.05429311469197273,
0.12887068092823029,
-0.007460867986083031,
-0.024818703532218933,
-0.11898493021726608,
-0.0802903026342392,
0.04524555802345276,
0.10023193806409836,
0.1659216284751892,
0.0026679104194045067,
0.1448908895254135,
0.08739381283521652,
0.04788118228316307,
-0.010629955679178238,
0.1614646017551422,
-0.0912775844335556,
0.03870979696512222,
-0.007468048948794603,
-0.04204998537898064,
-0.1118837296962738,
0.03256922960281372,
-0.04853528365492821,
0.019635895267128944,
-0.1636836975812912,
-0.11283131688833237,
0.07981999963521957,
0.10857048630714417,
-0.012630186043679714,
-0.2404697984457016,
-0.09991899877786636,
-0.006136797368526459,
-0.09299591928720474,
-0.0652763843536377,
0.06198379769921303,
0.06564455479383469,
-0.11897740513086319,
0.007312208879739046,
-0.04550378397107124,
0.17756038904190063,
-0.06418716907501221,
0.02143850177526474,
-0.036685362458229065,
-0.0637383908033371,
0.0231575109064579,
0.16719147562980652,
-0.16900081932544708,
0.24166081845760345,
0.013931556604802608,
0.02307438664138317,
-0.07935979962348938,
0.026534339413046837,
0.008023289032280445,
0.0711965337395668,
0.1301904171705246,
-0.018702076748013496,
0.05036752671003342,
-0.13175565004348755,
0.03500528261065483,
0.10102719068527222,
0.0785847008228302,
-0.026170639321208,
0.06117613613605499,
-0.036297496408224106,
0.023765217512845993,
-0.013016351498663425,
-0.06938708573579788,
-0.07257036864757538,
-0.14201514422893524,
-0.020070886239409447,
-0.04826807603240013,
0.049688056111335754,
-0.019484957680106163,
0.0310247540473938,
0.045817017555236816,
0.17494334280490875,
-0.09585664421319962,
-0.07388953119516373,
-0.10680001974105835,
0.013029116205871105,
0.10699988901615143,
-0.09787636995315552,
0.03202376514673233,
-0.03189215809106827,
-0.001257964177057147,
0.01599852181971073,
-0.14358074963092804,
0.04009275138378143,
-0.05306731164455414,
0.0023385558743029833,
-0.024331439286470413,
0.11785714328289032,
-0.02360720932483673,
-0.019569730386137962,
0.06278139352798462,
-0.07797306776046753,
-0.09209495782852173,
-0.1384000927209854,
-0.1157410740852356,
-0.0893067941069603,
0.09793616831302643,
0.020800303667783737,
-0.13009709119796753,
0.09156661480665207,
-0.006228778976947069,
-0.0030727426055818796,
0.2267014980316162,
0.09021448343992233,
-0.024488531053066254,
0.01620701141655445,
0.1564667820930481,
-0.11232444643974304,
-0.27545031905174255,
-0.030617723241448402,
-0.03763485699892044,
0.02603120356798172,
-0.015664558857679367,
-0.1463547646999359,
0.12929578125476837,
-0.05169260501861572,
0.029544692486524582,
-0.01269586756825447,
-0.2651468813419342,
-0.10508961975574493,
0.1234230101108551,
0.11159229278564453,
0.07429447025060654,
-0.1340339034795761,
-0.06641264259815216,
-0.08774331212043762,
-0.1740356683731079,
0.12378684431314468,
-0.11006639152765274,
0.082959845662117,
-0.007445476017892361,
0.05853297561407089,
0.012530605308711529,
-0.05197453126311302,
0.1017027273774147,
0.011681284755468369,
0.09012142568826675,
-0.028160294517874718,
-0.0885629653930664,
0.11754115670919418,
-0.0348907895386219,
0.13227123022079468,
-0.09786419570446014,
0.09977711737155914,
-0.23912550508975983,
-0.05291642248630524,
-0.028790703043341637,
0.03650360181927681,
-0.004970479756593704,
-0.04570024460554123,
-0.07298785448074341,
-0.006572564598172903,
0.05111750215291977,
0.02070360630750656,
0.07691248506307602,
-0.036901265382766724,
-0.03764912113547325,
0.10307062417268753,
0.1504017412662506,
-0.051146235316991806,
-0.13650187849998474,
0.03602183237671852,
0.011856094934046268,
0.09518123418092728,
-0.23696278035640717,
0.09205339848995209,
0.10998693108558655,
0.0270924624055624,
0.09469205886125565,
0.0797029510140419,
-0.024459538981318474,
0.014639617875218391,
0.08570519834756851,
-0.13507264852523804,
-0.09310352057218552,
-0.0654577761888504,
-0.11154911667108536,
-0.043084144592285156,
0.08413206785917282,
0.12173021584749222,
-0.028065793216228485,
-0.0018933838000521064,
-0.0008011278114281595,
-0.035286445170640945,
-0.13666321337223053,
0.14727076888084412,
0.06416239589452744,
0.04950134828686714,
-0.09191901236772537,
0.06727571040391922,
0.044020239263772964,
-0.16898779571056366,
-0.008366351015865803,
0.10971211642026901,
-0.12197969108819962,
-0.08200690895318985,
0.02991180121898651,
0.26404592394828796,
-0.09276256710290909,
-0.11105155944824219,
-0.15069465339183807,
-0.04752015694975853,
0.028780262917280197,
0.17503581941127777,
0.10576356202363968,
0.07573447376489639,
-0.024332145228981972,
-0.011414985172450542,
-0.07695970684289932,
0.07378756254911423,
0.10983534157276154,
-0.020304000005126,
-0.08252271264791489,
0.0569087415933609,
-0.00204598275013268,
0.15315914154052734,
-0.04597173631191254,
-0.03763118386268616,
-0.17712248861789703,
0.0855940580368042,
-0.14724156260490417,
0.07227043062448502,
-0.0515754371881485,
0.04103505611419678,
0.010714331641793251,
0.022809306159615517,
-0.034131281077861786,
0.06120027229189873,
-0.0811849981546402,
0.009427023120224476,
0.013242391869425774,
0.08876445144414902,
-0.10152477771043777,
0.01637013629078865,
0.0791422426700592,
-0.04076681658625603,
0.09380859136581421,
0.02932831458747387,
-0.09037821739912033,
0.1102731004357338,
-0.21593187749385834,
-0.023837920278310776,
0.025553997606039047,
0.0414920337498188,
0.026447072625160217,
-0.004112796857953072,
0.04701744392514229,
0.050579387694597244,
0.037077683955430984,
-0.009364018216729164,
0.13682295382022858,
-0.12555386126041412,
-0.0915583074092865,
-0.0766422376036644,
-0.11559824645519257,
-0.0317155197262764,
0.02403712272644043,
0.040690239518880844,
0.09561042487621307,
0.0948786810040474,
-0.04289185255765915,
0.03113187476992607,
-0.09130161255598068,
-0.01643199473619461,
0.06258826702833176,
-0.05706740543246269,
-0.07813455909490585,
-0.11657276749610901,
0.025581877678632736,
-0.05075637623667717,
0.2386331707239151,
0.001829805551096797,
0.13424904644489288,
-0.00595724955201149,
0.012397249229252338,
0.0476570799946785,
0.03165788576006889,
0.21101954579353333,
-0.0335560105741024,
0.03791658952832222,
-0.06957688927650452,
0.07759533077478409,
0.03240098059177399,
0.09378817677497864,
0.06998982280492783,
0.1460052877664566,
-0.019080257043242455,
0.11804229766130447,
0.0035906280390918255,
0.06698235869407654,
-0.03320050612092018,
-0.06684892624616623,
0.08020275086164474,
0.0698043629527092,
-0.07062719762325287,
0.13433833420276642,
0.10284752398729324,
-0.08120587468147278,
0.10038686543703079,
0.008353189565241337,
-0.09305467456579208,
-0.04571501538157463,
-0.03208451718091965,
-0.064858578145504,
-0.14871251583099365,
0.0028512070421129465,
-0.10355640947818756,
-0.0764349102973938,
0.08252251148223877,
0.025703029707074165,
-0.0531664676964283,
0.22060301899909973,
0.014973213896155357,
-0.07186220586299896,
0.04837231710553169,
-0.016980314627289772,
0.01976759172976017,
0.03334276005625725,
0.05857377126812935,
0.010715940035879612,
-0.046721719205379486,
0.011981552466750145,
0.0655512586236,
-0.04876922816038132,
-0.0036231924314051867,
-0.06717211753129959,
-0.009591198526322842,
-0.05317304655909538,
0.07203220576047897,
0.0062725236639380455,
0.09944625198841095,
0.017221972346305847,
-0.050882309675216675,
-0.012068766169250011,
0.15876121819019318,
-0.02466759830713272,
-0.08041749149560928,
-0.13216504454612732,
0.16458986699581146,
0.054712820798158646,
0.044922009110450745,
0.01720765233039856,
-0.06581538170576096,
-0.014541764743626118,
0.3312808573246002,
0.2047710120677948,
-0.06427378207445145,
0.010111432522535324,
0.04875277727842331,
0.023115374147892,
0.029657764360308647,
0.15189988911151886,
0.02556792087852955,
0.24802890419960022,
-0.014889980666339397,
-0.09478969126939774,
-0.047573018819093704,
-0.046963032335042953,
0.003366398625075817,
0.14621153473854065,
0.0412418469786644,
-0.04987327381968498,
-0.03781063109636307,
0.10026782006025314,
-0.13482987880706787,
-0.10979405045509338,
0.05491257831454277,
-0.12124012410640717,
-0.07154269516468048,
-0.05960661545395851,
0.013585829176008701,
-0.01798424869775772,
0.023548685014247894,
-0.031910110265016556,
-0.006636586505919695,
0.03512117266654968,
0.03095722198486328,
-0.1531020551919937,
-0.08724387735128403,
0.07170771062374115,
-0.03755025938153267,
0.16513460874557495,
-0.00027630801196210086,
0.12023650109767914,
0.0896209105849266,
0.038092128932476044,
-0.03123498521745205,
0.01675710454583168,
0.07693451642990112,
0.0028023060876876116,
0.06983889639377594,
-0.01644425466656685,
-0.020767204463481903,
0.04750363528728485,
-0.04588143527507782,
-0.03610702231526375,
0.061551257967948914,
0.011822688393294811,
-0.03164464607834816,
-0.1469152569770813,
-0.01894978992640972,
-0.12661726772785187,
0.07679063826799393,
0.1498335301876068,
-0.04018877074122429,
0.021775418892502785,
-0.08412735164165497,
0.12383437156677246,
-0.009225428104400635,
-0.06360487639904022,
-0.08296214044094086,
-0.12738695740699768,
-0.010990479029715061,
0.0095487916842103,
-0.037409599870443344,
-0.24147361516952515,
-0.0026693278923630714,
-0.02636227197945118,
0.013167647644877434,
-0.02112038992345333,
0.13212887942790985,
0.11363628506660461,
0.025373701006174088,
-0.01704387366771698,
-0.13040247559547424,
-0.022755742073059082,
0.06642117351293564,
-0.10523014515638351,
-0.12614238262176514
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/php/small_model.ipynb).
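Note that `device=0` in the snippet above assumes a CUDA GPU is available. A minimal variant for CPU-only environments (only the device index changes):
```python
cpu_pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask", skip_special_tokens=True),
    device=-1,  # -1 selects the CPU; non-negative indices select CUDA devices
)
cpu_pipeline([tokenized_code])
```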
## Training data
The supervised training tasks datasets can be downloaded at [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.14234483242034912,
-0.023130258545279503,
0.0007755187107250094,
0.1229771077632904,
0.13340570032596588,
0.022457772865891457,
0.08526068925857544,
0.06874878704547882,
-0.012896365486085415,
0.01205246988683939,
0.07387494295835495,
0.0009137712768279016,
0.0349484384059906,
0.13188683986663818,
0.009108449332416058,
-0.15355722606182098,
-0.020222706720232964,
0.09222683310508728,
-0.13894209265708923,
0.11570187658071518,
0.07915978878736496,
-0.10107462853193283,
0.09425051510334015,
-0.01066735852509737,
-0.17712989449501038,
0.03221225365996361,
-0.04858027771115303,
-0.05571768432855606,
0.10490699857473373,
0.04066785052418709,
0.1310197114944458,
-0.006836432032287121,
0.048281505703926086,
-0.11060448735952377,
0.011816256679594517,
0.019935064017772675,
0.05292055755853653,
0.050011347979307175,
0.0509231761097908,
0.09411947429180145,
0.07094763964414597,
-0.021092720329761505,
0.037507135421037674,
0.0439276397228241,
-0.07051513344049454,
-0.01489595603197813,
-0.03670979291200638,
0.10368373245000839,
0.10959377139806747,
0.1419229805469513,
0.019968798384070396,
0.045258693397045135,
-0.09410962462425232,
0.06792030483484268,
0.08158935606479645,
-0.2703770399093628,
-0.02782008796930313,
0.0648060068488121,
0.03350178897380829,
-0.014625637792050838,
-0.07441088557243347,
-0.030759787186980247,
0.0664103627204895,
0.05922006443142891,
0.09502232074737549,
-0.09187763929367065,
-0.04838847741484642,
-0.02955801784992218,
-0.08263269811868668,
-0.024429315701127052,
0.23274126648902893,
0.049012988805770874,
-0.03575781360268593,
-0.0444154366850853,
-0.055273257195949554,
-0.10014161467552185,
0.014347116462886333,
-0.006330646574497223,
0.010641035623848438,
-0.0021837190724909306,
-0.027429156005382538,
-0.030169842764735222,
-0.10328564047813416,
-0.10453757643699646,
-0.033233385533094406,
0.06967061012983322,
0.07526799291372299,
0.044969283044338226,
-0.045867741107940674,
0.07399194687604904,
0.04279587045311928,
-0.03626929596066475,
0.02847076952457428,
-0.0028888657689094543,
-0.04102364927530289,
-0.004887741524726152,
-0.04500504583120346,
-0.21749182045459747,
0.008496562950313091,
0.01701423153281212,
-0.11258119344711304,
0.07637779414653778,
0.1623193919658661,
0.06679469347000122,
-0.024245770648121834,
0.19687359035015106,
0.0033996356651186943,
-0.09816167503595352,
0.030485782772302628,
0.026649659499526024,
-0.036867670714855194,
0.013396456837654114,
-0.11192641407251358,
-0.04223352670669556,
0.05001800134778023,
-0.01391986757516861,
-0.12315455079078674,
0.08444094657897949,
-0.016848770901560783,
-0.04334234446287155,
0.05212469398975372,
-0.08349131792783737,
0.012928230687975883,
-0.0016927131218835711,
-0.0958552211523056,
0.006402228493243456,
0.11140749603509903,
-0.089034803211689,
-0.14358031749725342,
0.0010419061873108149,
-0.0549049898982048,
-0.0018978900043293834,
-0.1115415170788765,
-0.10328163951635361,
-0.0053823720663785934,
-0.025277798995375633,
0.0028172023594379425,
-0.14961732923984528,
-0.13406960666179657,
-0.04270613193511963,
0.08110746741294861,
0.01463500689715147,
-0.04543624445796013,
-0.06134237349033356,
0.031127706170082092,
-0.002239509718492627,
-0.039313867688179016,
0.02851303294301033,
-0.04660361632704735,
0.07914629578590393,
0.06470129638910294,
0.06460796296596527,
-0.04786679521203041,
0.05898382142186165,
-0.09033980965614319,
0.030903086066246033,
-0.1479242742061615,
0.08117054402828217,
0.04862266033887863,
0.11671643704175949,
-0.0900593176484108,
-0.10789216309785843,
-0.04659010097384453,
0.049204275012016296,
0.07637949287891388,
0.07303093373775482,
-0.03984036669135094,
-0.009521553292870522,
0.09881839901208878,
-0.09210401773452759,
-0.16280677914619446,
0.10719772428274155,
-0.017759524285793304,
0.08610928803682327,
0.07746115326881409,
0.15379032492637634,
0.1320922076702118,
-0.05460364371538162,
0.008844485506415367,
0.07270214706659317,
-0.033813051879405975,
-0.20180143415927887,
0.05628316476941109,
0.05633150413632393,
-0.10254503041505814,
0.020358284935355186,
0.037310268729925156,
0.11881697177886963,
-0.042766183614730835,
-0.026539843529462814,
-0.024668170139193535,
-0.11084320396184921,
-0.005858927499502897,
0.006862160284072161,
0.09494961053133011,
-0.04644705355167389,
-0.0663955807685852,
0.047462303191423416,
0.08240583539009094,
-0.0813455730676651,
0.02924373559653759,
-0.07614228874444962,
-0.048958003520965576,
-0.09864655882120132,
0.01223937887698412,
-0.14906461536884308,
0.008407151326537132,
0.0012029481586068869,
0.02853766269981861,
0.059909939765930176,
0.09755565226078033,
0.04648791626095772,
0.02770087867975235,
-0.027078092098236084,
-0.033375490456819534,
-0.030131325125694275,
-0.04360393062233925,
-0.12791746854782104,
-0.00811319425702095,
-0.07016026973724365,
-0.0090540936216712,
-0.00836133025586605,
-0.14317968487739563,
0.03557179495692253,
-0.08615495264530182,
-0.008740740828216076,
-0.013094409368932247,
0.008038855157792568,
0.04728519916534424,
0.07250203937292099,
-0.023502156138420105,
-0.06155490130186081,
0.08705036342144012,
0.07725019752979279,
-0.07492481917142868,
-0.0006484152982011437,
-0.061564330011606216,
0.015481044538319111,
0.08618863672018051,
-0.07796737551689148,
-0.10431724041700363,
-0.02444976381957531,
-0.03669116646051407,
-0.06115804612636566,
-0.00283798947930336,
-0.018850604072213173,
0.28507915139198303,
0.004152562934905291,
0.18524174392223358,
-0.08628480136394501,
-0.021821312606334686,
-0.01926521584391594,
-0.025319453328847885,
0.08187045902013779,
0.1028384268283844,
0.060645923018455505,
-0.11304081976413727,
0.05173881724476814,
-0.01479580719023943,
-0.06818408519029617,
0.07520770281553268,
-0.019400324672460556,
-0.04900137335062027,
0.05042657256126404,
0.05527380853891373,
-0.011505152098834515,
0.1046038419008255,
-0.14675188064575195,
-0.03291669860482216,
-0.0016234868671745062,
0.020192516967654228,
0.055967144668102264,
-0.1611669957637787,
0.0282596405595541,
0.036820635199546814,
-0.021476343274116516,
-0.03764218091964722,
-0.036660369485616684,
-0.03011758252978325,
0.03379678353667259,
0.05675916746258736,
-0.021751709282398224,
0.027463363483548164,
0.018182506784796715,
-0.09226512908935547,
0.21297776699066162,
-0.0188936498016119,
-0.25143805146217346,
-0.1347987949848175,
0.03671705722808838,
-0.011433854699134827,
-0.008357688784599304,
0.056622739881277084,
-0.09383317083120346,
-0.04532783478498459,
-0.05247323960065842,
0.1221054270863533,
-0.07368446886539459,
0.036678850650787354,
0.007028604857623577,
0.031892888247966766,
0.06741771847009659,
-0.11450066417455673,
0.004356143064796925,
-0.03143615275621414,
-0.00018521310994401574,
0.010305637493729591,
-0.08333292603492737,
0.08688608556985855,
0.1604112833738327,
-0.06916647404432297,
0.038230232894420624,
-0.025311032310128212,
0.1376676708459854,
-0.0430230051279068,
0.007725445553660393,
0.1852097362279892,
0.016520272940397263,
0.026284577324986458,
0.02510090358555317,
0.022211119532585144,
-0.07503842562437057,
0.05151248723268509,
0.034186627715826035,
-0.05071861669421196,
-0.20996811985969543,
-0.05354133993387222,
-0.07023198157548904,
0.0059288484044373035,
0.15350328385829926,
0.06109475716948509,
-0.06606658548116684,
0.07627496868371964,
0.035980116575956345,
0.15096907317638397,
-0.05293617025017738,
0.04152154549956322,
0.07932170480489731,
0.04286538437008858,
0.013869144022464752,
-0.09718367457389832,
-0.04226766154170036,
0.0537903793156147,
0.09060971438884735,
0.22175082564353943,
-0.07956906408071518,
0.11787772178649902,
0.03107531927525997,
0.036272212862968445,
0.0174979567527771,
0.1469700038433075,
-0.09269871562719345,
-0.005484065040946007,
0.006382405292242765,
0.019872397184371948,
-0.07640960067510605,
0.01885269582271576,
-0.07212959229946136,
0.05634433031082153,
-0.14678514003753662,
0.03514863923192024,
0.03741460666060448,
0.21644644439220428,
0.03263315185904503,
-0.2945314943790436,
-0.15247052907943726,
-0.05832843855023384,
-0.0879354402422905,
-0.07862640917301178,
0.061601944267749786,
0.15761692821979523,
-0.04861773923039436,
0.009497081860899925,
-0.04289136826992035,
0.16012077033519745,
-0.11806933581829071,
-0.0004754659894388169,
0.06140614300966263,
0.05858613923192024,
0.020358195528388023,
0.10552427917718887,
-0.22977115213871002,
0.12916439771652222,
-0.007636700756847858,
0.08814942091703415,
-0.028824802488088608,
-0.0018591862171888351,
-0.013576671481132507,
0.08177848905324936,
0.05179692059755325,
0.01579544134438038,
0.008745742961764336,
-0.18510912358760834,
-0.08131404966115952,
0.0221098680049181,
-0.01512176264077425,
0.021819831803441048,
0.08191145956516266,
-0.013762962073087692,
0.04609961435198784,
-0.01354901771992445,
-0.08525854349136353,
-0.054489534348249435,
-0.09426963329315186,
-0.06167835742235184,
0.05520985648036003,
-0.03381084278225899,
-0.016351308673620224,
-0.008202909491956234,
0.045319560915231705,
0.1938367336988449,
-0.06354283541440964,
-0.0995647981762886,
-0.1017225906252861,
0.05742829665541649,
0.06522934883832932,
-0.07355762273073196,
0.040163032710552216,
0.02410629577934742,
0.0046802242286503315,
-0.0006574422004632652,
-0.07703635096549988,
0.0657452940940857,
-0.06111545115709305,
0.004140689503401518,
-0.00997175183147192,
0.08572334796190262,
0.003238542703911662,
0.026159372180700302,
0.010404350236058235,
-0.09280548989772797,
-0.07187926024198532,
-0.11469270288944244,
-0.06408274918794632,
-0.09286624193191528,
0.085597462952137,
-0.035072389990091324,
-0.0732625275850296,
0.1548328399658203,
-0.00426744669675827,
-0.007583807222545147,
0.15492644906044006,
-0.008075443096458912,
-0.031666193157434464,
-0.04970037564635277,
0.10610967129468918,
-0.03434373065829277,
-0.2233240306377411,
-0.0253082774579525,
0.06528005003929138,
0.05162500962615013,
-0.11248913407325745,
-0.16576167941093445,
0.11419910937547684,
0.01807544007897377,
0.030983900651335716,
0.04205377399921417,
-0.2976166009902954,
-0.10094250738620758,
0.058763936161994934,
0.09100794792175293,
0.21762551367282867,
-0.11165347695350647,
-0.018391259014606476,
-0.04158458113670349,
-0.07881413400173187,
0.09088724851608276,
-0.05319011211395264,
0.1376808136701584,
-0.053709279745817184,
0.07685772329568863,
0.011904812417924404,
-0.04717370495200157,
0.032038021832704544,
0.04084845632314682,
0.07460805773735046,
-0.04522407054901123,
0.02640567347407341,
0.04375224933028221,
-0.08886019885540009,
0.20452779531478882,
-0.12994077801704407,
0.05228186398744583,
-0.16175447404384613,
-0.07961610704660416,
-0.035451311618089676,
0.0138141680508852,
0.038852494210004807,
-0.033795662224292755,
-0.0660194456577301,
0.0044054994359612465,
0.027614563703536987,
-0.004492470994591713,
0.05418112874031067,
0.02254081703722477,
-0.03332263603806496,
0.07067063450813293,
0.0789734423160553,
-0.10493919998407364,
-0.20304231345653534,
0.02228446491062641,
0.01570124365389347,
0.12935058772563934,
-0.23583723604679108,
0.014220974408090115,
0.11097463220357895,
-0.014806410297751427,
0.08816047757863998,
0.048244908452034,
-0.020161142572760582,
0.020756056532263756,
0.06263000518083572,
-0.0838891863822937,
-0.06637270003557205,
-0.02679222822189331,
-0.03686922788619995,
-0.07554823160171509,
0.06394712626934052,
0.08105799555778503,
-0.10660108178853989,
0.012125366367399693,
-0.015229524113237858,
-0.037266574800014496,
-0.10801097005605698,
0.2051178514957428,
0.045207079499959946,
0.06669426709413528,
-0.06206975504755974,
0.10238360613584518,
0.07893754541873932,
-0.09477721154689789,
0.023661429062485695,
0.17844045162200928,
-0.11651287227869034,
-0.07161416858434677,
0.08195420354604721,
0.18761195242404938,
-0.0403532050549984,
-0.10750491917133331,
-0.14349547028541565,
-0.09626922756433487,
0.04539673775434494,
0.04503003507852554,
0.06748835742473602,
0.04098424315452576,
-0.05646820366382599,
0.005815967917442322,
-0.14581404626369476,
0.03811819478869438,
0.0749216079711914,
0.045944083482027054,
-0.14432735741138458,
0.1334376037120819,
0.06718908995389938,
0.12145493179559708,
-0.02756333351135254,
0.014226987957954407,
-0.1093076765537262,
0.06282661855220795,
-0.036982983350753784,
0.03995257988572121,
-0.023063959553837776,
0.021314406767487526,
-0.04081765562295914,
0.003895974950864911,
-0.045148804783821106,
0.06466886401176453,
-0.032660987228155136,
-0.016582071781158447,
-0.0011098150862380862,
0.028930243104696274,
-0.022093435749411583,
-0.030247915536165237,
-0.01560728158801794,
-0.041368670761585236,
0.07124140858650208,
-0.008653501980006695,
-0.06303107738494873,
-0.0014434506883844733,
-0.04125494509935379,
-0.00012811266060452908,
0.0634712427854538,
0.04421723261475563,
0.02370048128068447,
0.009543427266180515,
0.043474338948726654,
0.03135964274406433,
0.029073629528284073,
-0.01596895605325699,
0.10764149576425552,
-0.10894062370061874,
-0.06714917719364166,
-0.09863541275262833,
-0.05339645594358444,
-0.05738309398293495,
0.029722584411501884,
0.105488620698452,
0.1132691279053688,
0.13689307868480682,
-0.08171094208955765,
0.020319757983088493,
-0.15662062168121338,
-0.01655924879014492,
0.06494379788637161,
-0.035236213356256485,
-0.0725012794137001,
-0.10075776278972626,
0.059370916336774826,
-0.016689864918589592,
0.13178245723247528,
0.008458451367914677,
0.04592423886060715,
-0.007840348407626152,
0.03141366317868233,
-0.0458843857049942,
-0.02755345031619072,
0.16008241474628448,
-0.0799206793308258,
-0.006848083809018135,
0.010281408205628395,
0.0824580043554306,
0.10526946932077408,
0.13702824711799622,
0.1428001970052719,
0.12111315876245499,
0.028482886031270027,
0.1021726131439209,
-0.03667300567030907,
0.02385975606739521,
-0.15760193765163422,
0.05256925895810127,
-0.034499641507864,
0.043603021651506424,
-0.04725930094718933,
0.14525948464870453,
0.10871416330337524,
-0.06397125124931335,
0.07742799073457718,
0.016958454623818398,
-0.08731485158205032,
-0.020525295287370682,
-0.048664432018995285,
-0.06102763116359711,
-0.1533392369747162,
-0.01302307564765215,
-0.06016520410776138,
-0.09123275429010391,
0.12805737555027008,
0.01845964789390564,
-0.03218584135174751,
0.2301018387079239,
0.01747249998152256,
-0.033523187041282654,
0.04086264222860336,
-0.013657326810061932,
0.020596669986844063,
0.002411856781691313,
-0.004231415223330259,
0.026799947023391724,
-0.03078947402536869,
0.0850997269153595,
0.004959063604474068,
-0.028926922008395195,
0.04687022790312767,
0.026041090488433838,
-0.038777776062488556,
-0.05273662507534027,
0.030548961833119392,
0.06756506860256195,
0.08612740784883499,
0.013491727411746979,
-0.0476154200732708,
-0.043234650045633316,
0.16109490394592285,
-0.05688954517245293,
-0.07356930524110794,
-0.10299321264028549,
0.1224735900759697,
0.06913237273693085,
-0.024590320885181427,
0.004993898794054985,
-0.055553458631038666,
-0.045138098299503326,
0.2691216766834259,
0.11051976680755615,
-0.08454547822475433,
-0.03390247002243996,
-0.02071831189095974,
-0.009581136517226696,
-0.012301795184612274,
0.17856992781162262,
0.07692354917526245,
0.1561015397310257,
-0.024396738037467003,
-0.07486860454082489,
-0.07325766235589981,
-0.019327739253640175,
-0.05209020525217056,
0.10655903816223145,
0.026282260194420815,
0.045792754739522934,
-0.09351005405187607,
0.0437832735478878,
-0.019111203029751778,
-0.07591714709997177,
0.115084707736969,
-0.10050573199987411,
-0.0771922916173935,
-0.015790438279509544,
0.02877281978726387,
-0.003575512208044529,
0.07616806030273438,
-0.02799135632812977,
0.07561878114938736,
0.01571446843445301,
-0.021805094555020332,
-0.12669765949249268,
-0.09801100194454193,
0.0431395098567009,
0.032685816287994385,
0.15830855071544647,
-0.002803412266075611,
0.07900075614452362,
0.066938616335392,
0.03804513439536095,
-0.08826328814029694,
0.08965302258729935,
-0.020599128678441048,
0.02395733818411827,
0.0621664933860302,
0.00011612381058512256,
-0.06467054039239883,
0.08298588544130325,
-0.01766687072813511,
-0.10757976025342941,
-0.05489521473646164,
-0.04988769069314003,
-0.002801449503749609,
-0.13060346245765686,
-0.021495787426829338,
-0.03981451690196991,
0.12149480730295181,
0.16838453710079193,
-0.030942942947149277,
-0.03414752706885338,
-0.06440457701683044,
0.05064002424478531,
0.017493836581707,
0.0015767726581543684,
-0.053341057151556015,
-0.1608860194683075,
-0.023173639550805092,
-0.05709458142518997,
-0.0012413911754265428,
-0.2164558619260788,
-0.03334217146039009,
-0.07840432971715927,
-0.024960827082395554,
-0.08118420839309692,
0.09145193547010422,
0.0659094899892807,
0.040662068873643875,
-0.05402453616261482,
-0.0538446307182312,
-0.02498915046453476,
0.09354022890329361,
-0.14928339421749115,
-0.1427711695432663
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/php/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
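A minimal sketch of that optimizer setup with the `transformers` library is shown below; the flag choices are assumptions meant to approximate the described schedule, not the authors' exact configuration.
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor

model = AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask_finetune")

# Adafactor with its built-in relative-step schedule, which decays with the
# inverse square root of the step count; lr=None lets the optimizer derive
# the step size. These settings are illustrative assumptions.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,
)
```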
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
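Expressed as `transformers` training arguments, that fine-tuning budget might look like the sketch below; the output directory and the per-device/accumulation split of the 256-example batch are illustrative assumptions.
```python
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="codetrans-php-finetune",   # assumed name, for illustration
    max_steps=10_000,                      # 10,000 fine-tuning steps in total
    per_device_train_batch_size=32,
    gradient_accumulation_steps=8,         # 32 * 8 = 256 effective batch size
)
```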
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_php_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
77
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11851944029331207,
0.03250424936413765,
-0.0009395870729349554,
0.1097380518913269,
0.08464760333299637,
0.01717735454440117,
0.037275269627571106,
0.10024672001600266,
-0.06348402798175812,
0.054472312331199646,
0.09508825093507767,
-0.057934124022722244,
0.025419114157557487,
0.14209416508674622,
0.03582373261451721,
-0.19846941530704498,
-0.022037118673324585,
0.07016713917255402,
-0.1337752640247345,
0.11515434831380844,
0.07249825447797775,
-0.10726170241832733,
0.07300852239131927,
-0.025799162685871124,
-0.10247106105089188,
0.034531693905591965,
-0.03317743539810181,
-0.01592269167304039,
0.08993234485387802,
0.02722013369202614,
0.10153341293334961,
-0.019528215751051903,
0.0513010248541832,
-0.11769044399261475,
0.00683767581358552,
0.0630299523472786,
0.055323004722595215,
0.05579005554318428,
0.0725044533610344,
0.1188599243760109,
0.042588185518980026,
-0.04107491299510002,
0.027490518987178802,
0.06574998795986176,
-0.05296562239527702,
-0.04545789211988449,
-0.06253091245889664,
0.08624807000160217,
0.10059116780757904,
0.11068814247846603,
0.007073758170008659,
-0.003140533110126853,
-0.08568552881479263,
0.049275416880846024,
0.10245896130800247,
-0.22247187793254852,
-0.03939888998866081,
0.05566982179880142,
0.06515096873044968,
0.010518700815737247,
-0.07132086902856827,
-0.035546742379665375,
0.06420611590147018,
0.0490039624273777,
0.09637466818094254,
-0.09375385195016861,
0.01541479304432869,
-0.0334932878613472,
-0.08869759738445282,
-0.03801698610186577,
0.19082918763160706,
0.08130084723234177,
-0.029555436223745346,
-0.09576603025197983,
-0.09507207572460175,
-0.16678667068481445,
0.009662962518632412,
-0.00510698975995183,
0.021690385416150093,
-0.0014132233336567879,
-0.02039463445544243,
-0.048948537558317184,
-0.10084140300750732,
-0.12054784595966339,
-0.00047263785381801426,
-0.013759694062173367,
0.08209795504808426,
0.03562728688120842,
0.0028175883926451206,
0.0941222757101059,
-0.003258292330428958,
-0.018151137977838516,
0.022123733535408974,
0.02218816988170147,
-0.0834800973534584,
0.0215158648788929,
-0.01673014461994171,
-0.16188712418079376,
0.013582425191998482,
0.07046481966972351,
-0.09543295204639435,
0.07110260426998138,
0.17992399632930756,
0.01785680651664734,
-0.05242734029889107,
0.1804208606481552,
-0.011948026716709137,
-0.08474894613027573,
0.0060026636347174644,
0.0221277866512537,
-0.03100130334496498,
0.03788839280605316,
-0.06589081883430481,
-0.04788685590028763,
0.028967246413230896,
0.020327411592006683,
-0.0747625008225441,
0.05107206478714943,
-0.00046204793034121394,
-0.025877434760332108,
0.11889515072107315,
-0.09016433358192444,
0.02823222428560257,
0.011016062460839748,
-0.04749339818954468,
0.011250874027609825,
0.04904436692595482,
-0.12639091908931732,
-0.1628837287425995,
0.030329091474413872,
-0.048061322420835495,
-0.024457812309265137,
-0.10709512233734131,
-0.09632816165685654,
0.016406327486038208,
-0.05512010678648949,
-0.02302168495953083,
-0.1288520246744156,
-0.09081510454416275,
-0.04670585319399834,
0.05490286275744438,
0.0012709014117717743,
-0.04195554181933403,
-0.024383584037423134,
0.025493936613202095,
0.0018663728842511773,
-0.019435664638876915,
0.0823827013373375,
-0.041306909173727036,
0.07638145983219147,
0.02816890925168991,
0.04686873033642769,
-0.0366530604660511,
0.03997369483113289,
-0.07388109713792801,
0.06449992954730988,
-0.08610941469669342,
0.04266100376844406,
0.05778317153453827,
0.06234077736735344,
-0.12964673340320587,
-0.10815108567476273,
-0.06717703491449356,
0.019929850473999977,
0.06001007556915283,
0.06718066334724426,
-0.06230178847908974,
0.032007843255996704,
0.15531025826931,
-0.061144404113292694,
-0.11174886673688889,
0.12460485845804214,
0.016514064744114876,
0.04447205364704132,
0.055778976529836655,
0.13597743213176727,
0.15252724289894104,
-0.08597932755947113,
0.014569948427379131,
0.09939683973789215,
0.011731485836207867,
-0.14805497229099274,
0.0744984894990921,
-0.022460462525486946,
0.003575363662093878,
0.018527258187532425,
0.04742981120944023,
0.05584501847624779,
-0.011364313773810863,
-0.036359526216983795,
-0.034221649169921875,
-0.08524874597787857,
-0.022734960541129112,
0.007250894792377949,
0.04133529216051102,
-0.04527086392045021,
-0.07101713865995407,
0.026193538680672646,
0.12119826674461365,
-0.09898794442415237,
0.04421280696988106,
-0.03581630438566208,
-0.03418394923210144,
-0.1000429168343544,
0.027602603659033775,
-0.1331041306257248,
0.028056345880031586,
0.024454109370708466,
-0.020200837403535843,
0.02880041114985943,
0.09532583504915237,
0.04076656699180603,
-0.0022321681026369333,
-0.06607739627361298,
-0.04381958767771721,
-0.015324029140174389,
-0.0449487566947937,
-0.1352207213640213,
0.010112835094332695,
-0.09438908100128174,
0.005660626105964184,
-0.01646200194954872,
-0.12510965764522552,
0.030151283368468285,
-0.010114239528775215,
0.0004416552546899766,
0.007285549305379391,
-0.025320274755358696,
0.03445975482463837,
0.048585373908281326,
0.0003032229724340141,
-0.0832456722855568,
0.07261307537555695,
0.06600194424390793,
-0.044694408774375916,
-0.041848890483379364,
-0.06506086140871048,
-0.01927841082215309,
0.07494326680898666,
-0.003449734067544341,
-0.1119629368185997,
-0.025382675230503082,
-0.04298970475792885,
-0.05058707669377327,
-0.04119277372956276,
-0.058188989758491516,
0.1521495133638382,
0.01688353531062603,
0.17749342322349548,
-0.1136191189289093,
-0.051561594009399414,
0.014085812494158745,
0.003497908590361476,
0.06850060075521469,
0.146528422832489,
0.05299905687570572,
-0.05659431591629982,
0.030139142647385597,
0.025783106684684753,
-0.06069999933242798,
0.0761554017663002,
-0.037143681198358536,
-0.09411092847585678,
0.02106644958257675,
0.09053107351064682,
-0.0011237914441153407,
0.11114665865898132,
-0.13265667855739594,
-0.011967750266194344,
-0.004296683706343174,
0.04338996112346649,
0.025035614147782326,
-0.1600702702999115,
0.056539759039878845,
0.041900984942913055,
-0.0519748255610466,
-0.003986652009189129,
-0.03561975434422493,
-0.055363450199365616,
0.03762785345315933,
0.07277446985244751,
0.02142680622637272,
0.016008155420422554,
0.007076911628246307,
-0.08969507366418839,
0.1900310516357422,
-0.029775500297546387,
-0.2035881131887436,
-0.12273494899272919,
0.07488705962896347,
-0.022573325783014297,
-0.020026857033371925,
0.04171742498874664,
-0.101132832467556,
-0.03123496100306511,
-0.08496597409248352,
0.06525630503892899,
-0.12693136930465698,
0.05243859067559242,
-0.06384623050689697,
0.056332752108573914,
0.09658317267894745,
-0.10874027013778687,
0.022221513092517853,
-0.0221731998026371,
-0.008416871540248394,
-0.022796539589762688,
-0.02382495440542698,
0.09580356627702713,
0.15431830286979675,
-0.0736243948340416,
0.03844878822565079,
-0.013555128127336502,
0.12328295409679413,
-0.04740646854043007,
0.06774508208036423,
0.21747930347919464,
0.06411853432655334,
0.03665037453174591,
0.0362694077193737,
0.03906724974513054,
-0.04610751196742058,
0.041230302304029465,
0.06279665976762772,
-0.05079338699579239,
-0.18430779874324799,
-0.0411507822573185,
-0.07956164330244064,
0.043703120201826096,
0.17373672127723694,
0.07307016104459763,
-0.09713533520698547,
0.06881271302700043,
-0.011103612370789051,
0.1246940866112709,
-0.06573279201984406,
0.05128121003508568,
0.0842742845416069,
0.006883941125124693,
-0.0006271969759836793,
-0.10044770687818527,
-0.02661360800266266,
0.07548162341117859,
0.08995194733142853,
0.1733052134513855,
-0.09451977163553238,
0.16697509586811066,
0.032260991632938385,
0.11047802120447159,
0.00019590588635765016,
0.13506367802619934,
-0.08202413469552994,
0.006310116499662399,
-0.00122840388212353,
-0.013271808624267578,
-0.01691279001533985,
0.03479435667395592,
-0.034357305616140366,
0.03873131051659584,
-0.1257544904947281,
-0.02487623132765293,
0.03560773655772209,
0.2467018961906433,
0.10818977653980255,
-0.21174758672714233,
-0.13996604084968567,
-0.05646462365984917,
-0.10277854651212692,
-0.10631529986858368,
0.07740268111228943,
0.162210613489151,
-0.04452031850814819,
0.0013515122700482607,
-0.04852217063307762,
0.14284759759902954,
-0.10909737646579742,
-0.0026250556111335754,
0.1020982414484024,
0.07245956361293793,
0.015608959831297398,
0.11217895150184631,
-0.21058551967144012,
0.09295543283224106,
0.01614854484796524,
0.09106472134590149,
-0.04553642123937607,
0.028020864352583885,
-0.03921465575695038,
0.0030989095102995634,
0.08243057131767273,
0.013800336979329586,
0.04267485439777374,
-0.1491359919309616,
-0.07230445742607117,
0.017414921894669533,
0.03492261841893196,
0.013263815082609653,
0.08303232491016388,
-0.0375497043132782,
0.018881307914853096,
-0.02136216312646866,
-0.10376649349927902,
-0.04282325133681297,
-0.1235857903957367,
-0.044354040175676346,
0.039826758205890656,
-0.04097791388630867,
-0.029989754781126976,
0.028171401470899582,
0.047635894268751144,
0.21396033465862274,
-0.11538530886173248,
-0.08606147021055222,
-0.10484850406646729,
0.05066976323723793,
0.1203828901052475,
-0.07574434578418732,
0.05757742375135422,
-0.004470391198992729,
0.006372901611030102,
0.006970782298594713,
-0.05157081037759781,
0.05628043785691261,
-0.0514398030936718,
-0.06580967456102371,
-0.029442787170410156,
0.10677819699048996,
-0.04745802283287048,
0.03958558291196823,
-0.01861174777150154,
-0.08156470954418182,
-0.07434773445129395,
-0.11979161202907562,
-0.053985368460416794,
-0.07754658907651901,
0.06571420282125473,
-0.04641169309616089,
-0.06661587208509445,
0.16916778683662415,
0.025701558217406273,
-0.04203231260180473,
0.11275248974561691,
0.056797564029693604,
-0.06421540677547455,
-0.030358875170350075,
0.11656121164560318,
-0.018754465505480766,
-0.22473560273647308,
-0.05440441519021988,
0.03467298299074173,
0.025561392307281494,
-0.1063041090965271,
-0.12904952466487885,
0.08186458796262741,
0.039100877940654755,
0.01590574160218239,
0.03474867343902588,
-0.30199140310287476,
-0.11399125307798386,
-0.004311244003474712,
0.06064603477716446,
0.09448271989822388,
-0.11295628547668457,
-0.03297458216547966,
-0.026577385142445564,
-0.023428143933415413,
0.0412917397916317,
-0.004800030030310154,
0.1283155083656311,
-0.048265792429447174,
-0.006443941965699196,
0.006700835190713406,
-0.04982955753803253,
0.03967604041099548,
-0.010942698456346989,
0.0656324103474617,
-0.017053138464689255,
0.034392163157463074,
0.07483246177434921,
-0.08189181983470917,
0.17591813206672668,
-0.09840705245733261,
0.07490221410989761,
-0.13509434461593628,
-0.06748034805059433,
-0.04634469002485275,
-0.01461194921284914,
0.005908128339797258,
-0.05027181655168533,
-0.07766098529100418,
-0.004837924614548683,
0.05173645168542862,
-0.03264632076025009,
-0.002261525020003319,
0.0027329246513545513,
-0.0785626471042633,
0.12283365428447723,
0.05144980177283287,
-0.09395097196102142,
-0.25572699308395386,
0.027368834242224693,
0.00013540468353312463,
0.10292697697877884,
-0.2195882797241211,
0.01326676644384861,
0.09672818332910538,
0.02054000459611416,
0.05778228119015694,
0.03322182595729828,
-0.02198435179889202,
0.016537589952349663,
0.047098465263843536,
-0.07373987883329391,
-0.1339210718870163,
-0.029791079461574554,
-0.12170901894569397,
-0.14313749969005585,
0.06292368471622467,
0.06488288193941116,
-0.08226921409368515,
0.01012984849512577,
-0.011728496290743351,
-0.020560942590236664,
-0.08673536777496338,
0.2358958125114441,
0.04058597609400749,
0.06489542126655579,
-0.06539488583803177,
0.07496652752161026,
0.09039069712162018,
-0.1787320226430893,
0.0022509547416120768,
0.1710442155599594,
-0.10468553006649017,
-0.05342356860637665,
0.09307479858398438,
-0.0008694133721292019,
0.002963454695418477,
-0.07851876318454742,
-0.1257036328315735,
-0.07697232812643051,
0.06574581563472748,
-0.03379731625318527,
0.05745331197977066,
0.06370776146650314,
-0.038972996175289154,
0.012536979280412197,
-0.14390867948532104,
0.07586304098367691,
0.07735241204500198,
0.04588587209582329,
-0.15793220698833466,
0.15226496756076813,
0.037631805986166,
0.10290733724832535,
-0.004555936437100172,
0.03041692078113556,
-0.06867329031229019,
0.03769966959953308,
-0.0312669463455677,
0.004229967016726732,
-0.016803164035081863,
0.014857001602649689,
-0.04264253005385399,
0.049827028065919876,
-0.027689645066857338,
0.057666197419166565,
-0.021583007648587227,
-0.04864663630723953,
-0.026295838877558708,
0.0340627059340477,
-0.024206141009926796,
-0.0034618813078850508,
-0.02588491328060627,
-0.0540948212146759,
0.06540831923484802,
-0.052697207778692245,
-0.04989593103528023,
-0.04644688963890076,
0.011107618920505047,
0.02044755220413208,
0.03246234729886055,
0.04909579083323479,
-0.010776331648230553,
0.027616579085588455,
0.0392506942152977,
0.029119037091732025,
0.0033302854280918837,
-0.014617935754358768,
0.06048387661576271,
-0.13505880534648895,
-0.045245204120874405,
-0.12676174938678741,
-0.03379311040043831,
-0.06977253407239914,
0.037657711654901505,
0.07415604591369629,
0.0803600400686264,
0.10604655742645264,
-0.05989702045917511,
0.01929253153502941,
-0.18742097914218903,
-0.0210853461176157,
0.048904143273830414,
-0.022187961265444756,
-0.1051480770111084,
-0.06601357460021973,
0.07196241617202759,
-0.011651524342596531,
0.11473644524812698,
0.00020732887787744403,
0.057933565229177475,
0.006718036253005266,
0.004410843830555677,
-0.01878831349313259,
-0.013406675308942795,
0.18327069282531738,
-0.078319251537323,
-0.04910936579108238,
-0.018463311716914177,
0.06156134605407715,
0.06341297924518585,
0.22225502133369446,
0.0766918808221817,
0.14135943353176117,
0.059335459023714066,
0.0932704508304596,
-0.08729632943868637,
0.019723931327462196,
-0.1554471254348755,
0.1207021176815033,
-0.02635410614311695,
0.11252102255821228,
-0.07023902237415314,
0.10956822335720062,
0.08596436679363251,
-0.0950283631682396,
0.0720829963684082,
0.019733145833015442,
-0.08173299580812454,
-0.026574397459626198,
-0.10875426232814789,
-0.06426000595092773,
-0.13876719772815704,
-0.024778064340353012,
-0.05174003913998604,
-0.03572511300444603,
0.11766134202480316,
0.018338918685913086,
-0.008126676082611084,
0.2071259468793869,
0.006288188509643078,
-0.04881105199456215,
0.0671546682715416,
0.023131385445594788,
0.03743626922369003,
0.08980254828929901,
0.0006572950514964759,
0.06177861988544464,
-0.09426036477088928,
0.08617502450942993,
0.01405715849250555,
-0.014304587617516518,
0.03258153051137924,
0.03143852949142456,
-0.020821668207645416,
-0.053715839982032776,
0.029446396976709366,
0.0832442194223404,
0.16739919781684875,
0.019779333844780922,
-0.07187534123659134,
-0.04735260829329491,
0.16352787613868713,
-0.07695988565683365,
-0.06330065429210663,
-0.10436075925827026,
0.140884667634964,
0.06418544054031372,
-0.019219931215047836,
0.034170184284448624,
-0.0721442922949791,
-0.01510767824947834,
0.28492841124534607,
0.11483141779899597,
-0.023269278928637505,
-0.03780558332800865,
0.0368557907640934,
-0.022804712876677513,
-0.020652851089835167,
0.17471693456172943,
0.030216045677661896,
0.20044614374637604,
-0.015574230812489986,
-0.009813392534852028,
-0.02172202244400978,
-0.044957928359508514,
-0.08123503625392914,
0.1329919844865799,
0.014240382239222527,
0.05535818263888359,
-0.0695556104183197,
0.024430474266409874,
0.05637392774224281,
-0.11164304614067078,
0.1546943336725235,
-0.06465617567300797,
-0.07217199355363846,
0.013117486611008644,
0.015731824561953545,
0.0025340565480291843,
0.05979036167263985,
-0.0365002378821373,
0.09067615121603012,
0.04363678768277168,
-0.03532164916396141,
-0.11262154579162598,
-0.10670560598373413,
0.05704285576939583,
0.020332640036940575,
0.14879322052001953,
0.015995990484952927,
0.05317224562168121,
0.08536713570356369,
-0.006525489501655102,
-0.11848348379135132,
0.08262105286121368,
0.01763947680592537,
-0.017532916739583015,
0.07805490493774414,
0.028995851054787636,
-0.03806763142347336,
0.06544987112283707,
-0.00961984321475029,
-0.06465156376361847,
-0.031648989766836166,
-0.01851428672671318,
-0.004911978729069233,
-0.16938424110412598,
-0.002849395154044032,
-0.037042830139398575,
0.1261264532804489,
0.1919119656085968,
-0.048430636525154114,
-0.02134537883102894,
-0.06603491306304932,
0.027670664712786674,
0.012549876235425472,
0.01989065296947956,
-0.01380305364727974,
-0.12784068286418915,
0.0007208556635305285,
-0.0399540551006794,
0.004344910383224487,
-0.19877898693084717,
-0.053298063576221466,
-0.032009098678827286,
-0.03360095992684364,
-0.08199276030063629,
0.12175010144710541,
0.02877238765358925,
0.028384169563651085,
-0.037150267511606216,
-0.07916782051324844,
-0.042446888983249664,
0.06685074418783188,
-0.13994617760181427,
-0.11546579748392105
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation php
Pretrained model on programming language php using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized php code functions: it works best with tokenized php functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the php function/method.
## Intended uses & limitations
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_php_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/php/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
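One way to assemble such a php-only dataset is sketched below; the use of the public CodeSearchNet corpus and its column names is an assumption for illustration, since the card does not name the exact source.
```python
from datasets import load_dataset

# php configuration of CodeSearchNet (assumed corpus, for illustration).
php_data = load_dataset("code_search_net", "php", split="train")
print(php_data[0]["func_code_string"])
```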
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "public static function update ( $ table ) { if ( ! is_array ( $ table ) ) { $ table = json_decode ( $ table , true ) ; } if ( ! SchemaManager :: tableExists ( $ table [ 'oldName' ] ) ) { throw SchemaException :: tableDoesNotExist ( $ table [ 'oldName' ] ) ; } $ updater = new self ( $ table ) ; $ updater -> updateTable ( ) ; }"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_php_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation php
=====================================================
Pretrained model on programming language php using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized php code functions: it works best with tokenized php functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the php function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the php function or be fine-tuned on other php code tasks. It can be used on unparsed and untokenized php code. However, if the php code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate php function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing php code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate php function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing php code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.11301790922880173,
0.06788846850395203,
-0.0014134884113445878,
0.11529090255498886,
0.03464239835739136,
0.017707601189613342,
0.046797819435596466,
0.09154124557971954,
-0.046012889593839645,
0.063169926404953,
0.05441626161336899,
-0.06725513935089111,
0.05438385531306267,
0.17233912646770477,
0.024157380685210228,
-0.13672275841236115,
-0.030852891504764557,
0.051855772733688354,
-0.10209111869335175,
0.11280743777751923,
0.07573465257883072,
-0.09851640462875366,
0.0660719946026802,
-0.03790796920657158,
-0.09430806338787079,
0.04929796978831291,
-0.03251964971423149,
-0.013085816986858845,
0.09034258127212524,
0.059055447578430176,
0.11738509684801102,
-0.028633534908294678,
0.05987895280122757,
-0.19814762473106384,
0.005337951239198446,
0.026491761207580566,
0.056063130497932434,
0.03818271681666374,
0.06236081197857857,
0.08872272819280624,
0.05415361747145653,
-0.02293277718126774,
0.037730444222688675,
0.05610818788409233,
-0.05861988663673401,
-0.04356836527585983,
-0.06935171037912369,
0.08636412769556046,
0.07942923903465271,
0.10426618158817291,
-0.001970564015209675,
0.029989274218678474,
-0.09279093891382217,
0.06812260299921036,
0.12220298498868942,
-0.22471962869167328,
-0.019959190860390663,
0.08657620847225189,
0.08841445297002792,
0.034324098378419876,
-0.08472509682178497,
-0.04784705117344856,
0.09574336558580399,
0.03939587250351906,
0.044924963265657425,
-0.09333132952451706,
-0.0021228010300546885,
0.000247083225985989,
-0.053443752229213715,
-0.05403679981827736,
0.18722935020923615,
0.055958252400159836,
-0.044576529413461685,
-0.10709036141633987,
-0.04804851487278938,
-0.16574802994728088,
0.021689213812351227,
0.0050226482562720776,
0.019756922498345375,
-0.0018781002145260572,
-0.005844008177518845,
-0.01422454696148634,
-0.10005732625722885,
-0.11789033561944962,
0.024476036429405212,
0.00404390087351203,
0.0674438327550888,
0.023883843794465065,
-0.037233978509902954,
0.09399101883172989,
0.05037851259112358,
-0.03925127163529396,
-0.004305180162191391,
0.02336367405951023,
-0.09070606529712677,
0.02135162428021431,
-0.017549831420183182,
-0.08527977764606476,
0.015429857186973095,
0.09374719113111496,
-0.10603278875350952,
0.08236438035964966,
0.10889919102191925,
0.013949256390333176,
-0.011198100633919239,
0.2122025042772293,
0.04445047304034233,
-0.14167791604995728,
0.02396252378821373,
0.028887799009680748,
-0.013227077201008797,
0.03257951885461807,
-0.05851147696375847,
-0.05257995054125786,
0.027746664360165596,
0.0565393902361393,
-0.12068365514278412,
0.03520094230771065,
-0.0552034005522728,
-0.01579742133617401,
0.08894320577383041,
-0.1317606270313263,
0.03352704644203186,
0.02596360072493553,
-0.050471317023038864,
-0.039935339242219925,
0.08351486921310425,
-0.13702990114688873,
-0.14682213962078094,
0.018466517329216003,
-0.04046286642551422,
-0.040790360420942307,
-0.10847045481204987,
-0.0990225076675415,
0.0032682400196790695,
-0.007055423688143492,
-0.004479510709643364,
-0.09852250665426254,
-0.07564359903335571,
-0.029616331681609154,
0.04265781491994858,
0.005975921638309956,
-0.03173819184303284,
-0.0490083321928978,
0.01418438833206892,
-0.007886148057878017,
-0.024109095335006714,
0.025563782081007957,
-0.03705444931983948,
0.08768615871667862,
0.0658201277256012,
0.03266981616616249,
-0.016194911673665047,
0.028416449204087257,
-0.07488463819026947,
0.0800059586763382,
-0.09990493953227997,
0.06300049275159836,
0.002615120727568865,
0.050488173961639404,
-0.11371598392724991,
-0.07791410386562347,
-0.007504363544285297,
0.04275371879339218,
0.08127608895301819,
0.0567336343228817,
-0.11147385090589523,
0.028433071449398994,
0.1666278839111328,
-0.09118268638849258,
-0.13953141868114471,
0.0996527150273323,
-0.01138085126876831,
0.040223609656095505,
0.06722202152013779,
0.13834406435489655,
0.141444593667984,
-0.09163004159927368,
-0.018855886533856392,
0.07749169319868088,
0.060907986015081406,
-0.077725350856781,
0.06536228209733963,
-0.006329759024083614,
-0.00046189522254280746,
0.02806280553340912,
0.05518864467740059,
0.043268490582704544,
0.007075627334415913,
-0.04163254424929619,
-0.028543516993522644,
-0.09271377325057983,
-0.037414904683828354,
-0.0047735958360135555,
0.027138618752360344,
-0.04995056986808777,
-0.06013926863670349,
0.0298030786216259,
0.15708626806735992,
-0.10658826678991318,
0.04451538249850273,
-0.07826484739780426,
-0.028149591758847237,
-0.0777047649025917,
0.024800539016723633,
-0.12836086750030518,
0.0257243774831295,
0.05123429745435715,
-0.027924317866563797,
0.06755116581916809,
0.0766746997833252,
0.02197588048875332,
-0.0013150143204256892,
-0.05752477049827576,
-0.041748542338609695,
-0.03598432615399361,
-0.06489913910627365,
-0.12063927203416824,
-0.027099980041384697,
-0.07752053439617157,
-0.0147897033020854,
-0.031336892396211624,
-0.167960524559021,
-0.00453011691570282,
-0.00394959794357419,
0.020292021334171295,
0.009172367863357067,
-0.030651651322841644,
0.019590385258197784,
0.03709835186600685,
-0.031604088842868805,
-0.0704537183046341,
0.027665890753269196,
0.049037862569093704,
-0.0848977342247963,
-0.037815701216459274,
-0.08201262354850769,
-0.07021239399909973,
0.08712130784988403,
0.07656989246606827,
-0.11138182878494263,
-0.043628256767988205,
-0.032180994749069214,
-0.043623026460409164,
-0.025732681155204773,
-0.05790406093001366,
0.1764535903930664,
0.014285067096352577,
0.17195840179920197,
-0.1534241884946823,
-0.07717961072921753,
-0.024621540680527687,
0.011056280694901943,
0.04289238527417183,
0.15835413336753845,
0.00612248107790947,
-0.07040765881538391,
0.032905980944633484,
-0.004355101380497217,
-0.04335590451955795,
0.12642617523670197,
-0.03728288784623146,
-0.07551469653844833,
0.007413611747324467,
0.1158352866768837,
0.0013790944358333945,
0.18150296807289124,
-0.0801074430346489,
0.004779428709298372,
-0.007594102527946234,
0.017896393314003944,
0.0391375869512558,
-0.13728713989257812,
0.040022969245910645,
0.033720824867486954,
-0.07237538695335388,
-0.01747909188270569,
-0.034721311181783676,
-0.03550005704164505,
0.04365149512887001,
0.033018238842487335,
0.036991868168115616,
-0.011311633512377739,
-0.030580634251236916,
-0.10939840227365494,
0.1882108449935913,
-0.07354555279016495,
-0.202956885099411,
-0.15871255099773407,
0.09779376536607742,
-0.03198102116584778,
-0.019362028688192368,
0.02904156595468521,
-0.1111956238746643,
-0.056518491357564926,
-0.09940280765295029,
0.09670879691839218,
-0.10638631135225296,
0.002585009438917041,
-0.036078039556741714,
0.06236160546541214,
0.07463056594133377,
-0.16866618394851685,
0.02784135192632675,
-0.020184289664030075,
0.011580712161958218,
-0.03085271269083023,
-0.05713389441370964,
0.0800955519080162,
0.12704361975193024,
-0.06562823802232742,
0.029258187860250473,
-0.0005538361729122698,
0.16790801286697388,
-0.057452164590358734,
0.042431529611349106,
0.17308342456817627,
0.009093879722058773,
0.03287481144070625,
0.04477496072649956,
0.014496715739369392,
-0.08289904147386551,
0.055977847427129745,
0.05904890224337578,
-0.023134782910346985,
-0.23253275454044342,
-0.03372173756361008,
-0.07431982457637787,
0.036777228116989136,
0.12734058499336243,
0.06156161054968834,
-0.1517045497894287,
0.0353236049413681,
-0.01066640205681324,
0.1478164941072464,
-0.02817818708717823,
0.05407634377479553,
0.04121801257133484,
0.010452428832650185,
-0.011882264167070389,
-0.10723096132278442,
-0.012688578106462955,
0.0779680535197258,
0.11937715858221054,
0.18422259390354156,
-0.09346737712621689,
0.16532576084136963,
0.02165624126791954,
0.11067164689302444,
0.013456338085234165,
0.10815607756376266,
-0.12119491398334503,
0.01957947574555874,
0.005365083459764719,
-0.01854577660560608,
-0.050141457468271255,
0.04278254508972168,
-0.043963972479104996,
0.06716585904359818,
-0.08454425632953644,
-0.020043913275003433,
0.02894451469182968,
0.1965014785528183,
0.08388374000787735,
-0.1663128137588501,
-0.12374459952116013,
-0.007971140556037426,
-0.10308454185724258,
-0.10975712537765503,
0.07498379796743393,
0.1931724101305008,
-0.05190956965088844,
0.013387970626354218,
-0.01216057687997818,
0.14090365171432495,
-0.0930977389216423,
-0.021853547543287277,
0.04007252678275108,
0.05088119953870773,
0.008395734243094921,
0.12173379212617874,
-0.24430568516254425,
0.08142418414354324,
0.020435690879821777,
0.09181680530309677,
-0.030001778155565262,
0.05005151405930519,
-0.049757275730371475,
-0.017672305926680565,
0.08091405779123306,
0.010668355040252209,
0.010512921027839184,
-0.17583191394805908,
-0.04577123373746872,
0.03059123456478119,
0.054673947393894196,
0.0053529199212789536,
0.08757255226373672,
-0.025716468691825867,
0.043782368302345276,
-0.021756626665592194,
-0.11909136921167374,
-0.06319039314985275,
-0.14081299304962158,
-0.048683974891901016,
-0.007004890125244856,
-0.049869928508996964,
-0.02226192317903042,
0.050165046006441116,
0.01908271759748459,
0.22471696138381958,
-0.15508753061294556,
-0.07664233446121216,
-0.09014775604009628,
0.05015534535050392,
0.1370786875486374,
-0.09116831421852112,
0.033418208360672,
0.003993472550064325,
0.034259118139743805,
-0.030327489599585533,
-0.07028581202030182,
0.02622046507894993,
-0.045042652636766434,
-0.0816662386059761,
-0.031874705106019974,
0.11559160053730011,
-0.010574422776699066,
0.0444762147963047,
0.015378525480628014,
-0.09801562130451202,
-0.047112882137298584,
-0.12352679669857025,
-0.09079419076442719,
-0.049227140843868256,
0.07211444526910782,
-0.027437424287199974,
-0.12263408303260803,
0.10391189903020859,
-0.004864915274083614,
-0.0758192166686058,
0.08682958781719208,
0.14901261031627655,
-0.06634888797998428,
0.01771402545273304,
0.09661021828651428,
-0.056326139718294144,
-0.19837068021297455,
-0.048021622002124786,
0.03467065468430519,
0.05502639338374138,
-0.04741861671209335,
-0.12942816317081451,
0.08080801367759705,
-0.012904684990644455,
0.019029628485441208,
0.004518160596489906,
-0.25575292110443115,
-0.13326554000377655,
0.017199071124196053,
0.06472741812467575,
0.037611931562423706,
-0.11132926493883133,
-0.04682988300919533,
-0.06521874666213989,
-0.04664924740791321,
0.0577559769153595,
0.05244073271751404,
0.10672246664762497,
-0.03298225998878479,
0.027629569172859192,
0.036217667162418365,
-0.030870651826262474,
0.0537990964949131,
-0.010797228664159775,
0.09636451303958893,
-0.024584079161286354,
0.011431897059082985,
0.06465042382478714,
-0.06488748639822006,
0.17742693424224854,
-0.15158000588417053,
0.10335277020931244,
-0.18784281611442566,
-0.04833516478538513,
-0.027249079197645187,
-0.01533490139991045,
-0.0349699929356575,
-0.03678201511502266,
-0.12347881495952606,
0.03112480603158474,
0.06483808904886246,
-0.023576661944389343,
0.020409569144248962,
-0.008419018238782883,
-0.07336503267288208,
0.06518080085515976,
0.09262136369943619,
-0.01453064288944006,
-0.1612756848335266,
0.03844507038593292,
0.016222674399614334,
0.09260332584381104,
-0.1928485631942749,
0.03549398109316826,
0.105487160384655,
0.013829999603331089,
0.08472185581922531,
0.017337916418910027,
-0.08500245213508606,
0.011877286247909069,
0.0660594180226326,
-0.06842779368162155,
-0.09651070833206177,
-0.013248205184936523,
-0.05973324179649353,
-0.1126323863863945,
0.04195993021130562,
0.07319263368844986,
-0.04566263034939766,
-0.007032059598714113,
-0.004374193958938122,
-0.000670885550789535,
-0.0858968198299408,
0.19657230377197266,
0.0384247787296772,
0.06688657402992249,
-0.05993412807583809,
0.07836143672466278,
0.09353066980838776,
-0.12480761110782623,
0.035704806447029114,
0.15102817118167877,
-0.08227255195379257,
-0.02242283523082733,
0.12543325126171112,
0.10773394256830215,
-0.010534738190472126,
-0.06556900590658188,
-0.10542312264442444,
-0.07565345615148544,
0.03219052404165268,
0.037839826196432114,
0.06805811077356339,
0.08105756342411041,
-0.00969483982771635,
0.001355519751086831,
-0.1162494346499443,
0.100446917116642,
0.09316420555114746,
0.03626200556755066,
-0.12344829738140106,
0.13548322021961212,
0.027715526521205902,
0.08644349873065948,
0.003494134871289134,
0.030366158112883568,
-0.11850720643997192,
0.04153905436396599,
-0.04753449559211731,
0.048654958605766296,
-0.0016012744745239615,
0.04466131329536438,
-0.0425768606364727,
0.04841497913002968,
-0.026280952617526054,
0.04901435971260071,
-0.03490148112177849,
-0.026184258982539177,
-0.028019404038786888,
0.035132311284542084,
-0.06286568939685822,
-0.009738619439303875,
0.0035172621719539165,
-0.0771985575556755,
0.09017699956893921,
-0.06251661479473114,
-0.025306235998868942,
-0.0047577968798577785,
0.0013975136680528522,
0.042854100465774536,
0.002805663738399744,
0.0604732520878315,
-0.014987500384449959,
0.0184160303324461,
0.033251043409109116,
0.0334283784031868,
-0.00640952680259943,
-0.006832081358879805,
0.10302907228469849,
-0.13540756702423096,
-0.08138919621706009,
-0.1148064136505127,
-0.07314462214708328,
-0.061126865446567535,
0.07476974278688431,
0.07489794492721558,
0.07377789169549942,
0.08996615558862686,
-0.04441411420702934,
0.003686519805341959,
-0.173368901014328,
-0.04154002666473389,
0.05832749232649803,
0.0007452427526004612,
-0.11048052459955215,
-0.05113871023058891,
0.061691660434007645,
-0.027183491736650467,
0.1308717280626297,
-0.012407105416059494,
0.042354024946689606,
-0.011682574637234211,
-0.0367744006216526,
-0.03913019970059395,
0.0010509949643164873,
0.1819724589586258,
-0.10907278209924698,
-0.0013650133041664958,
-0.020198160782456398,
0.012502598576247692,
0.036440830677747726,
0.1825685203075409,
0.09297308325767517,
0.15227766335010529,
0.02878456749022007,
0.09075683355331421,
-0.05800652503967285,
-0.013946580700576305,
-0.10269013792276382,
0.08717575669288635,
-0.022197311744093895,
0.046432387083768845,
-0.04800902679562569,
0.150476336479187,
0.07637257874011993,
-0.1359056681394577,
0.10603697597980499,
0.004529389552772045,
-0.09786826372146606,
-0.034588173031806946,
-0.09940514713525772,
-0.05186351761221886,
-0.11176448315382004,
0.002453995868563652,
-0.09756475687026978,
-0.012648310512304306,
0.06914029270410538,
0.03358888626098633,
-0.02856658585369587,
0.16928131878376007,
-0.028612112626433372,
-0.060194723308086395,
0.03636869788169861,
0.04391630366444588,
0.03405015543103218,
0.10252641141414642,
0.021330947056412697,
0.06743304431438446,
-0.06965707987546921,
0.07289943099021912,
0.04068121314048767,
0.0036763164680451155,
0.013097196817398071,
0.03187916800379753,
0.0024845520965754986,
-0.050422102212905884,
0.0011151960352435708,
0.09492803364992142,
0.1823454201221466,
0.03877507150173187,
-0.050975997000932693,
-0.052925530821084976,
0.15230901539325714,
-0.04823397472500801,
-0.07508444786071777,
-0.12310875207185745,
0.1403563916683197,
0.041730448603630066,
0.004580499138683081,
0.02385028637945652,
-0.08225209265947342,
-0.009890264831483364,
0.28462809324264526,
0.05981064960360527,
-0.05572570115327835,
-0.02585579827427864,
0.016629057005047798,
-0.0065279677510261536,
-0.03153327852487564,
0.1598530113697052,
0.005632583983242512,
0.23854172229766846,
0.012706550769507885,
-0.024832293391227722,
-0.039265964180231094,
-0.0423143208026886,
-0.019392816349864006,
0.20442084968090057,
-0.03090599924325943,
0.02484256401658058,
-0.09259001910686493,
-0.00970615167170763,
0.02774161472916603,
-0.13599219918251038,
0.13055028021335602,
-0.11447394639253616,
-0.06764838099479675,
0.012060649693012238,
0.043467432260513306,
-0.041829872876405716,
0.03165186941623688,
-0.020135022699832916,
0.07520594447851181,
0.03921232372522354,
-0.028425391763448715,
-0.09964099526405334,
-0.12936221063137054,
0.05793878808617592,
-0.011857489123940468,
0.16010692715644836,
0.026953712105751038,
0.09387887269258499,
0.080347940325737,
0.010358432307839394,
-0.08195027709007263,
0.07477279007434845,
0.038584958761930466,
-0.008109803311526775,
0.062203168869018555,
0.07921179383993149,
-0.03818535804748535,
0.11818376183509827,
0.00497055659070611,
0.0021163492929190397,
-0.012037922628223896,
-0.017733527347445488,
-0.021686168387532234,
-0.17559914290905,
0.0020150726195424795,
-0.07646133005619049,
0.13645361363887787,
0.1822652965784073,
-0.05090190842747688,
-0.01046181470155716,
-0.05635234713554382,
0.07442542165517807,
-0.016003940254449844,
0.05648098513484001,
-0.001090899226255715,
-0.1511652171611786,
0.014479042962193489,
-0.040638506412506104,
0.007387127727270126,
-0.2083866447210312,
-0.04519249498844147,
-0.026783112436532974,
-0.03070659376680851,
-0.08950620889663696,
0.1543653905391693,
0.057740718126297,
0.03276262432336807,
-0.04064972698688507,
-0.11174166947603226,
-0.022580012679100037,
0.04826343059539795,
-0.12131694704294205,
-0.11019144207239151
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus python dataset.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python", skip_special_tokens=True),
device=0  # GPU index; use device=-1 to run on CPU
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/python/small_model.ipynb).
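As noted above, the model works best on pre-tokenized input. Below is a minimal sketch of one way to space-tokenize raw Python source with the standard `tokenize` module before calling the pipeline; this is an illustration, not the authors' exact preprocessing, which is unspecified here:
```python
import io
import tokenize

def space_tokenize(source: str) -> str:
    # Join the lexical tokens of a Python snippet with single spaces,
    # dropping pure layout tokens (newlines, indents, end marker).
    layout = {tokenize.NEWLINE, tokenize.NL, tokenize.INDENT, tokenize.DEDENT, tokenize.ENDMARKER}
    tokens = tokenize.generate_tokens(io.StringIO(source).readline)
    return " ".join(tok.string for tok in tokens if tok.type not in layout and tok.string.strip())

raw_code = "def e(message, exit_code=None):\n    print_log(message, YELLOW, BOLD)\n    if exit_code is not None:\n        sys.exit(exit_code)\n"
pipeline([space_tokenize(raw_code)])
```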
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_python
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on CodeSearchNet Corpus python dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
112
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09766249358654022,
0.02147459238767624,
-0.0006262424285523593,
0.06462004035711288,
0.15445031225681305,
0.020631251856684685,
0.07963903248310089,
0.06600847095251083,
-0.0022049208637326956,
-0.04733183607459068,
0.09256942570209503,
0.1683204621076584,
0.028493216261267662,
0.1263503134250641,
-0.015838248655200005,
-0.22370755672454834,
-0.010424407199025154,
0.06158769875764847,
-0.1564856618642807,
0.13806603848934174,
0.11096695065498352,
-0.03235277533531189,
0.09337104856967926,
0.005055610090494156,
-0.20754197239875793,
0.05446014180779457,
-0.0013083922676742077,
-0.07448302209377289,
0.13405457139015198,
0.0679069384932518,
0.13266707956790924,
0.002992939902469516,
0.020927520468831062,
-0.2189430445432663,
0.0297627504914999,
-0.023573117330670357,
-0.0030028019100427628,
0.040354397147893906,
0.0436643548309803,
-0.07655161619186401,
0.24096813797950745,
-0.012163372710347176,
0.056602638214826584,
0.049392011016607285,
-0.11325600743293762,
-0.11614245921373367,
-0.021611027419567108,
0.0051190536469221115,
0.08386103063821793,
0.09436888247728348,
0.017428822815418243,
0.13920807838439941,
-0.12650536000728607,
0.14259123802185059,
0.08052180707454681,
-0.17867717146873474,
-0.018851708620786667,
0.11202596873044968,
0.08801893889904022,
-0.0675106793642044,
-0.029171813279390335,
0.0027352417819201946,
0.056535884737968445,
0.026979781687259674,
0.01417646836489439,
-0.12401506304740906,
-0.13691526651382446,
0.03127190098166466,
-0.09173586964607239,
-0.07958938181400299,
0.2627294361591339,
-0.024284707382321358,
-0.05084719508886337,
-0.048078324645757675,
-0.03967723622918129,
0.019794100895524025,
-0.015584400855004787,
0.036533694714307785,
-0.012195361778140068,
-0.008932211436331272,
-0.027600279077887535,
-0.011531319469213486,
-0.09686531871557236,
-0.10187743604183197,
0.007786322385072708,
0.10905338823795319,
0.01270358543843031,
0.031052235513925552,
-0.16820861399173737,
0.10928509384393692,
0.09030641615390778,
-0.05854106321930885,
0.024855736643075943,
-0.05658777803182602,
-0.03200628608465195,
-0.021220102906227112,
-0.044947825372219086,
-0.12245029956102371,
0.07014987617731094,
0.10959566384553909,
-0.027907108888030052,
0.053727708756923676,
0.022282330319285393,
0.056110017001628876,
0.05163348838686943,
0.17788778245449066,
-0.004936495795845985,
-0.03917766734957695,
0.050491660833358765,
-0.03073512762784958,
-0.05151747539639473,
0.0036066414322704077,
-0.06899619102478027,
-0.03578401729464531,
0.020589660853147507,
0.12612788379192352,
-0.05980665981769562,
0.08987979590892792,
-0.065691739320755,
-0.037457507103681564,
-0.04790528863668442,
-0.13208255171775818,
-0.02008727937936783,
0.008863260969519615,
-0.05928468704223633,
0.0033257126342505217,
0.1335940808057785,
-0.07324681431055069,
-0.11128141731023788,
0.006168292835354805,
-0.07326920330524445,
0.008651181124150753,
-0.08722252398729324,
-0.09970764815807343,
0.008150997571647167,
0.04872322827577591,
0.07675841450691223,
-0.12448446452617645,
-0.1640823632478714,
0.005667613819241524,
0.09999076277017593,
0.018449358642101288,
0.025115342810750008,
-0.08333277702331543,
-0.013751222752034664,
-0.022916976362466812,
-0.011831667274236679,
0.01436694711446762,
-0.0769803375005722,
0.10229344666004181,
0.08379009366035461,
0.03800724074244499,
-0.07513183355331421,
0.04234002158045769,
-0.11489590257406235,
0.06265709549188614,
-0.1381843239068985,
0.07753442972898483,
-0.054444290697574615,
0.13514330983161926,
-0.1012943759560585,
-0.07777397334575653,
0.06557092815637589,
0.062127213925123215,
0.05111755058169365,
0.1405041664838791,
-0.1053127720952034,
-0.05277286469936371,
0.15630771219730377,
-0.10393806546926498,
-0.20599697530269623,
0.08517396450042725,
-0.07509760558605194,
0.1969601958990097,
0.07176243513822556,
0.15636217594146729,
0.1779967099428177,
-0.10168113559484482,
0.056815531104803085,
0.09956145286560059,
-0.03936666250228882,
-0.05180848762392998,
0.06633400917053223,
0.04873298108577728,
-0.14490531384944916,
0.04705079644918442,
-0.020700741559267044,
0.12741197645664215,
-0.03782431408762932,
-0.04178580641746521,
-0.010987861081957817,
-0.055337511003017426,
0.0659368485212326,
-0.004591888282448053,
0.07475584745407104,
0.005361564923077822,
-0.038691114634275436,
0.06574046611785889,
0.13539767265319824,
-0.12785868346691132,
-0.0038881974760442972,
-0.10848688334226608,
0.09236191213130951,
-0.09552693367004395,
0.01544023398309946,
-0.20976294577121735,
-0.02302492782473564,
-0.01808866672217846,
0.04886331781744957,
0.062492258846759796,
0.015556358732283115,
0.010810790583491325,
0.004005874041467905,
0.022093474864959717,
-0.00020257735741324723,
0.0021349708549678326,
-0.02185862511396408,
-0.044915251433849335,
-0.07595711946487427,
-0.053196366876363754,
-0.046772804111242294,
0.05873214453458786,
-0.1920934021472931,
0.005389368161559105,
0.033487774431705475,
0.053196053951978683,
0.01460691262036562,
0.0241700429469347,
0.030572013929486275,
0.060238033533096313,
-0.0512864924967289,
-0.024318689480423927,
0.061455562710762024,
0.012523014098405838,
-0.09910698980093002,
0.005434995051473379,
-0.0857507586479187,
0.041042279452085495,
0.11682411283254623,
-0.14757731556892395,
-0.07139124721288681,
-0.02388661541044712,
-0.021613165736198425,
-0.008127299137413502,
0.006216882728040218,
-0.02045675925910473,
0.21430060267448425,
0.002387765096500516,
0.1772681176662445,
-0.09452524036169052,
-0.02849760092794895,
-0.03498266637325287,
-0.030124204233288765,
0.042063985019922256,
0.13752488791942596,
0.08182471245527267,
-0.17590996623039246,
0.05100635066628456,
0.04992658644914627,
-0.027848541736602783,
0.16985926032066345,
-0.04630653187632561,
-0.038992948830127716,
-0.0055716740898787975,
0.08213908225297928,
-0.013721616007387638,
0.14892932772636414,
-0.18611249327659607,
-0.020916739478707314,
0.013977455906569958,
-0.016632501035928726,
0.108249731361866,
-0.12667877972126007,
-0.006218534894287586,
0.03740527480840683,
-0.028122104704380035,
-0.1669662445783615,
0.030041376128792763,
0.01704358495771885,
0.03700939938426018,
-0.003212020266801119,
-0.024608174338936806,
0.02578338235616684,
-0.01972990483045578,
-0.12703587114810944,
0.24876822531223297,
-0.07669892907142639,
-0.2676839828491211,
-0.18655498325824738,
-0.020526938140392303,
-0.0010862776543945074,
-0.02425864338874817,
0.04799516499042511,
-0.049405962228775024,
-0.04002571851015091,
-0.017825501039624214,
0.1609032154083252,
-0.08683067560195923,
-0.0422319732606411,
-0.04612287878990173,
0.06613251566886902,
0.0030357176437973976,
-0.1856512427330017,
0.0014376797480508685,
0.018363235518336296,
0.041260723024606705,
0.024521848186850548,
-0.1637582927942276,
0.0942307636141777,
0.0970778688788414,
-0.05273834988474846,
0.039236292243003845,
-0.04359624534845352,
0.24264229834079742,
-0.07265163958072662,
-0.09394364058971405,
0.1524418741464615,
-0.10688794404268265,
0.015065331943333149,
0.017994234338402748,
0.004897193517535925,
-0.12153631448745728,
0.029784632846713066,
-0.04427231848239899,
-0.06785718351602554,
-0.23357480764389038,
-0.11426433175802231,
-0.08852118253707886,
0.11454305797815323,
0.07807912677526474,
0.0226316899061203,
-0.08995141834020615,
0.0672963485121727,
0.06153661757707596,
0.12337259948253632,
-0.00496253278106451,
0.07900505512952805,
0.04983540624380112,
0.006119732279330492,
-0.0052994368597865105,
-0.1093265637755394,
-0.054555539041757584,
0.030872460454702377,
0.09352400153875351,
0.19372494518756866,
0.00600263150408864,
0.1306208372116089,
0.0516507662832737,
0.059626203030347824,
0.031452544033527374,
0.18748994171619415,
-0.101254403591156,
0.029458237811923027,
0.007734634913504124,
-0.027425475418567657,
-0.13974995911121368,
0.014069957658648491,
-0.013384062796831131,
0.022279230877757072,
-0.1514306664466858,
-0.06626739352941513,
0.04163268581032753,
0.07269667834043503,
0.024988245218992233,
-0.27096039056777954,
-0.10850346088409424,
0.02369202859699726,
-0.08889489620923996,
-0.07767990231513977,
0.05029108002781868,
0.06770692020654678,
-0.14917084574699402,
0.017812376841902733,
-0.07339929789304733,
0.16551926732063293,
-0.08183671534061432,
0.0063367802649736404,
-0.06875374913215637,
-0.06601373106241226,
0.0003160728665534407,
0.1523950695991516,
-0.19861847162246704,
0.23703671991825104,
0.00010754554386949167,
0.030036643147468567,
-0.06665828824043274,
0.035740066319704056,
0.009586038067936897,
0.09851957112550735,
0.12377224117517471,
-0.021846015006303787,
-0.00887022353708744,
-0.17162060737609863,
0.00944464560598135,
0.08488833159208298,
0.0601150244474411,
-0.027223708108067513,
0.09355876594781876,
-0.03690386936068535,
0.03896936774253845,
-0.01766015775501728,
-0.10694513469934464,
-0.054223038256168365,
-0.1264112889766693,
-0.012104840017855167,
-0.09222745150327682,
0.04743218421936035,
-0.016256917268037796,
0.017696574330329895,
0.07432854920625687,
0.17675884068012238,
-0.08189123868942261,
-0.08380478620529175,
-0.1105508953332901,
0.03538995236158371,
0.11346200108528137,
-0.08437713235616684,
0.03269410505890846,
-0.003293298650532961,
0.00533126899972558,
-0.003528350032866001,
-0.16375836730003357,
0.05544203147292137,
-0.07047241926193237,
0.027744637802243233,
-0.013383543118834496,
0.10450278222560883,
-0.01017832476645708,
0.0050767557695508,
0.059852905571460724,
-0.056139327585697174,
-0.0794021487236023,
-0.12319844216108322,
-0.1325480043888092,
-0.06264734268188477,
0.04690544679760933,
0.07661157101392746,
-0.12823593616485596,
0.027156393975019455,
-0.04803883284330368,
0.004857901483774185,
0.23507115244865417,
0.1144677996635437,
-0.029987022280693054,
0.010966169647872448,
0.08546129614114761,
-0.09762004762887955,
-0.23956067860126495,
-0.01373977866023779,
-0.020435696467757225,
0.06993017345666885,
0.02081393450498581,
-0.17112118005752563,
0.08452685177326202,
-0.017401346936821938,
0.0454736091196537,
0.04718524590134621,
-0.29984763264656067,
-0.09222501516342163,
0.13302098214626312,
0.12497507780790329,
0.06546513736248016,
-0.12324458360671997,
-0.03620145469903946,
-0.07682371884584427,
-0.19200477004051208,
0.1686265468597412,
-0.08886007964611053,
0.0936705470085144,
-0.0017716207075864077,
0.10942798107862473,
0.03261478617787361,
-0.042491380125284195,
0.11280840635299683,
-0.008705337531864643,
0.07872705161571503,
-0.02415907010436058,
-0.093645840883255,
0.07181090116500854,
-0.03533365577459335,
0.1415223777294159,
-0.11415688693523407,
0.08164659142494202,
-0.28235816955566406,
-0.039998359978199005,
-0.03200700506567955,
0.04909312725067139,
-0.008042145520448685,
-0.06683597713708878,
-0.07660526037216187,
0.00103543431032449,
0.040580905973911285,
0.014748243615031242,
0.10481675714254379,
-0.0351165272295475,
0.0283795353025198,
0.055899832397699356,
0.15102390944957733,
0.015078644268214703,
-0.10068745166063309,
0.03520558401942253,
0.020398128777742386,
0.09117638319730759,
-0.23316408693790436,
0.06358661502599716,
0.12451433390378952,
0.04289697855710983,
0.1070321723818779,
0.0777881070971489,
-0.03210173547267914,
0.037512119859457016,
0.0883377194404602,
-0.11971443146467209,
-0.07848281413316727,
-0.03821452334523201,
-0.0578356571495533,
-0.027711492031812668,
0.053603656589984894,
0.143877312541008,
-0.05881484970450401,
-0.020681574940681458,
-0.0016548982821404934,
-0.02461160346865654,
-0.14005446434020996,
0.13291816413402557,
0.03887597471475601,
0.06931215524673462,
-0.08993295580148697,
0.05228206142783165,
0.04306701943278313,
-0.12638886272907257,
-0.030903253704309464,
0.11894137412309647,
-0.12537439167499542,
-0.07919679582118988,
-0.003595340298488736,
0.23456257581710815,
-0.13020779192447662,
-0.06574746966362,
-0.10833058506250381,
-0.04698770493268967,
-0.013133297674357891,
0.21185055375099182,
0.10473716259002686,
0.08006517589092255,
-0.05574924126267433,
-0.01207047887146473,
-0.126015305519104,
0.04757455736398697,
0.09914769977331161,
0.001972595928236842,
-0.11381594091653824,
0.12762461602687836,
0.006736141629517078,
0.1376148760318756,
-0.053101278841495514,
-0.03729821369051933,
-0.17855307459831238,
0.09253609925508499,
-0.1087561622262001,
0.0648411214351654,
-0.06994723528623581,
0.03919955715537071,
0.019044244661927223,
0.01791645772755146,
-0.029285399243235588,
0.032089777290821075,
-0.09753548353910446,
0.013712357729673386,
0.008940025232732296,
0.03997019678354263,
-0.06459395587444305,
0.0029667497146874666,
0.08704052120447159,
-0.07035832107067108,
0.09066846966743469,
0.05180631950497627,
-0.055989935994148254,
0.1252916157245636,
-0.1661413460969925,
-0.02816280536353588,
0.040479984134435654,
0.020818959921598434,
0.049009665846824646,
-0.047925688326358795,
0.041507694870233536,
0.00791880115866661,
0.05018317699432373,
0.009687153622508049,
0.10376814752817154,
-0.12669700384140015,
-0.09473017603158951,
-0.0323888324201107,
-0.10581401735544205,
-0.03221137449145317,
0.03229451924562454,
0.012089976109564304,
0.08770627528429031,
0.1224382072687149,
-0.026225663721561432,
0.05663224309682846,
-0.06245861575007439,
-0.02714702859520912,
0.029083367437124252,
-0.06883543729782104,
-0.020396865904331207,
-0.09572046995162964,
0.02949630096554756,
-0.05253485590219498,
0.1710478812456131,
-0.0024632220156490803,
0.13208340108394623,
-0.019900504499673843,
-0.01944783329963684,
0.05021161958575249,
0.033353041857481,
0.2555643320083618,
-0.016740670427680016,
0.05341694504022598,
-0.06254620850086212,
0.062066759914159775,
0.026517847552895546,
0.075223907828331,
0.10247381031513214,
0.10481560975313187,
-0.051535192877054214,
0.09138806909322739,
0.013961102813482285,
0.041313670575618744,
-0.07643631845712662,
-0.16031259298324585,
0.08408933132886887,
0.05538669228553772,
-0.027035998180508614,
0.1138080433011055,
0.12193267792463303,
-0.0746161937713623,
0.08983767032623291,
0.016219675540924072,
-0.10534052550792694,
-0.060963813215494156,
0.0010728184133768082,
-0.03147072717547417,
-0.12626001238822937,
0.010109678842127323,
-0.11408717185258865,
-0.060395535081624985,
0.09209098666906357,
0.033772654831409454,
-0.052147913724184036,
0.1930704563856125,
-0.012888547964394093,
-0.09046676009893417,
0.04643648862838745,
-0.007025235798209906,
0.006892713718116283,
-0.004911486525088549,
0.07011088728904724,
-0.03110826388001442,
-0.04064226895570755,
0.024441858753561974,
0.03571129962801933,
-0.06748007982969284,
0.02723122574388981,
-0.08829715102910995,
-0.029016070067882538,
-0.04442313686013222,
0.059764519333839417,
0.003022319870069623,
0.0848223865032196,
0.02141384780406952,
-0.044597793370485306,
-0.02828315831720829,
0.22252647578716278,
-0.04892599582672119,
-0.08710657805204391,
-0.15037034451961517,
0.25966906547546387,
0.011473191902041435,
0.04325799643993378,
0.0011675788555294275,
-0.058342743664979935,
-0.03175663203001022,
0.2962663471698761,
0.19192011654376984,
-0.038592781871557236,
0.005168607924133539,
0.0022204797714948654,
0.020113449543714523,
0.005538835655897856,
0.147747203707695,
0.04009388014674187,
0.2507842481136322,
-0.031941939145326614,
-0.08543787151575089,
-0.051110927015542984,
-0.04016803950071335,
-0.0037091507110744715,
0.09559708833694458,
0.032339490950107574,
-0.054271746426820755,
-0.04330216348171234,
0.1295984834432602,
-0.16305410861968994,
-0.08611619472503662,
0.02020864188671112,
-0.12403570860624313,
-0.08880302309989929,
-0.07189879566431046,
0.0150179173797369,
-0.019521581009030342,
0.06420119851827621,
-0.04769754409790039,
-0.031334951519966125,
0.03165638446807861,
0.034962914884090424,
-0.16255062818527222,
-0.10488221794366837,
0.0764288455247879,
-0.013129822909832,
0.1262882500886917,
-0.015889564529061317,
0.10228469967842102,
0.09828817844390869,
0.02619854360818863,
-0.024819592013955116,
0.015843553468585014,
0.07185094058513641,
0.03649171441793442,
0.07149162143468857,
0.06641030311584473,
-0.035967376083135605,
0.09986169636249542,
-0.04537282884120941,
-0.09547656029462814,
0.03611858934164047,
-0.008127612993121147,
0.018872614949941635,
-0.10456773638725281,
-0.04036332294344902,
-0.09918566048145294,
0.08730889111757278,
0.1859745979309082,
-0.05521555244922638,
0.025415152311325073,
-0.07961447536945343,
0.11258424073457718,
0.024250492453575134,
-0.033532075583934784,
-0.08914248645305634,
-0.1542951762676239,
-0.05465633049607277,
-0.012802223674952984,
-0.04567328095436096,
-0.2381352037191391,
0.011068235151469707,
-0.05343886837363243,
-0.006748770363628864,
-0.05960762873291969,
0.13476447761058807,
0.11765141785144806,
0.021348074078559875,
-0.02365177311003208,
-0.11928162723779678,
-0.02348954603075981,
0.08236227184534073,
-0.1153629794716835,
-0.138124018907547
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask", skip_special_tokens=True),
device=0  # GPU index; use device=-1 to run on CPU
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/python/small_model.ipynb).
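Since `SummarizationPipeline` accepts a list, several functions can be documented in one call. The two tokenized functions below are hypothetical inputs used purely for illustration; each element of the returned list is a dict with a `summary_text` key:
```python
# Hypothetical tokenized functions, used only to illustrate batched inference.
batch = [
    "def add ( a , b ) : return a + b",
    "def is_even ( n ) : return n % 2 == 0",
]
for result in pipeline(batch):
    print(result["summary_text"])
```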
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
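For reference, a T5-style inverse square root schedule holds the learning rate constant through warmup and then decays it proportionally to 1/√step. The sketch below assumes the base rate and warmup length, which are not stated in this card:
```python
def inverse_sqrt_lr(step: int, base_lr: float = 0.01, warmup_steps: int = 10_000) -> float:
    # Constant at base_lr until warmup_steps, then decays as 1/sqrt(step).
    return base_lr * (warmup_steps ** 0.5) / (max(step, warmup_steps) ** 0.5)
```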
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.1281861811876297,
-0.017873352393507957,
-0.0006902650347910821,
0.13325180113315582,
0.11884834617376328,
0.034155648201704025,
0.06055370718240738,
0.06928083300590515,
-0.037707678973674774,
0.019685382023453712,
0.045776840299367905,
0.02138960175216198,
0.03297438099980354,
0.18984273076057434,
0.016662297770380974,
-0.13063772022724152,
-0.021778864786028862,
0.04477338492870331,
-0.051645275205373764,
0.13382799923419952,
0.08299331367015839,
-0.0576661080121994,
0.050344549119472504,
-0.06260361522436142,
-0.2371012270450592,
0.05212556943297386,
0.004738867748528719,
-0.054320987313985825,
0.10274650901556015,
0.03583918511867523,
0.14017155766487122,
-0.02467113919556141,
0.032114725559949875,
-0.14049440622329712,
0.00773554528132081,
0.015929710119962692,
0.025422465056180954,
0.009587056003510952,
0.0476720966398716,
0.03207775950431824,
0.16564318537712097,
-0.001324565033428371,
0.0686599537730217,
0.05854376032948494,
-0.07763193547725677,
-0.13562947511672974,
-0.011735041625797749,
0.030060606077313423,
0.044805388897657394,
0.10848456621170044,
-0.014415208250284195,
0.13256889581680298,
-0.14230285584926605,
0.1366840898990631,
0.09272369742393494,
-0.2326132357120514,
-0.010667968541383743,
0.11696695536375046,
0.08007878065109253,
0.0855918750166893,
-0.0440046451985836,
-0.06764350086450577,
0.0940818265080452,
0.05639692768454552,
0.03130835294723511,
-0.09380488097667694,
-0.08352621644735336,
-0.0000730573883629404,
-0.09429354965686798,
-0.07504595071077347,
0.2081473469734192,
-0.013975521549582481,
-0.08665261417627335,
-0.05188555642962456,
-0.033101603388786316,
-0.14082598686218262,
0.026030512526631355,
0.05340259522199631,
-0.0020237131975591183,
-0.034108586609363556,
0.004686551634222269,
0.03712109476327896,
-0.06847258657217026,
-0.13869717717170715,
0.024264072999358177,
0.11295025795698166,
0.06557828187942505,
0.026385031640529633,
-0.09851787984371185,
0.1131637692451477,
0.036560505628585815,
-0.044684652239084244,
-0.02197827771306038,
-0.009802483953535557,
-0.11145362257957458,
0.027238471433520317,
-0.048652563244104385,
-0.1584211140871048,
0.009343497455120087,
0.049632180482149124,
-0.030403196811676025,
0.052644193172454834,
0.0296543650329113,
0.025551287457346916,
0.021347055211663246,
0.19705745577812195,
0.059869445860385895,
-0.10822582989931107,
0.06001522019505501,
0.03835931047797203,
-0.031410034745931625,
-0.007003038190305233,
-0.06722737848758698,
-0.09883911907672882,
0.09979414194822311,
0.10512859374284744,
-0.10672586411237717,
0.04678544029593468,
-0.06675104051828384,
-0.04244794696569443,
-0.0394013337790966,
-0.15504761040210724,
0.005723483394831419,
0.028300847858190536,
-0.06570138037204742,
-0.025842241942882538,
0.10439787805080414,
-0.1777792125940323,
-0.1506461352109909,
-0.02090286836028099,
-0.07761198282241821,
-0.03399860858917236,
-0.15472884476184845,
-0.1718670129776001,
-0.013704032637178898,
-0.030672255903482437,
0.026152269914746284,
-0.09412100166082382,
-0.15081170201301575,
-0.019153980538249016,
0.026835041120648384,
0.017262058332562447,
-0.012467586435377598,
-0.07222003489732742,
-0.005764494184404612,
-0.02320552058517933,
-0.03124704770743847,
-0.006162950769066811,
-0.05339433625340462,
0.1322268396615982,
0.10524989664554596,
0.042748671025037766,
-0.03218543156981468,
0.05325160175561905,
-0.06897006183862686,
0.062543123960495,
-0.08772043883800507,
0.08856654912233353,
-0.05898356810212135,
0.08998115360736847,
-0.029217420145869255,
-0.11723149567842484,
0.08731885999441147,
0.05966288223862648,
0.06537256389856339,
0.045442841947078705,
-0.1218753531575203,
-0.03262975811958313,
0.19312795996665955,
-0.11807557940483093,
-0.1288331001996994,
0.1153237521648407,
-0.03812049701809883,
0.0763617604970932,
0.0973428338766098,
0.1250470131635666,
0.17007973790168762,
-0.05789044871926308,
0.016523469239473343,
0.05387846753001213,
0.047790613025426865,
-0.1148909330368042,
0.0788390263915062,
0.05573326721787453,
-0.09584641456604004,
0.054696448147296906,
-0.013609557412564754,
0.11158230900764465,
-0.010779807344079018,
-0.023743584752082825,
-0.05202951282262802,
-0.07536089420318604,
0.024307480081915855,
0.002538733184337616,
0.05920577794313431,
-0.08121505379676819,
-0.08672409504652023,
0.09200413525104523,
0.19338318705558777,
-0.13189977407455444,
-0.003582754172384739,
-0.08876270800828934,
0.07215740531682968,
-0.06543131172657013,
0.021015938371419907,
-0.16307108104228973,
0.00901241134852171,
0.0585763081908226,
-0.01426240149885416,
0.07381164282560349,
0.11787315458059311,
0.017229974269866943,
0.043796356767416,
0.009061597287654877,
-0.012863324955105782,
-0.12290190905332565,
-0.0590306892991066,
-0.07373980432748795,
-0.04507948085665703,
-0.09189888089895248,
-0.05629247426986694,
0.002127026440575719,
-0.2006005197763443,
0.01936589926481247,
0.0084858238697052,
-0.0045663439668715,
0.01837305724620819,
-0.017886921763420105,
0.019037464633584023,
0.07498075067996979,
-0.062098704278469086,
-0.04011385887861252,
0.031405746936798096,
0.018887631595134735,
-0.0418272465467453,
-0.1013658419251442,
-0.10511291027069092,
0.0026072540786117315,
0.12135773152112961,
0.05050972104072571,
-0.09045737236738205,
0.034286368638277054,
-0.009894275106489658,
-0.03568270802497864,
0.011184491217136383,
-0.06502964347600937,
0.1584346741437912,
-0.003604840487241745,
0.20325054228305817,
-0.14616283774375916,
-0.026272261515259743,
-0.027870144695043564,
0.01538321003317833,
0.07023806124925613,
0.13310080766677856,
-0.02436552569270134,
-0.06204404681921005,
0.0646863654255867,
-0.0009283997351303697,
-0.10571945458650589,
0.2086438238620758,
-0.04936474561691284,
-0.09512051939964294,
0.03369962051510811,
0.11033166944980621,
-0.0025089485570788383,
0.16637715697288513,
-0.18289220333099365,
-0.02408907748758793,
0.014482292346656322,
0.0063659995794296265,
0.0651465505361557,
-0.13138070702552795,
0.0049998098984360695,
0.02071032114326954,
-0.06414800882339478,
-0.10979863256216049,
-0.01734272763133049,
0.00021685654064640403,
0.04283313453197479,
-0.00689710071310401,
-0.03357547149062157,
0.011188232339918613,
-0.029840664938092232,
-0.1145806685090065,
0.227989062666893,
-0.0964803546667099,
-0.2193836271762848,
-0.1998814046382904,
0.060338862240314484,
-0.05476135388016701,
-0.022294677793979645,
0.0319027304649353,
-0.08421045541763306,
-0.04749171435832977,
-0.04012932628393173,
0.20017777383327484,
-0.09380262345075607,
-0.00800342857837677,
-0.033909644931554794,
0.06766877323389053,
0.016954682767391205,
-0.2060755342245102,
0.04057344049215317,
-0.006714200135320425,
-0.028804106637835503,
0.01642523519694805,
-0.11457903683185577,
0.08272409439086914,
0.16257396340370178,
-0.08269589394330978,
0.015211771242320538,
-0.0009737422224134207,
0.20844712853431702,
-0.039353929460048676,
-0.0772365853190422,
0.13550199568271637,
-0.02366577461361885,
-0.000199833870283328,
0.00978782493621111,
-0.012458931654691696,
-0.10085268318653107,
0.058165378868579865,
-0.014794341288506985,
-0.030284889042377472,
-0.25888168811798096,
-0.026898393407464027,
-0.08209757506847382,
0.061117466539144516,
0.0676712617278099,
0.03837069496512413,
-0.10115770250558853,
0.034289244562387466,
0.05093924328684807,
0.14524494111537933,
-0.005027451552450657,
0.06535555422306061,
0.06319274753332138,
0.0038006799295544624,
0.0063229952938854694,
-0.09943554550409317,
0.007298408076167107,
0.07350482046604156,
0.11021627485752106,
0.27122437953948975,
-0.0949387326836586,
0.1774524748325348,
0.03197644650936127,
0.056568775326013565,
0.04989146068692207,
0.14401137828826904,
-0.12306249886751175,
0.038523752242326736,
0.017001906409859657,
0.0008711097179912031,
-0.11463883519172668,
0.0022798965219408274,
-0.037115149199962616,
0.07129789143800735,
-0.11938841640949249,
-0.06431514024734497,
-0.002182481810450554,
0.13415181636810303,
0.06626749783754349,
-0.237335667014122,
-0.12593163549900055,
0.014229778200387955,
-0.10110877454280853,
-0.11138899624347687,
0.06260070204734802,
0.1886579543352127,
-0.08946538716554642,
-0.022220535203814507,
-0.022167671471834183,
0.13740752637386322,
-0.04226788505911827,
-0.032540760934352875,
-0.04688740521669388,
0.0505668930709362,
0.006154310889542103,
0.12643353641033173,
-0.28462210297584534,
0.1299460232257843,
-0.011310585774481297,
0.06566963344812393,
-0.027621202170848846,
0.05502026528120041,
-0.034259870648384094,
0.0805438980460167,
0.03466888144612312,
-0.011991618201136589,
0.04844075068831444,
-0.17447753250598907,
-0.002098239026963711,
0.041550152003765106,
0.021740764379501343,
0.058594126254320145,
0.0861494243144989,
-0.015295440331101418,
0.06812715530395508,
-0.02034357376396656,
-0.14648786187171936,
-0.04406598582863808,
-0.07972656935453415,
-0.02140837348997593,
-0.06505435705184937,
-0.01776808686554432,
-0.03830469772219658,
-0.01157515961676836,
0.06868048757314682,
0.1820782572031021,
-0.10028029978275299,
-0.09241273254156113,
-0.08114761859178543,
0.05909852311015129,
0.10198741406202316,
-0.08483915030956268,
0.05104960501194,
-0.005818333011120558,
0.02713300846517086,
-0.012141923420131207,
-0.09312205761671066,
0.049056507647037506,
-0.038591839373111725,
-0.05638841167092323,
-0.001659182133153081,
0.06921414285898209,
0.000603150692768395,
0.04104791581630707,
0.010737646371126175,
-0.08464975655078888,
-0.05730046331882477,
-0.10767590254545212,
-0.14317168295383453,
-0.055089205503463745,
0.008795649744570255,
0.06707564741373062,
-0.1397232860326767,
-0.05590013042092323,
-0.023233750835061073,
-0.013858644291758537,
0.14937175810337067,
0.17199230194091797,
-0.05604377016425133,
0.019829582422971725,
0.10765815526247025,
-0.053221479058265686,
-0.19088737666606903,
0.02354096807539463,
0.05041033402085304,
0.11573406308889389,
-0.039242420345544815,
-0.18507319688796997,
0.03982662037014961,
0.02240194007754326,
0.04066889360547066,
0.0773235559463501,
-0.31840911507606506,
-0.1208692267537117,
0.10117316991090775,
0.1515442132949829,
0.11041750013828278,
-0.12964344024658203,
-0.020893020555377007,
-0.05675765499472618,
-0.1329382210969925,
0.09365855902433395,
-0.04369134083390236,
0.13495928049087524,
-0.06433621048927307,
0.06589306145906448,
0.04119107127189636,
-0.036494266241788864,
0.07768183946609497,
0.01820141077041626,
0.10505765676498413,
-0.03955947235226631,
0.024283699691295624,
0.11059997230768204,
-0.037195656448602676,
0.17401903867721558,
-0.1491558700799942,
0.09528782963752747,
-0.2658860385417938,
-0.05779055505990982,
-0.06504333019256592,
0.006275600288063288,
-0.032831624150276184,
-0.045501962304115295,
-0.09002386033535004,
0.018807370215654373,
0.0023940063547343016,
-0.008749867789447308,
0.035278208553791046,
-0.024974457919597626,
-0.00862075574696064,
0.06588371843099594,
0.10420501232147217,
0.01648605428636074,
-0.11424941569566727,
0.053594741970300674,
0.04867471009492874,
0.10207876563072205,
-0.19410721957683563,
0.020077569410204887,
0.1119740903377533,
0.03232182562351227,
0.11501552164554596,
0.047779571264982224,
-0.10295626521110535,
0.04883923381567001,
0.08708242326974869,
-0.06640218943357468,
-0.077396921813488,
-0.004691810812801123,
-0.08682840317487717,
-0.08967144787311554,
0.041358113288879395,
0.10221274197101593,
-0.04506884515285492,
-0.025051243603229523,
-0.023203052580356598,
-0.019949784502387047,
-0.11972075700759888,
0.19554102420806885,
0.0746733620762825,
0.08080128580331802,
-0.0696655735373497,
0.04426400735974312,
0.07478664070367813,
-0.06376980990171432,
0.0036982246674597263,
0.1797206997871399,
-0.09944704174995422,
-0.04531025514006615,
0.07576438784599304,
0.1839829385280609,
-0.04630321264266968,
-0.04153407737612724,
-0.1212419643998146,
-0.06864743679761887,
0.021163038909435272,
0.1563214212656021,
0.10660609602928162,
0.08585666120052338,
-0.036893151700496674,
0.0017394735477864742,
-0.11794222146272659,
0.08189049363136292,
0.07673878222703934,
0.04586144536733627,
-0.12166270613670349,
0.15820848941802979,
0.0423123799264431,
0.11418487876653671,
-0.02655697427690029,
-0.01408446952700615,
-0.1421872228384018,
0.0748593658208847,
-0.10370318591594696,
0.03304409608244896,
-0.008822810836136341,
0.058472637087106705,
-0.018237978219985962,
0.006682444363832474,
-0.030160151422023773,
0.05593914911150932,
-0.09018956124782562,
-0.000027343336114427075,
0.010255182161927223,
0.030032172799110413,
-0.0418732576072216,
-0.0095561807975173,
0.03408488258719444,
-0.10202768445014954,
0.12123505771160126,
-0.017338309437036514,
-0.023057641461491585,
0.09910131990909576,
-0.036612674593925476,
0.028383804485201836,
0.01360576506704092,
0.05350333824753761,
0.01633373089134693,
0.018251201137900352,
0.0802576020359993,
0.02291429415345192,
0.06293663382530212,
0.03939564898610115,
0.12626346945762634,
-0.133112832903862,
-0.08501964062452316,
-0.05783122032880783,
-0.10794229805469513,
-0.05243898928165436,
0.09455219656229019,
0.027475325390696526,
0.11221952736377716,
0.1132090613245964,
-0.03430391103029251,
0.026400728151202202,
-0.1163778081536293,
-0.07332833111286163,
0.016647284850478172,
-0.02313779853284359,
-0.0586969330906868,
-0.05719003453850746,
0.04226377233862877,
-0.022921068593859673,
0.11201050877571106,
0.01239826250821352,
0.03616023436188698,
-0.024334702640771866,
-0.039811789989471436,
0.005805129650980234,
0.0100462157279253,
0.22864066064357758,
-0.07354459166526794,
0.04461736977100372,
-0.002100364537909627,
0.01323719322681427,
0.01266325730830431,
0.12352492660284042,
0.12717020511627197,
0.1518240123987198,
-0.05717136710882187,
0.09030148386955261,
0.009519108571112156,
-0.0020270268432796,
-0.09598126262426376,
-0.028445681557059288,
0.023076917976140976,
0.05880787596106529,
-0.04144035652279854,
0.19871185719966888,
0.0920768454670906,
-0.10225339233875275,
0.10567101836204529,
0.03339328244328499,
-0.13459739089012146,
-0.04419412836432457,
0.034926917403936386,
-0.02534407563507557,
-0.14442123472690582,
0.027059171348810196,
-0.11963636428117752,
-0.02745916321873665,
0.0745605081319809,
0.05454858019948006,
-0.07119131833314896,
0.16835810244083405,
0.024605033919215202,
-0.07209298759698868,
0.042711447924375534,
-0.0018650175770744681,
0.01979227177798748,
0.033120181411504745,
0.02697610668838024,
0.0209458377212286,
-0.047118134796619415,
0.05458423122763634,
0.02090521901845932,
-0.03896327316761017,
0.0010840505128726363,
-0.03287406265735626,
-0.0008320020278915763,
-0.020434031262993813,
0.029433144256472588,
0.057773273438215256,
0.19079668819904327,
0.03590782731771469,
-0.07936888188123703,
-0.03339793160557747,
0.164112389087677,
-0.029885223135352135,
-0.10310553759336472,
-0.1335485428571701,
0.17688019573688507,
0.0261519905179739,
0.006940604653209448,
0.021038049831986427,
-0.08471133559942245,
-0.051429931074380875,
0.2165343165397644,
0.06125263124704361,
-0.02292465977370739,
-0.018829455599188805,
0.0017539847176522017,
-0.00028357250266708434,
-0.03826618194580078,
0.21105413138866425,
0.02404973842203617,
0.2450835108757019,
0.02037903293967247,
-0.0182639230042696,
-0.07228036969900131,
-0.034836556762456894,
-0.018011627718806267,
0.1067916601896286,
-0.03859938681125641,
-0.04071563482284546,
-0.08706605434417725,
0.019108468666672707,
-0.00557383568957448,
-0.06581633538007736,
0.08951162546873093,
-0.12446131557226181,
-0.10186626762151718,
-0.043948944658041,
0.024034136906266212,
-0.053231868892908096,
0.02922878973186016,
-0.03327355161309242,
0.03195338323712349,
0.05153270810842514,
-0.03443853184580803,
-0.11584712564945221,
-0.17465174198150635,
0.11302988231182098,
-0.0255756638944149,
0.13820217549800873,
-0.011092121712863445,
0.15465103089809418,
0.08799999952316284,
0.025162896141409874,
-0.05102435126900673,
0.09912189841270447,
0.0365421324968338,
0.053044144064188004,
0.05345012992620468,
0.11839573085308075,
-0.058092597872018814,
0.11561348289251328,
-0.04708987474441528,
-0.01563035510480404,
-0.028205865994095802,
-0.07209841161966324,
-0.0015845217276364565,
-0.1574401706457138,
-0.02974838763475418,
-0.10319434851408005,
0.08795696496963501,
0.2039586305618286,
-0.049423668533563614,
-0.023357465863227844,
-0.09377351403236389,
0.0918336808681488,
0.003498874371871352,
0.05668489262461662,
-0.038976673036813736,
-0.18697871267795563,
-0.019124578684568405,
-0.013675219379365444,
-0.003223475767299533,
-0.27591314911842346,
-0.002550119999796152,
-0.04516563564538956,
-0.0222233384847641,
-0.09529218822717667,
0.17657269537448883,
0.07629793137311935,
0.030472395941615105,
-0.03913094848394394,
-0.12909461557865143,
-0.046578798443078995,
0.06668303161859512,
-0.14419938623905182,
-0.14011134207248688
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/python/small_model.ipynb).
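Because the checkpoint works best on whitespace-tokenized functions, raw source can be pre-tokenized before it is passed to the pipeline. The helper below is a minimal sketch of one way to do that with Python's built-in `tokenize` module; the function name `space_tokenize` is illustrative and not part of CodeTrans:
```python
import io
import tokenize

def space_tokenize(source: str) -> str:
    # Split python source into individual tokens and rejoin them with
    # single spaces, approximating the tokenized format shown above.
    tokens = [
        tok.string
        for tok in tokenize.generate_tokens(io.StringIO(source).readline)
        if tok.string.strip()  # drop NEWLINE/INDENT/ENDMARKER artifacts
    ]
    return " ".join(tokens)

print(space_tokenize("def add(a, b):\n    return a + b\n"))
# def add ( a , b ) : return a + b
```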
## Training data
The supervised training task datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
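For reference, the schedule itself is simple to state: the learning rate is held constant during warmup and then decays with the inverse square root of the step count. The sketch below writes this out; the warmup constant is the common T5 default and an assumption here, not a value reported for CodeTrans:
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Hold the rate at 1/sqrt(warmup_steps) for the first warmup_steps
    # updates, then decay proportionally to 1/sqrt(step).
    return 1.0 / (max(step, warmup_steps) ** 0.5)
```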
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
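To compute a comparable corpus-level BLEU score for your own generations, one option is the `sacrebleu` package. The snippet below is a sketch with invented example strings; the exact tokenization and evaluation setup behind the table above may differ:
```python
import sacrebleu  # pip install sacrebleu

hypotheses = ["Print a message and exit ."]
references = [["Print an error message and optionally exit ."]]  # one reference stream
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```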
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_python_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It is then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 4000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09364219754934311,
0.07657555490732193,
-0.0012474871473386884,
0.10554156452417374,
0.048726700246334076,
0.035543546080589294,
0.021755022928118706,
0.10967344790697098,
-0.031487125903367996,
0.06592940539121628,
0.06018340215086937,
-0.06934331357479095,
0.05134846270084381,
0.17429319024085999,
0.034922659397125244,
-0.17196036875247955,
-0.03371892124414444,
0.022133007645606995,
-0.05211257189512253,
0.10640904307365417,
0.07487455755472183,
-0.07171907275915146,
0.07376247644424438,
-0.03968166559934616,
-0.11638940125703812,
0.045677222311496735,
-0.02728867717087269,
-0.01729155145585537,
0.10181553661823273,
0.05076586455106735,
0.11802254617214203,
-0.04090002179145813,
0.07006578147411346,
-0.21031714975833893,
-0.000754303066059947,
0.032933443784713745,
0.05878609046339989,
0.03140576183795929,
0.05760490521788597,
0.05835084617137909,
0.15031088888645172,
-0.019257977604866028,
0.06359252333641052,
0.05771839618682861,
-0.06733611971139908,
-0.09718194603919983,
-0.06467626243829727,
0.0644790306687355,
0.06975078582763672,
0.1039157584309578,
-0.007244537118822336,
0.022601202130317688,
-0.06549662351608276,
0.09641087800264359,
0.11146021634340286,
-0.2190857231616974,
-0.01828940585255623,
0.11867285519838333,
0.08918523788452148,
0.05186331644654274,
-0.07261641323566437,
-0.03854935243725777,
0.09836674481630325,
0.051516853272914886,
0.04800897091627121,
-0.09127860516309738,
-0.02173648402094841,
-0.01698777638375759,
-0.05632268637418747,
-0.055264733731746674,
0.14225059747695923,
0.023189714178442955,
-0.056776609271764755,
-0.11466939747333527,
-0.05603717267513275,
-0.20009608566761017,
0.03493133932352066,
0.03516062721610069,
0.009351170621812344,
-0.0037276388611644506,
0.008493958041071892,
-0.011824635788798332,
-0.08824004232883453,
-0.09878810495138168,
0.02942267246544361,
0.02942400425672531,
0.0682186558842659,
0.03603138029575348,
-0.021261367946863174,
0.09596782922744751,
-0.007110784761607647,
-0.041751034557819366,
-0.009709320031106472,
0.02091258391737938,
-0.1172061339020729,
0.012907272204756737,
-0.006529401522129774,
-0.04540980979800224,
-0.0013158678775653243,
0.09482090175151825,
-0.10586608946323395,
0.08470138907432556,
0.09562455117702484,
0.004136912524700165,
0.020035015419125557,
0.21763212978839874,
0.042497940361499786,
-0.14887435734272003,
0.018214026466012,
0.010893834754824638,
0.0021325170528143644,
0.005278171505779028,
-0.04883105307817459,
-0.04374508559703827,
0.014464084059000015,
0.07106100767850876,
-0.10281050950288773,
0.02381390705704689,
-0.05086340382695198,
-0.00800025463104248,
0.06171657517552376,
-0.11965804547071457,
0.04428619518876076,
0.015414156951010227,
-0.04374147206544876,
-0.014869017526507378,
0.086066335439682,
-0.12919825315475464,
-0.12130658328533173,
0.06031017005443573,
-0.04319780692458153,
-0.034440383315086365,
-0.10771597176790237,
-0.12001374363899231,
-0.006319371052086353,
-0.028549600392580032,
0.003135193372145295,
-0.09899721294641495,
-0.08763986825942993,
-0.019911207258701324,
0.04136624559760094,
-0.0029981445986777544,
-0.033058665692806244,
-0.031616970896720886,
0.013452318497002125,
-0.00039148094947449863,
-0.01138978824019432,
0.011619682423770428,
-0.039221927523612976,
0.09375427663326263,
0.07976013422012329,
0.03475486487150192,
-0.006542477756738663,
0.023214103654026985,
-0.08273173868656158,
0.08130978792905807,
-0.08176828920841217,
0.049016907811164856,
-0.01451210118830204,
0.07262181490659714,
-0.10011716932058334,
-0.08977852016687393,
0.029559673741459846,
0.04827527701854706,
0.05688103288412094,
0.028297659009695053,
-0.10129371285438538,
0.025115348398685455,
0.1537620574235916,
-0.11114807426929474,
-0.12311764061450958,
0.1203274205327034,
-0.0015013079391792417,
0.017805777490139008,
0.07238015532493591,
0.12332300841808319,
0.15873365104198456,
-0.11691118031740189,
-0.02293972484767437,
0.08233900368213654,
0.06450236588716507,
-0.04757540673017502,
0.06672503054141998,
-0.0005837631761096418,
0.0014952941564843059,
0.030420202761888504,
0.06134265288710594,
0.05779993534088135,
0.0028885442297905684,
-0.03357787802815437,
-0.04616701975464821,
-0.08332162350416183,
-0.04221139848232269,
-0.011551213450729847,
0.012964890338480473,
-0.060142748057842255,
-0.07556958496570587,
-0.010256695561110973,
0.1861671358346939,
-0.09687779098749161,
0.03337666392326355,
-0.0728449746966362,
-0.018144790083169937,
-0.07226365059614182,
0.021456403657794,
-0.11912448704242706,
0.007630972191691399,
0.05293264985084534,
-0.038984958082437515,
0.05299125984311104,
0.07454876601696014,
0.004302044864743948,
0.02740516886115074,
-0.056412141770124435,
-0.04130448400974274,
-0.039846014231443405,
-0.06951101869344711,
-0.1104850172996521,
-0.023045286536216736,
-0.09013001620769501,
-0.030331077054142952,
-0.043031614273786545,
-0.17590534687042236,
0.0011223095934838057,
0.0059918914921581745,
0.022351205348968506,
0.02407723106443882,
-0.03986005857586861,
0.021681878715753555,
0.05757620185613632,
-0.0472717210650444,
-0.0829949900507927,
0.017442017793655396,
0.03389127925038338,
-0.0877825990319252,
-0.04933552071452141,
-0.10764087736606598,
-0.05974147468805313,
0.07211604714393616,
0.10025471448898315,
-0.1208764910697937,
0.004061833489686251,
-0.0210395660251379,
-0.04627695679664612,
-0.05597427859902382,
-0.06124887987971306,
0.17443154752254486,
0.013328298926353455,
0.17051441967487335,
-0.12218441814184189,
-0.05598223954439163,
-0.028729364275932312,
0.002520614070817828,
0.02666385844349861,
0.1515112668275833,
0.0003890570951625705,
-0.06871556490659714,
0.034516170620918274,
-0.04493755102157593,
-0.05535043776035309,
0.14934603869915009,
-0.014861606061458588,
-0.08020348101854324,
0.003098946763202548,
0.10822107642889023,
-0.002990773180499673,
0.1672644317150116,
-0.056160278618335724,
0.005511343479156494,
-0.00461951969191432,
0.022453999146819115,
0.04306517913937569,
-0.12039713561534882,
0.030401214957237244,
0.050789181143045425,
-0.05613689869642258,
-0.05673617497086525,
-0.02963646501302719,
-0.03386897221207619,
0.041863881051540375,
0.0169810950756073,
0.038653284311294556,
-0.028851311653852463,
-0.028773000463843346,
-0.10490240156650543,
0.19031216204166412,
-0.07432208955287933,
-0.206452414393425,
-0.17526032030582428,
0.041816215962171555,
-0.025626584887504578,
-0.022625479847192764,
0.036156658083200455,
-0.0987156480550766,
-0.0580764003098011,
-0.09548220783472061,
0.13397249579429626,
-0.14252826571464539,
-0.003335846122354269,
-0.04697743430733681,
0.05461077392101288,
0.061255570501089096,
-0.16605883836746216,
0.028324777260422707,
-0.006219402886927128,
0.0012268867576494813,
-0.004572875332087278,
-0.0499177947640419,
0.07670789957046509,
0.11683807522058487,
-0.06203819811344147,
0.013290636241436005,
-0.003920787014067173,
0.17636758089065552,
-0.05768907815217972,
0.029785681515932083,
0.17829535901546478,
0.008402188308537006,
0.036553751677274704,
0.050789374858140945,
0.019276604056358337,
-0.0957384929060936,
0.056522633880376816,
0.04728477820754051,
-0.03866963088512421,
-0.21242505311965942,
-0.03603784367442131,
-0.07909197360277176,
0.07624247670173645,
0.13925407826900482,
0.05645022168755531,
-0.1602574735879898,
0.016724780201911926,
-0.010594907216727734,
0.15061511099338531,
-0.035604801028966904,
0.06794612109661102,
0.025535138323903084,
0.006128487177193165,
-0.003875037422403693,
-0.10381823033094406,
0.0059155966155231,
0.08331441879272461,
0.11380117386579514,
0.1934531033039093,
-0.08294538408517838,
0.18109716475009918,
0.004211460240185261,
0.10221342742443085,
0.03858466073870659,
0.08341000974178314,
-0.1315605193376541,
0.012603027746081352,
0.015801260247826576,
-0.014476679265499115,
-0.05103369057178497,
0.04641517996788025,
-0.01806396059691906,
0.057041022926568985,
-0.06838765740394592,
-0.008819936774671078,
0.013081688433885574,
0.1917303055524826,
0.0793493464589119,
-0.16904892027378082,
-0.1272040754556656,
0.014227435924112797,
-0.09491436183452606,
-0.11146773397922516,
0.0704587996006012,
0.19156943261623383,
-0.07090838253498077,
0.029511533677577972,
-0.018251340836286545,
0.13451707363128662,
-0.11323058605194092,
-0.024056099355220795,
0.03501027822494507,
0.052471619099378586,
0.0022758464328944683,
0.11450590193271637,
-0.23980771005153656,
0.07284620404243469,
0.014823800884187222,
0.08587399870157242,
-0.013522254303097725,
0.06701410561800003,
-0.05227207392454147,
0.01773260161280632,
0.07517331838607788,
0.010997142642736435,
-0.05135861784219742,
-0.19999587535858154,
-0.04885042458772659,
0.026549389585852623,
0.04938327521085739,
-0.013563456013798714,
0.09427318722009659,
-0.03265365958213806,
0.0490129292011261,
-0.03330905735492706,
-0.15680062770843506,
-0.033703479915857315,
-0.14725521206855774,
-0.03215831518173218,
-0.01380995474755764,
-0.05642097443342209,
-0.024018477648496628,
0.04129239171743393,
0.03805938735604286,
0.2455057054758072,
-0.14686988294124603,
-0.08612719923257828,
-0.1045638769865036,
0.06230801343917847,
0.1361309289932251,
-0.09145358949899673,
0.03283628821372986,
0.008576544001698494,
0.0489097535610199,
-0.04327232018113136,
-0.07433201372623444,
0.028982549905776978,
-0.05792923644185066,
-0.08352170884609222,
-0.027772927656769753,
0.11875162273645401,
-0.020337950438261032,
0.04662643373012543,
-0.006027163937687874,
-0.0652473196387291,
-0.05412488803267479,
-0.12565724551677704,
-0.07713808119297028,
-0.009143324568867683,
0.03803219273686409,
-0.003932085353881121,
-0.11435839533805847,
0.0880352109670639,
-0.00047644629376009107,
-0.08159573376178741,
0.08370804041624069,
0.18707412481307983,
-0.07471893727779388,
0.03140408918261528,
0.07512682676315308,
-0.05501144751906395,
-0.17231528460979462,
-0.047557901591062546,
0.03721771761775017,
0.07237208634614944,
-0.019632404670119286,
-0.15781892836093903,
0.05885903909802437,
0.012833667919039726,
0.01870022527873516,
0.03313648700714111,
-0.28528672456741333,
-0.1267007440328598,
0.017235545441508293,
0.06738146394491196,
0.026555033400654793,
-0.10991673916578293,
-0.03982393816113472,
-0.060947299003601074,
-0.05433323234319687,
0.03695644438266754,
0.06922632455825806,
0.1109548881649971,
-0.038272175937891006,
0.05281589552760124,
0.047318555414676666,
-0.02881196141242981,
0.07138299196958542,
-0.026193691417574883,
0.08694813400506973,
-0.019136862829327583,
0.018155323341488838,
0.03465355560183525,
-0.06036799028515816,
0.1823493242263794,
-0.1612682342529297,
0.1007809042930603,
-0.20082277059555054,
-0.04186559095978737,
-0.02004081942141056,
0.0005226510693319142,
-0.040561847388744354,
-0.055133894085884094,
-0.12093060463666916,
0.02088492549955845,
0.05756509676575661,
-0.036418166011571884,
0.046649057418107986,
-0.019793465733528137,
-0.038130540400743484,
0.062229789793491364,
0.05779208987951279,
0.015066174790263176,
-0.16036206483840942,
0.030986154451966286,
0.014788074418902397,
0.08149975538253784,
-0.1921256184577942,
0.018310727551579475,
0.11163853108882904,
0.027957528829574585,
0.09260524064302444,
0.004011843353509903,
-0.08784174174070358,
0.027411507442593575,
0.07133417576551437,
-0.06785624474287033,
-0.09767097979784012,
-0.010405994951725006,
-0.052973225712776184,
-0.10126754641532898,
0.02241344377398491,
0.09402898699045181,
-0.05444291606545448,
-0.014945735223591328,
-0.005367901176214218,
0.01679888181388378,
-0.07395246624946594,
0.17715385556221008,
0.019417976960539818,
0.07481095939874649,
-0.06595882028341293,
0.07037719339132309,
0.10307977348566055,
-0.11015424877405167,
0.015800992026925087,
0.16354583203792572,
-0.08202138543128967,
-0.022999249398708344,
0.07984878867864609,
0.08726152032613754,
-0.029734265059232712,
-0.04448186978697777,
-0.08162346482276917,
-0.06831532716751099,
0.014352708123624325,
0.014068897813558578,
0.07044357806444168,
0.08066847920417786,
-0.036451078951358795,
0.0026110513135790825,
-0.12730388343334198,
0.09949202835559845,
0.07122913002967834,
0.05135239288210869,
-0.1443762332201004,
0.14832940697669983,
0.0395490787923336,
0.07756917178630829,
0.004178448114544153,
0.02308034338057041,
-0.10956250876188278,
0.03630971908569336,
-0.023630090057849884,
0.030197342857718468,
-0.002601931570097804,
0.05313241854310036,
-0.033005159348249435,
0.038407862186431885,
-0.02760879322886467,
0.040741514414548874,
-0.04238616302609444,
-0.03089238703250885,
-0.03334742784500122,
0.018757814541459084,
-0.05996270105242729,
-0.007949106395244598,
0.013052255846560001,
-0.08638562262058258,
0.09688802063465118,
-0.0547950379550457,
-0.0037075225263834,
0.007254639640450478,
0.023127276450395584,
0.060960255563259125,
0.018750054761767387,
0.04275888204574585,
-0.01620395854115486,
-0.01174702774733305,
0.029067061841487885,
0.009792204946279526,
-0.017327532172203064,
0.0030962983146309853,
0.0852765440940857,
-0.15188466012477875,
-0.08237665146589279,
-0.08047755807638168,
-0.08187396824359894,
-0.058359432965517044,
0.06571270525455475,
0.07078671455383301,
0.07499366253614426,
0.09552786499261856,
-0.041199952363967896,
0.01945008896291256,
-0.14744757115840912,
-0.05000952258706093,
0.04555117338895798,
0.00005783526648883708,
-0.09530159085988998,
-0.03512583300471306,
0.054487623274326324,
-0.03825007751584053,
0.12953540682792664,
-0.022147737443447113,
0.050410471856594086,
-0.008866910822689533,
-0.043285686522722244,
-0.02454577572643757,
0.0013431478291749954,
0.17957469820976257,
-0.10288883000612259,
0.0029561067931354046,
-0.013167515397071838,
0.002975607756525278,
0.032965727150440216,
0.1755613386631012,
0.09032835066318512,
0.1163255050778389,
0.04154599457979202,
0.06690101325511932,
-0.05139122158288956,
-0.031519122421741486,
-0.13998274505138397,
0.04132732376456261,
-0.02687983587384224,
0.05014115571975708,
-0.040123358368873596,
0.14569434523582458,
0.10607834905385971,
-0.129838764667511,
0.10241999477148056,
0.014808062463998795,
-0.09188768267631531,
-0.044604767113924026,
-0.07974155247211456,
-0.041935406625270844,
-0.09718794375658035,
0.004008202347904444,
-0.09926432371139526,
0.015762504190206528,
0.0778723806142807,
0.03446941822767258,
-0.028019648045301437,
0.16395044326782227,
-0.03177649527788162,
-0.0642414316534996,
0.025493314489722252,
0.05230557173490524,
0.028014104813337326,
0.110392726957798,
0.027362100780010223,
0.050770457834005356,
-0.07317119091749191,
0.07432273775339127,
0.036307837814092636,
-0.007782327942550182,
0.02052067406475544,
0.008340466767549515,
-0.00649484945461154,
-0.04767608270049095,
-0.000832104473374784,
0.07475944608449936,
0.16536666452884674,
0.044208791106939316,
-0.04883236810564995,
-0.05903686210513115,
0.20488567650318146,
-0.056359611451625824,
-0.056089796125888824,
-0.12840259075164795,
0.18136079609394073,
0.0362652949988842,
0.006628748029470444,
0.015518913976848125,
-0.08028433471918106,
-0.027002274990081787,
0.23277558386325836,
0.06547088921070099,
-0.028716115280985832,
-0.022151434794068336,
0.0003958659654017538,
-0.008093979209661484,
-0.026590751484036446,
0.14898638427257538,
-0.0034330266062170267,
0.2419901043176651,
0.011321856640279293,
-0.004636536352336407,
-0.04233367368578911,
-0.040184032171964645,
-0.031503356993198395,
0.21055227518081665,
-0.03491942957043648,
0.02186673693358898,
-0.10131777077913284,
-0.0022812820971012115,
0.028353175148367882,
-0.12105026096105576,
0.11665359139442444,
-0.1267704963684082,
-0.07757676392793655,
0.021720336750149727,
0.055843934416770935,
-0.04249299317598343,
0.04853774607181549,
-0.023936139419674873,
0.04966229200363159,
0.045442845672369,
-0.029926441609859467,
-0.10796575248241425,
-0.15606962144374847,
0.05456846207380295,
0.004467152990400791,
0.14166629314422607,
0.021591896191239357,
0.06955555826425552,
0.08398241549730301,
0.0045018987730145454,
-0.07599184662103653,
0.08597254008054733,
0.03716311603784561,
-0.01817130111157894,
0.05184146389365196,
0.128005713224411,
-0.04548967257142067,
0.1447393000125885,
0.012898229993879795,
-0.010600944980978966,
-0.023884758353233337,
-0.03363285958766937,
0.005377480294555426,
-0.14639128744602203,
-0.010938886553049088,
-0.06337827444076538,
0.1329718977212906,
0.20651943981647491,
-0.04559687152504921,
-0.018896840512752533,
-0.05139659717679024,
0.07626114040613174,
-0.005800872575491667,
0.08107811212539673,
0.005397809203714132,
-0.17213174700737,
0.011231733486056328,
-0.01880878023803234,
0.005035923328250647,
-0.19310294091701508,
-0.0557919479906559,
-0.029519077390432358,
-0.021540388464927673,
-0.10027951747179031,
0.15573900938034058,
0.05299884080886841,
0.019689347594976425,
-0.03702664002776146,
-0.13652801513671875,
-0.012759359553456306,
0.04811821132898331,
-0.12716349959373474,
-0.11966308951377869
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation python
Pretrained model on programming language python using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized python code functions: it works best with tokenized python functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the python function/method.
## Intended uses & limitations
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/python/small_model.ipynb).
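If you prefer not to use the pipeline wrapper, the same checkpoint can be driven through `generate` directly. This is a minimal sketch; the generation arguments (beam size, maximum length) are illustrative choices, not the settings used to produce the reported results:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

model_name = "SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

tokenized_code = "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"
input_ids = tokenizer(tokenized_code, return_tensors="pt").input_ids
# Beam search over the encoder-decoder; decoded text is the docstring.
output_ids = model.generate(input_ids, max_length=64, num_beams=4, early_stopping=True)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```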
## Training data
The supervised training task datasets can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def e ( message , exit_code = None ) : print_log ( message , YELLOW , BOLD ) if exit_code is not None : sys . exit ( exit_code )"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_python_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation python
========================================================
Pretrained model on programming language python using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized python code functions: it works best with tokenized python functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It is then fine-tuned on the code documentation generation task for the python function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the python function or be fine-tuned on other python code tasks. It can be used on unparsed and untokenized python code. However, if the python code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate python function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training task datasets can be downloaded from Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing python code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate python function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing python code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09859102964401245,
0.07669943571090698,
-0.0006934590637683868,
0.10037070512771606,
0.040964946150779724,
0.029198219999670982,
0.03281877562403679,
0.10960274189710617,
-0.04137900471687317,
0.053839873522520065,
0.060164324939250946,
-0.06623684614896774,
0.06069529801607132,
0.18122780323028564,
0.020331593230366707,
-0.15199953317642212,
-0.025784151628613472,
0.038148995488882065,
-0.0714026466012001,
0.11465974897146225,
0.07227574288845062,
-0.0762271136045456,
0.07237900048494339,
-0.04187599942088127,
-0.12680773437023163,
0.04332587867975235,
-0.02431875467300415,
-0.019234076142311096,
0.09982011467218399,
0.06444841623306274,
0.13131681084632874,
-0.03400580957531929,
0.06990525126457214,
-0.2028551548719406,
0.00014947923773434013,
0.029045237228274345,
0.06057986989617348,
0.03924686461687088,
0.04115908220410347,
0.07142124325037003,
0.14161112904548645,
-0.014821797609329224,
0.05613664537668228,
0.048792097717523575,
-0.06717025488615036,
-0.062346383929252625,
-0.07488804310560226,
0.0774296447634697,
0.07716967165470123,
0.10403434932231903,
-0.0029429662972688675,
0.05474909394979477,
-0.06673161685466766,
0.09518301486968994,
0.11718988418579102,
-0.22383399307727814,
-0.022594913840293884,
0.1063380017876625,
0.08844969421625137,
0.03602416440844536,
-0.07002940028905869,
-0.0440957173705101,
0.10188465565443039,
0.04617857187986374,
0.04927074909210205,
-0.08803993463516235,
-0.04143594950437546,
-0.015517216175794601,
-0.059576600790023804,
-0.055453766137361526,
0.17208527028560638,
0.02582581713795662,
-0.06098392605781555,
-0.09487468004226685,
-0.04994693771004677,
-0.1951928585767746,
0.03594835847616196,
0.02186746336519718,
0.0030450569465756416,
-0.008554053492844105,
-0.00345915206708014,
-0.007021714001893997,
-0.0924060195684433,
-0.10541430115699768,
0.013445496559143066,
0.038425907492637634,
0.06248266622424126,
0.035980530083179474,
-0.03380916267633438,
0.09300234168767929,
0.026694636791944504,
-0.03701974079012871,
-0.011516745202243328,
0.016088129952549934,
-0.11126076430082321,
0.0016767617780715227,
-0.011196189559996128,
-0.06508904695510864,
-0.01364437397569418,
0.08299265056848526,
-0.1023440733551979,
0.07347654551267624,
0.09125031530857086,
0.010368173010647297,
0.017681891098618507,
0.2099159061908722,
0.042262472212314606,
-0.14488714933395386,
0.023870011791586876,
0.01804845966398716,
-0.004682880360633135,
0.010062522254884243,
-0.04865183308720589,
-0.03961764648556709,
0.029243309050798416,
0.06783892214298248,
-0.10711585730314255,
0.032320406287908554,
-0.050596244633197784,
-0.012925617396831512,
0.05300876870751381,
-0.12728385627269745,
0.03517681732773781,
0.007958534173667431,
-0.05312184989452362,
-0.02038058079779148,
0.10131561011075974,
-0.13009904325008392,
-0.11760449409484863,
0.04333438724279404,
-0.045686930418014526,
-0.03645973280072212,
-0.11478565633296967,
-0.12263738363981247,
-0.004961755592375994,
0.0017659574514254928,
-0.0017468510195612907,
-0.09847697615623474,
-0.09099651128053665,
-0.013021349906921387,
0.05131267383694649,
0.005718535743653774,
-0.03067714534699917,
-0.0328962616622448,
0.0006063917535357177,
0.00030287864501588047,
-0.0158393494784832,
0.011214712634682655,
-0.03194340318441391,
0.10140301287174225,
0.07244187593460083,
0.031637195497751236,
-0.02179497294127941,
0.023510025814175606,
-0.08072618395090103,
0.07562492042779922,
-0.08315092325210571,
0.04892423376441002,
-0.007574727758765221,
0.06634484231472015,
-0.10055305808782578,
-0.08212628960609436,
0.01003948226571083,
0.05265862122178078,
0.07095588743686676,
0.03564474359154701,
-0.13125774264335632,
0.029523862525820732,
0.15423285961151123,
-0.11400265246629715,
-0.13581521809101105,
0.10455869883298874,
-0.015201173722743988,
0.037954043596982956,
0.07223185896873474,
0.1094655841588974,
0.15112431347370148,
-0.10107018053531647,
-0.0311740692704916,
0.0773821473121643,
0.056951623409986496,
-0.059783339500427246,
0.05992329865694046,
0.01784050650894642,
-0.016366397961974144,
0.021329043433070183,
0.05307001993060112,
0.06447520107030869,
-0.00335254636593163,
-0.035226911306381226,
-0.033858757466077805,
-0.09030855447053909,
-0.053993839770555496,
-0.009781471453607082,
0.017759956419467926,
-0.05097460374236107,
-0.07065955549478531,
0.020472895354032516,
0.1790362298488617,
-0.10107041895389557,
0.032298702746629715,
-0.07719437777996063,
-0.030697520822286606,
-0.06854653358459473,
0.01828712970018387,
-0.1229226216673851,
0.014927428215742111,
0.04948699474334717,
-0.023408545181155205,
0.05859728902578354,
0.0772351622581482,
0.0000075076363827975,
0.01580224186182022,
-0.0549132376909256,
-0.03230646997690201,
-0.033989232033491135,
-0.0682017058134079,
-0.10948827117681503,
-0.02679533138871193,
-0.08701270818710327,
-0.02668689750134945,
-0.0338689349591732,
-0.1743304282426834,
-0.001611629850231111,
-0.00042926755850203335,
0.022615743800997734,
0.020790359005331993,
-0.03729034587740898,
0.01881171576678753,
0.050231873989105225,
-0.053066473454236984,
-0.08034766465425491,
0.013739489018917084,
0.043971408158540726,
-0.08800583332777023,
-0.04572588950395584,
-0.10349248349666595,
-0.07551167160272598,
0.08590131998062134,
0.11153901368379593,
-0.13216888904571533,
0.00028589210705831647,
-0.02486921288073063,
-0.04059090092778206,
-0.047304220497608185,
-0.05796312168240547,
0.17053836584091187,
0.013321828097105026,
0.16625143587589264,
-0.13394510746002197,
-0.06673786789178848,
-0.03455767408013344,
0.006176521070301533,
0.03097330592572689,
0.13887296617031097,
0.014343183487653732,
-0.10853761434555054,
0.036688435822725296,
-0.05149416998028755,
-0.057880502194166183,
0.14941255748271942,
... (remainder of a 768-dimensional embedding vector omitted) ...
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus ruby dataset.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/function%20documentation%20generation/ruby/small_model.ipynb).
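The pipeline above is fed an already-tokenized function. If you are starting from raw ruby source, a rough whitespace/punctuation split can approximate the expected input format. This is only a sketch: the exact preprocessing used to build the training corpus is not reproduced here, and `rough_tokenize_ruby` is a hypothetical helper.
```python
import re

def rough_tokenize_ruby(source: str) -> str:
    # Split identifiers/keywords from punctuation and rejoin with single
    # spaces, so the input resembles the space-separated functions the
    # model was trained on. A crude approximation, not the original tokenizer.
    return " ".join(re.findall(r"\w+|[^\w\s]", source))

raw_code = "def add(a, b)\n  a + b\nend"
print(rough_tokenize_ruby(raw_code))
# => def add ( a , b ) a + b end
```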
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_ruby
|
[
"transformers",
"pytorch",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the CodeSearchNet Corpus ruby dataset.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
43,
112
] |
[
"passage: TAGS\n#transformers #pytorch #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ ... (768-dimensional embedding vector omitted) ... ] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/function%20documentation%20generation/ruby/small_model.ipynb).
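Note that `AutoModelWithLMHead` is deprecated in recent releases of `transformers` and removed in the newest ones. On such versions, an equivalent pipeline can be built with `AutoModelForSeq2SeqLM`; this is a sketch assuming the same checkpoint, not a change to the published example:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, SummarizationPipeline

model_name = "SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask"
pipeline = SummarizationPipeline(
    model=AutoModelForSeq2SeqLM.from_pretrained(model_name),
    tokenizer=AutoTokenizer.from_pretrained(model_name, skip_special_tokens=True),
    device=-1,  # -1 runs on CPU; use 0 (or another CUDA index) if a GPU is available
)
pipeline(["def add ( a , b ) a + b end"])
```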
## Training data
The datasets for the supervised training tasks can be downloaded from [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
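For illustration, a T5-style inverse square-root schedule can be written as a small function. The 10,000-step warm-up below follows the original T5 recipe and is an assumption; the exact warm-up used for CodeTrans is not stated here.
```python
import math

def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant rate during warm-up, then decaying as 1/sqrt(step).
    return 1.0 / math.sqrt(max(step, warmup_steps))

print(inverse_sqrt_lr(420_000))  # rate at the end of the 420,000-step run, ~1.5e-3
```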
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The datasets for the supervised training tasks can be downloaded from Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with an inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results:
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
50,
61,
143
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 420,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ ... (768-dimensional embedding vector omitted) ... ] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the ruby function/method.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/function%20documentation%20generation/ruby/small_model.ipynb).
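The pipeline returns one dict per input, with the generated documentation under the `summary_text` key. A minimal continuation of the example above (the exact printed text will vary with the model version):
```python
result = pipeline([tokenized_code])
# Each element is a dict like {"summary_text": "..."}
print(result[0]["summary_text"])
```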
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
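For intuition, the inverse square root schedule holds the learning rate through a warmup phase and then decays it proportionally to 1/√step. A minimal sketch in plain Python — the `peak_lr` and `warmup_steps` values below are illustrative assumptions, not figures reported for CodeTrans:
```python
import math

def inverse_sqrt_lr(step, peak_lr=0.01, warmup_steps=10_000):
    """Hold peak_lr during warmup, then decay proportional to 1/sqrt(step)."""
    step = max(step, 1)
    if step < warmup_steps:
        return peak_lr
    return peak_lr * math.sqrt(warmup_steps / step)

# Learning rate at a few points of a half-million-step run
for s in (1_000, 50_000, 250_000, 500_000):
    print(s, round(inverse_sqrt_lr(s), 5))
```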
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), on the dataset containing only ruby code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_ruby_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), on the dataset containing only ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 2,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.08953233063220978,
0.07327625155448914,
-0.0021869547199457884,
0.127711221575737,
0.051502954214811325,
0.02193749137222767,
0.04921646788716316,
0.11496589332818985,
-0.037982795387506485,
0.06875196099281311,
0.061658184975385666,
-0.04380218684673309,
0.04829785227775574,
0.15776224434375763,
0.017640599980950356,
-0.15554210543632507,
-0.045057572424411774,
0.013169865123927593,
-0.07569819688796997,
0.1047927513718605,
0.08444325625896454,
-0.07777979969978333,
0.07223953306674957,
-0.05081033334136009,
-0.1411772519350052,
0.043864328414201736,
-0.024600407108664513,
0.0009313329355791211,
0.09877564013004303,
0.04886409267783165,
0.10878810286521912,
-0.018178649246692657,
0.05817160755395889,
-0.20863699913024902,
0.0026901536621153355,
0.026968171820044518,
0.06566661596298218,
0.037538234144449234,
0.07518620043992996,
0.07301810383796692,
0.10265753418207169,
-0.026611898094415665,
0.04593522846698761,
0.06279993802309036,
-0.06302638351917267,
-0.07252753525972366,
-0.08989223092794418,
0.0914304330945015,
0.06775107234716415,
0.08336988091468811,
-0.006549705285578966,
0.022176304832100868,
-0.07148046046495438,
0.08926532417535782,
0.1408882737159729,
-0.24836906790733337,
-0.021535305306315422,
0.11625640094280243,
0.07045452296733856,
0.05113653093576431,
-0.06885954737663269,
-0.02876809611916542,
0.09967243671417236,
0.044454313814640045,
0.035995904356241226,
-0.09571173042058945,
-0.030516192317008972,
-0.0012157905148342252,
-0.058007191866636276,
-0.05227960646152496,
0.12188215553760529,
0.03264877572655678,
-0.06411635875701904,
-0.10726926475763321,
-0.045468270778656006,
-0.1967257708311081,
0.038997016847133636,
0.017041178420186043,
0.0131001565605402,
-0.0021772137843072414,
0.007122998125851154,
-0.02416078932583332,
-0.08297104388475418,
-0.12068517506122589,
0.04796682298183441,
0.027341295033693314,
0.07035797089338303,
0.04341721162199974,
-0.02172180637717247,
0.08717050403356552,
0.021098731085658073,
-0.040393173694610596,
-0.01979711279273033,
0.005360555835068226,
-0.1146775558590889,
0.031226936727762222,
-0.010628730058670044,
-0.03926241025328636,
0.011104132980108261,
0.0923173800110817,
-0.09419559687376022,
0.08687296509742737,
0.08438695967197418,
0.023245908319950104,
-0.00395679846405983,
0.20652741193771362,
0.06227429583668709,
-0.14011603593826294,
0.02411670610308647,
0.015431673265993595,
-0.002866300055757165,
0.009622125886380672,
-0.06012898311018944,
-0.05754859000444412,
0.03716709092259407,
0.07137562334537506,
-0.11749795079231262,
0.006693542934954166,
-0.06163371354341507,
-0.011556107550859451,
0.08830522745847702,
-0.1080293282866478,
0.04378150403499603,
0.018075676634907722,
-0.058257367461919785,
-0.04311513528227806,
0.057170066982507706,
-0.11421925574541092,
-0.11238107085227966,
0.039246637374162674,
-0.02846030332148075,
-0.02408403903245926,
-0.10973828285932541,
-0.10605931282043457,
-0.013261513784527779,
-0.06309260427951813,
0.009760314598679543,
-0.10644116997718811,
-0.10121305286884308,
-0.029183654114603996,
0.032026201486587524,
-0.018733737990260124,
-0.041144296526908875,
-0.03790942206978798,
0.011704148724675179,
-0.008909435011446476,
-0.015535985119640827,
0.020137213170528412,
-0.022698329761624336,
0.0907769724726677,
0.08076610416173935,
0.03433062508702278,
0.0021781413815915585,
0.022214222699403763,
-0.07896463572978973,
0.07778415083885193,
-0.09708268195390701,
0.07505348324775696,
-0.007641254458576441,
0.06028695032000542,
-0.10736767202615738,
-0.08046268671751022,
0.01797596737742424,
0.04529012739658356,
0.06798673421144485,
0.03953845053911209,
-0.13678552210330963,
0.029459470883011818,
0.16096250712871552,
-0.10710886865854263,
-0.11522451043128967,
0.1264798641204834,
-0.01094450056552887,
0.0019578703213483095,
0.0679452121257782,
0.1404307782649994,
0.14437668025493622,
-0.06782867759466171,
-0.037261586636304855,
0.06483614444732666,
0.05245869234204292,
-0.06241978332400322,
0.07756523042917252,
0.020474527031183243,
-0.007470983080565929,
0.02546127699315548,
0.061238836497068405,
0.0334310419857502,
0.002677851589396596,
-0.034960608929395676,
-0.04530293866991997,
-0.09355965256690979,
-0.01297706738114357,
-0.012007750570774078,
0.029137808829545975,
-0.06047852337360382,
-0.06801401078701019,
-0.030967416241765022,
0.17412593960762024,
-0.0804930180311203,
0.032533057034015656,
-0.08115825057029724,
-0.029167287051677704,
-0.05058138445019722,
0.01815914176404476,
-0.13374978303909302,
0.0643942803144455,
0.07263100147247314,
-0.005680982489138842,
0.06537371128797531,
0.09335765242576599,
0.012011679820716381,
0.02633342146873474,
-0.0622662715613842,
-0.052733294665813446,
-0.049073126167058945,
-0.08112359791994095,
-0.11226662993431091,
-0.03255300596356392,
-0.08805195242166519,
-0.034739330410957336,
-0.02890469878911972,
-0.18362994492053986,
0.002119175624102354,
0.009458948858082294,
0.0360289104282856,
0.03844762220978737,
-0.04081385210156441,
0.024542609229683876,
0.05398184433579445,
-0.04180343076586723,
-0.07859262824058533,
0.02816588059067726,
0.04250916838645935,
-0.07158907502889633,
-0.0450458899140358,
-0.06330002844333649,
-0.07376877963542938,
0.06147165223956108,
0.09823296964168549,
-0.1126355528831482,
-0.015475459396839142,
-0.019603027030825615,
-0.03862309083342552,
-0.048093050718307495,
-0.045055076479911804,
0.20367373526096344,
0.019244756549596786,
0.16265302896499634,
-0.1317521631717682,
-0.04798240214586258,
-0.025379925966262817,
-0.0053249322809278965,
0.04168649762868881,
0.16029198467731476,
-0.007936670444905758,
-0.11771494150161743,
0.034694839268922806,
-0.054094284772872925,
-0.06060997396707535,
0.14142915606498718,
-0.017817629501223564,
-0.05527767166495323,
-0.00801795907318592,
0.11675380170345306,
0.005488659255206585,
0.1490054577589035,
-0.04249338060617447,
0.0015166044468060136,
-0.0069260308519005775,
0.012950708158314228,
0.03407808393239975,
-0.13711172342300415,
0.027109939604997635,
0.03868066519498825,
-0.06202661618590355,
-0.021027186885476112,
-0.024833494797348976,
-0.039676323533058167,
0.03915124759078026,
0.022367579862475395,
0.03123677335679531,
-0.01442513708025217,
-0.02679711952805519,
-0.10030452907085419,
0.17150241136550903,
-0.07505921274423599,
-0.1870260089635849,
-0.1639523059129715,
0.07191967219114304,
-0.030878733843564987,
-0.01927821710705757,
0.039411261677742004,
-0.10944714397192001,
-0.03361335024237633,
-0.09038934856653214,
0.11656953394412994,
-0.1332118958234787,
0.004586163442581892,
-0.02492273971438408,
0.07795866578817368,
0.04943705350160599,
-0.1494244486093521,
0.02376154623925686,
-0.016398433595895767,
0.013478636741638184,
-0.0004850872210226953,
-0.046070586889982224,
0.0640861839056015,
0.10932064056396484,
-0.06522900611162186,
0.02008083648979664,
-0.004652749747037888,
0.17338697612285614,
-0.04169644042849541,
0.027922989800572395,
0.20341144502162933,
0.0075680878944695,
0.04287386313080788,
0.06165611371397972,
0.017156150192022324,
-0.08368680626153946,
0.0600496381521225,
0.04351721704006195,
-0.0314808189868927,
-0.24151216447353363,
-0.01595240831375122,
-0.07477526366710663,
0.07515808194875717,
0.11745679378509521,
0.06280364096164703,
-0.1585097461938858,
0.022911934182047844,
-0.0016390759265050292,
0.15011759102344513,
-0.035523075610399246,
0.060791999101638794,
-0.0057310499250888824,
0.017262909561395645,
0.004674152471125126,
-0.10010422021150589,
0.004828691482543945,
0.08066683262586594,
0.11295788735151291,
0.2136872112751007,
-0.08797106891870499,
0.15057314932346344,
0.011189762502908707,
0.11388202011585236,
0.05473138764500618,
0.07997261732816696,
-0.12754623591899872,
0.007105052471160889,
-0.002351256087422371,
-0.01329846028238535,
-0.07128432393074036,
0.05402383208274841,
-0.018058879300951958,
0.054297130554914474,
-0.05115886032581329,
0.02243085578083992,
0.015547000803053379,
0.214668869972229,
0.08787811547517776,
-0.15188726782798767,
-0.11454035341739655,
0.011734366416931152,
-0.08661261200904846,
-0.09798508882522583,
0.06640689074993134,
0.19011595845222473,
-0.07068026065826416,
0.021167583763599396,
-0.025147248059511185,
0.13245950639247894,
-0.12285678088665009,
-0.0191793292760849,
0.022768434137105942,
0.053165510296821594,
0.011957386508584023,
0.11231887340545654,
-0.26166465878486633,
0.07675941288471222,
0.015452966094017029,
0.0841602236032486,
-0.0075745964422822,
0.06551670283079147,
-0.04275435209274292,
0.002695600502192974,
0.07812957465648651,
0.009942607954144478,
-0.08307009935379028,
-0.1982623189687729,
-0.036052726209163666,
0.011244846507906914,
0.06963223218917847,
-0.011748774908483028,
0.09013988822698593,
-0.03295397758483887,
0.04552562162280083,
-0.04007826745510101,
-0.1337222456932068,
-0.06711306422948837,
-0.13292045891284943,
-0.035306863486766815,
-0.02337348461151123,
-0.06750086694955826,
-0.02922249212861061,
0.04511984437704086,
0.05723651126027107,
0.2082115262746811,
-0.15331299602985382,
-0.07707121968269348,
-0.08824850618839264,
0.06395865231752396,
0.11522584408521652,
-0.0985453724861145,
0.020860852673649788,
0.011021113954484463,
0.04914625734090805,
-0.04318608343601227,
-0.06040928140282631,
0.016039269044995308,
-0.05178651213645935,
-0.10682342946529388,
-0.03429779037833214,
0.10183706134557724,
-0.03120247833430767,
0.05251637473702431,
0.001779451733455062,
-0.06163515895605087,
-0.037154827266931534,
-0.115791454911232,
-0.05031720921397209,
-0.03511704131960869,
0.027659472078084946,
0.00023896693892311305,
-0.1002940684556961,
0.09340761601924896,
-0.017092067748308182,
-0.09364169836044312,
0.10128721594810486,
0.20834918320178986,
-0.0844993144273758,
0.03303459286689758,
0.07026734203100204,
-0.05122258886694908,
-0.18624477088451385,
-0.05376426875591278,
0.05579449236392975,
0.06986497342586517,
-0.011634827591478825,
-0.15669791400432587,
0.04033445566892624,
-0.009403401985764503,
0.008233398199081421,
-0.00164587062317878,
-0.26446476578712463,
-0.11895275115966797,
-0.011737463995814323,
0.069108746945858,
0.06596807390451431,
-0.0983894094824791,
-0.05370141938328743,
-0.0635996088385582,
-0.029398471117019653,
0.02925335243344307,
0.06789351999759674,
0.11317422240972519,
-0.045322220772504807,
0.018443873152136803,
0.04657094180583954,
-0.022831616923213005,
0.06370877474546432,
-0.04327073320746422,
0.10125745087862015,
-0.008182965219020844,
-0.011208211071789265,
0.032201774418354034,
-0.06279122829437256,
0.15401151776313782,
-0.18558438122272491,
0.10960161685943604,
-0.18223653733730316,
-0.042119357734918594,
-0.006178941577672958,
-0.022991502657532692,
-0.03411078080534935,
-0.048246681690216064,
-0.11362289637327194,
0.030743345618247986,
0.04262499138712883,
-0.025550438091158867,
0.04224352166056633,
-0.007778675761073828,
-0.06323415040969849,
0.1186150312423706,
0.057464491575956345,
0.042313531041145325,
-0.14501869678497314,
0.020145738497376442,
0.017255788668990135,
0.07921773940324783,
-0.20381681621074677,
0.024073513224720955,
0.1027270033955574,
0.022379353642463684,
0.09767596423625946,
0.006254789885133505,
-0.08946989476680756,
0.041133634746074677,
0.06638478487730026,
-0.05601233243942261,
-0.125308558344841,
-0.01626889780163765,
-0.01777837797999382,
-0.09295325726270676,
0.035502489656209946,
0.10324820876121521,
-0.06167345494031906,
-0.025736508890986443,
-0.0040110195986926556,
0.020156703889369965,
-0.08225040137767792,
0.18517227470874786,
0.024265529587864876,
0.08464377373456955,
-0.06161678954958916,
0.08235033601522446,
0.09626172482967377,
-0.06894146651029587,
0.024322841316461563,
0.15832297503948212,
-0.07832109928131104,
-0.030667226761579514,
0.0865660011768341,
0.12279070168733597,
-0.003862170036882162,
-0.047645408660173416,
-0.11219830065965652,
-0.0708395391702652,
0.017137648537755013,
-0.012678862549364567,
0.0787094384431839,
0.07640755921602249,
-0.03960300236940384,
-0.004307781811803579,
-0.1082875207066536,
0.09807782620191574,
0.08497379720211029,
0.053382258862257004,
-0.16756974160671234,
0.11588890105485916,
0.03200191259384155,
0.089556485414505,
0.005560068413615227,
0.03405274078249931,
-0.10842764377593994,
0.036433037370443344,
-0.029312118887901306,
0.051876913756132126,
0.02696610987186432,
0.05244242399930954,
-0.037929434329271317,
0.042721688747406006,
-0.027825355529785156,
0.04690397158265114,
-0.04680277407169342,
-0.03127282112836838,
-0.03862440213561058,
0.03432361036539078,
-0.05098375678062439,
-0.026377631351351738,
0.01514117419719696,
-0.07997985929250717,
0.10383229702711105,
-0.06425663083791733,
-0.015415998175740242,
0.0013448791578412056,
0.03752917796373367,
0.05751274153590202,
0.016517065465450287,
0.04687939211726189,
-0.021309401839971542,
-0.002670299494639039,
0.028801996260881424,
-0.0033692645374685526,
-0.008784998208284378,
0.0009265356347896159,
0.10697609186172485,
-0.13897529244422913,
-0.08212994784116745,
-0.10963990539312363,
-0.08208513259887695,
-0.06269218027591705,
0.08437500894069672,
0.07473472505807877,
0.099211186170578,
0.10251964628696442,
-0.040403492748737335,
0.02063026838004589,
-0.14371775090694427,
-0.03584900498390198,
0.05187438800930977,
-0.01687769591808319,
-0.13161341845989227,
-0.051825542002916336,
0.05496426671743393,
-0.023981790989637375,
0.10521509498357773,
0.0009049462387338281,
0.027213474735617638,
-0.017303025349974632,
-0.04419755935668945,
-0.05810093507170677,
0.006114371120929718,
0.15779654681682587,
-0.10358886420726776,
0.003944334574043751,
-0.009094302542507648,
-0.0007121398812159896,
0.024659128859639168,
0.19183452427387238,
0.1060413271188736,
0.1756087839603424,
0.0514429546892643,
0.06868351995944977,
-0.043722521513700485,
-0.015137679874897003,
-0.13764497637748718,
0.08634360134601593,
-0.026401551440358162,
0.03137835115194321,
-0.06211693584918976,
0.18542347848415375,
0.0995832309126854,
-0.13175034523010254,
0.10227590054273605,
0.005611996632069349,
-0.08447493612766266,
-0.051398247480392456,
-0.07343081384897232,
-0.055750951170921326,
-0.12425841391086578,
0.006628108210861683,
-0.0890888124704361,
0.011123945005238056,
0.08034885674715042,
0.021042095497250557,
-0.023352552205324173,
0.11241337656974792,
-0.024229397997260094,
-0.04468429088592529,
0.04039005562663078,
0.03223424777388573,
0.01650208793580532,
0.1219014897942543,
0.018618395552039146,
0.06153261289000511,
-0.07167056947946548,
0.08213706314563751,
0.032476022839546204,
-0.012669721618294716,
0.0053296517580747604,
0.01968567445874214,
-0.01962890289723873,
-0.041480597108602524,
-0.005320066586136818,
0.08129574358463287,
0.17281711101531982,
0.04342462494969368,
-0.04227697104215622,
-0.05287507548928261,
0.21397727727890015,
-0.056453343480825424,
-0.04955388605594635,
-0.11048649251461029,
0.15310271084308624,
0.03936344385147095,
0.021431295201182365,
0.021738028153777122,
-0.07848721742630005,
-0.027341710403561592,
0.22051101922988892,
0.06441689282655716,
-0.03807951509952545,
-0.026653824374079704,
0.012215442024171352,
-0.005642877891659737,
-0.04218343645334244,
0.15512323379516602,
-0.0017038281075656414,
0.20254088938236237,
0.0025234816130250692,
0.001975459046661854,
-0.033204030245542526,
-0.0477462001144886,
-0.021943753585219383,
0.18913383781909943,
-0.04043622314929962,
0.03143720701336861,
-0.10138443112373352,
-0.006186546292155981,
0.010406754910945892,
-0.1438586413860321,
0.12603303790092468,
-0.13621363043785095,
-0.08499466627836227,
0.02371940203011036,
0.07359959930181503,
-0.03860955685377121,
0.06414110213518143,
-0.020453037694096565,
0.061403557658195496,
0.029542753472924232,
-0.03279988840222359,
-0.0962589681148529,
-0.1353517323732376,
0.05093366652727127,
-0.015397278591990471,
0.13052205741405487,
0.007581683341413736,
0.09104008972644806,
0.09512700885534286,
0.003813359886407852,
-0.09041428565979004,
0.06832710653543472,
0.026219086721539497,
-0.016647854819893837,
0.05332161486148834,
0.14011503756046295,
-0.04557637870311737,
0.13157851994037628,
0.023065539076924324,
-0.020877137780189514,
-0.030413130298256874,
-0.007901265285909176,
0.0007159123779274523,
-0.1658044010400772,
0.013146664947271347,
-0.06910115480422974,
0.12426971644163132,
0.1927815079689026,
-0.04767422378063202,
-0.012876522727310658,
-0.04436406493186951,
0.07164863497018814,
-0.008219791576266289,
0.09229029715061188,
0.004382167477160692,
-0.16693277657032013,
0.013496101833879948,
0.015210204757750034,
0.014315379783511162,
-0.1998635083436966,
-0.06386358290910721,
-0.03536481782793999,
-0.021234216168522835,
-0.10583174973726273,
0.1602933555841446,
0.07157155871391296,
0.020054461434483528,
-0.03046766109764576,
-0.20995360612869263,
-0.03270156309008598,
0.04308011382818222,
-0.11407040804624557,
-0.11806727945804596
] |
null | null |
transformers
|
# CodeTrans model for code documentation generation ruby
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the ruby function/method.
## Intended uses & limitations
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/function%20documentation%20generation/ruby/small_model.ipynb).
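The snippet above assumes a GPU (`device=0`); on a CPU-only machine the same pipeline can be built by passing `device=-1` instead:
```python
cpu_pipeline = SummarizationPipeline(
    model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune"),
    tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune", skip_special_tokens=True),
    device=-1,  # -1 selects the CPU in transformers pipelines
)
```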
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
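A comparable optimizer setup can be reproduced with the `Adafactor` class shipped in `transformers`; with `relative_step=True` it derives an inverse-square-root-style step size internally. This is a sketch of an equivalent configuration, not the authors' original training code:
```python
from transformers import AutoModelWithLMHead
from transformers.optimization import Adafactor, AdafactorSchedule

model = AutoModelWithLMHead.from_pretrained(
    "SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune"
)
optimizer = Adafactor(
    model.parameters(),
    scale_parameter=True,
    relative_step=True,  # internal inverse square root decay
    warmup_init=True,
    lr=None,
)
lr_scheduler = AdafactorSchedule(optimizer)  # proxy schedule, useful for logging the LR
```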
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), on the dataset containing only ruby code.
## Evaluation results
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
| Language / Model | Python | Java | Go | Php | Ruby | JavaScript |
| -------------------- | :------------: | :------------: | :------------: | :------------: | :------------: | :------------: |
| CodeTrans-ST-Small | 17.31 | 16.65 | 16.89 | 23.05 | 9.19 | 13.7 |
| CodeTrans-ST-Base | 16.86 | 17.17 | 17.16 | 22.98 | 8.23 | 13.17 |
| CodeTrans-TF-Small | 19.93 | 19.48 | 18.88 | 25.35 | 13.15 | 17.23 |
| CodeTrans-TF-Base | 20.26 | 20.19 | 19.50 | 25.84 | 14.07 | 18.25 |
| CodeTrans-TF-Large | 20.35 | 20.06 | **19.54** | 26.18 | 14.94 | **18.98** |
| CodeTrans-MT-Small | 19.64 | 19.00 | 19.15 | 24.68 | 14.91 | 15.26 |
| CodeTrans-MT-Base | **20.39** | 21.22 | 19.43 | **26.23** | **15.26** | 16.11 |
| CodeTrans-MT-Large | 20.18 | **21.87** | 19.38 | 26.08 | 15.00 | 16.23 |
| CodeTrans-MT-TF-Small | 19.77 | 20.04 | 19.36 | 25.55 | 13.70 | 17.24 |
| CodeTrans-MT-TF-Base | 19.77 | 21.12 | 18.86 | 25.79 | 14.24 | 18.62 |
| CodeTrans-MT-TF-Large | 18.94 | 21.42 | 18.77 | 26.20 | 14.19 | 18.83 |
| State of the art | 19.06 | 17.65 | 18.07 | 25.16 | 12.16 | 14.90 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "def add ( severity , progname , & block ) return true if io . nil? || severity < level message = format_message ( severity , progname , yield ) MUTEX . synchronize { io . write ( message ) } true end"}]}
|
summarization
|
SEBIS/code_trans_t5_small_code_documentation_generation_ruby_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for code documentation generation ruby
======================================================
Pretrained model on programming language ruby using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized ruby code functions: it works best with tokenized ruby functions.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the code documentation generation task for the ruby function/method.
Intended uses & limitations
---------------------------
The model could be used to generate the description for the ruby function or be fine-tuned on other ruby code tasks. It can be used on unparsed and untokenized ruby code. However, if the ruby code is tokenized, the performance should be better.
### How to use
Here is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for half a million steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 5,000 steps in total, using sequence length 512 (batch size 256), on the dataset containing only ruby code.
Evaluation results
------------------
For the code documentation tasks, different models achieve the following results on different programming languages (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
108
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate ruby function documentation using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for half million steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 5000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing ruby code.\n\n\nEvaluation results\n------------------\n\n\nFor the code documentation tasks, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
-0.09936226159334183,
0.05641235038638115,
-0.0012303452240303159,
0.11936061084270477,
0.047170497477054596,
0.024878187105059624,
0.046666838228702545,
0.11062204092741013,
-0.05556945875287056,
0.06041008606553078,
0.053351227194070816,
-0.04571954160928726,
0.06503932178020477,
0.16854943335056305,
0.00802221056073904,
-0.14220724999904633,
-0.04534033313393593,
0.02903488092124462,
-0.07521350681781769,
0.10946819931268692,
0.08290393650531769,
-0.08898895978927612,
0.0774410143494606,
-0.053098034113645554,
-0.13558784127235413,
0.044824909418821335,
-0.025680378079414368,
-0.004750253167003393,
0.09829708188772202,
0.06543570756912231,
0.12051327526569366,
-0.0183763075619936,
0.06649023294448853,
-0.21021594107151031,
0.0026993160136044025,
0.0284065380692482,
0.06611230224370956,
0.04621221497654915,
0.07146686315536499,
0.08100123703479767,
0.09845238924026489,
-0.028735361993312836,
0.042877428233623505,
0.05653927102684975,
-0.06351069360971451,
-0.05565549060702324,
-0.09166644513607025,
0.09804876148700714,
0.06895830482244492,
0.09015268832445145,
-0.004725164268165827,
0.05249916762113571,
-0.06361610442399979,
0.09371735155582428,
0.14171835780143738,
-0.25222891569137573,
-0.02504236437380314,
0.11889320611953735,
0.0763305127620697,
0.054205913096666336,
-0.07046877592802048,
-0.033204518258571625,
0.10580172389745712,
0.04162762686610222,
0.044316958636045456,
-0.08965497463941574,
-0.01926514506340027,
-0.0034559250343590975,
-0.060395874083042145,
-0.0492364726960659,
0.15850576758384705,
0.03380798175930977,
-0.06230833753943443,
-0.10194295644760132,
-0.03881020471453667,
-0.21005946397781372,
0.03360280394554138,
0.009557933546602726,
0.00960632599890232,
-0.007089819759130478,
-0.005362380761653185,
-0.00711815943941474,
-0.08148020505905151,
-0.1223042756319046,
0.038759052753448486,
0.024361595511436462,
0.06691979616880417,
0.03900795057415962,
-0.04359001666307449,
0.08604471385478973,
0.04828552156686783,
-0.03499385714530945,
-0.012491796165704727,
0.00019605027046054602,
-0.10999728739261627,
0.011917022988200188,
-0.013044701889157295,
-0.060993123799562454,
-0.004089429974555969,
0.0844244435429573,
-0.08806219696998596,
0.08650526404380798,
0.08610757440328598,
0.025031138211488724,
0.007341524586081505,
0.2089005410671234,
0.05837723985314369,
-0.1547912061214447,
0.025628743693232536,
0.0178474560379982,
-0.011109299026429653,
0.01568843238055706,
-0.04828409478068352,
-0.054539378732442856,
0.042686887085437775,
0.06678321957588196,
-0.12113670259714127,
0.024708136916160583,
-0.056972015649080276,
-0.014452122151851654,
0.08092552423477173,
-0.11486660689115524,
0.034760184586048126,
0.012341764755547047,
-0.061733677983284,
-0.03710499405860901,
0.06705182790756226,
-0.12220194190740585,
-0.11438677459955215,
0.036946021020412445,
-0.03168391436338425,
-0.0347253754734993,
-0.12048088759183884,
-0.114047110080719,
-0.021802974864840508,
-0.03202614188194275,
0.0029047203715890646,
-0.10507475584745407,
-0.10171059519052505,
-0.023503603413701057,
0.03513488546013832,
-0.008596819825470448,
-0.03366803005337715,
-0.03878689184784889,
0.010074751451611519,
-0.009152999147772789,
-0.01308218389749527,
0.01892983727157116,
-0.019698889926075935,
0.09528177976608276,
0.08165764063596725,
0.03477208688855171,
-0.001698188134469092,
0.021610695868730545,
-0.08167248219251633,
0.06985446065664291,
-0.10159610211849213,
0.07346245646476746,
-0.00761814508587122,
0.05797778442502022,
-0.11263089627027512,
-0.0805211216211319,
0.005610621068626642,
0.04644623398780823,
0.0793319046497345,
0.04529058188199997,
-0.15865223109722137,
0.03512437269091606,
0.15840359032154083,
-0.1131955161690712,
-0.1302087903022766,
0.11285987496376038,
-0.01659328117966652,
0.01706082746386528,
0.06892293691635132,
0.1261148750782013,
0.14247672259807587,
-0.06853767484426498,
-0.042340345680713654,
0.06678453087806702,
0.045951396226882935,
-0.06514965742826462,
0.06668440252542496,
0.0279526486992836,
-0.03649776428937912,
0.02114962972700596,
0.06270574778318405,
0.03970666229724884,
-0.006544079631567001,
-0.03433234617114067,
-0.0398654080927372,
-0.10009093582630157,
-0.03213623911142349,
-0.010193102061748505,
0.03333338722586632,
-0.051892489194869995,
-0.053710728883743286,
-0.04095127433538437,
0.16731292009353638,
-0.08437727391719818,
0.029512479901313782,
-0.08498706668615341,
-0.03797515481710434,
-0.05256868526339531,
0.01844291388988495,
-0.13729162514209747,
0.06577662378549576,
0.07225583493709564,
-0.003493786556646228,
0.06365139782428741,
0.08197034895420074,
0.009896575473248959,
0.013492058962583542,
-0.0633823350071907,
-0.049256831407547,
-0.03324412927031517,
-0.08105295896530151,
-0.11283041536808014,
-0.03786129876971245,
-0.08849934488534927,
-0.030981779098510742,
-0.04695663973689079,
-0.18298451602458954,
-0.0011558285914361477,
-0.00223827944137156,
0.03310509771108627,
0.04108845070004463,
-0.03667394071817398,
0.02046641707420349,
0.05348319932818413,
-0.04327063634991646,
-0.07565683871507645,
0.02062235400080681,
0.046286098659038544,
-0.08124437183141708,
-0.04995938017964363,
-0.062015168368816376,
-0.08521202951669693,
0.07152944058179855,
0.10110602527856827,
-0.1330358237028122,
-0.024488838389515877,
-0.022534657269716263,
-0.03447730094194412,
-0.04205573722720146,
-0.03983701393008232,
0.20293095707893372,
0.017955714836716652,
0.15941840410232544,
-0.13371075689792633,
-0.055998794734478,
-0.027713323011994362,
0.005750682670623064,
0.04177074134349823,
0.15259015560150146,
0.014493304304778576,
-0.13105608522891998,
0.028761757537722588,
-0.06448118388652802,
-0.0542624294757843,
0.1373520791530609,
-0.020816536620259285,
-0.048798978328704834,
-0.0020677740685641766,
0.10859637707471848,
0.010195179842412472,
0.17482790350914001,
-0.020373322069644928,
0.0023225515615195036,
-0.0080244280397892,
0.004003453534096479,
0.03778442367911339,
-0.13001346588134766,
0.03215113282203674,
0.038801733404397964,
-0.04771902784705162,
-0.022997500374913216,
-0.028570547699928284,
-0.04052296280860901,
0.04208940640091896,
0.01668996922671795,
0.03983866050839424,
-0.020832741633057594,
-0.031248481944203377,
-0.10385989397764206,
0.17565588653087616,
-0.07307795435190201,
-0.18664826452732086,
-0.1558961719274521,
0.08999177068471909,
-0.01808512769639492,
-0.0231351125985384,
0.029035314917564392,
-0.10057391971349716,
-0.04247410595417023,
-0.09675228595733643,
0.11942175775766373,
-0.1278616040945053,
0.009094360284507275,
-0.022686876356601715,
0.06938688457012177,
0.04462626203894615,
-0.15652036666870117,
0.03068036213517189,
-0.021020837128162384,
0.020617665722966194,
0.00020593326189555228,
-0.0600280836224556,
0.07059620320796967,
0.10406826436519623,
-0.07300078123807907,
0.021510353311896324,
-0.014745093882083893,
0.18066193163394928,
-0.05204755440354347,
0.03795000538229942,
0.17753303050994873,
0.008966061286628246,
0.035116974264383316,
0.06207253411412239,
0.012495240196585655,
-0.08846742659807205,
0.06660827249288559,
0.033917978405952454,
-0.024139748886227608,
-0.2297814041376114,
-0.022302750498056412,
-0.0763893723487854,
0.07452549785375595,
0.11996205151081085,
0.0498974546790123,
-0.16245071589946747,
0.027295809239149094,
-0.0034677735529839993,
0.16597680747509003,
-0.03171613812446594,
0.06559979170560837,
-0.025627803057432175,
0.02312466688454151,
0.0037238472141325474,
-0.10490675270557404,
0.0028490640688687563,
0.07514496147632599,
0.10348735749721527,
0.21396061778068542,
-0.08895646780729294,
0.13878731429576874,
0.0017366142710670829,
0.1189340129494667,
0.059491608291864395,
0.10586266964673996,
-0.13347694277763367,
0.010283676907420158,
-0.0028917714953422546,
-0.012346711941063404,
-0.08163201063871384,
0.04608119651675224,
-0.03558846935629845,
0.06252233684062958,
-0.05387919768691063,
0.022102907299995422,
0.01625409536063671,
0.20570391416549683,
0.06790406256914139,
-0.15663935244083405,
-0.11689336597919464,
0.0012753086630254984,
-0.07882261276245117,
-0.09512222558259964,
0.0678698942065239,
0.18125340342521667,
-0.06384043395519257,
0.02127978764474392,
-0.025537529960274696,
0.13394439220428467,
-0.11693892627954483,
-0.02306411974132061,
0.021554013714194298,
0.05698710307478905,
0.0022382892202585936,
0.10683518648147583,
-0.2719099819660187,
0.08358103781938553,
0.01715879514813423,
0.08935663849115372,
-0.015406576916575432,
0.06040972098708153,
-0.04719291999936104,
0.003932262305170298,
0.07943087071180344,
0.010709601454436779,
-0.08176105469465256,
-0.19257201254367828,
-0.028964033350348473,
0.01713506691157818,
0.07124148309230804,
0.0006811736384406686,
0.09203432500362396,
-0.030980991199612617,
0.04816007241606712,
-0.03302575275301933,
-0.12675940990447998,
-0.0782308429479599,
-0.13665562868118286,
-0.040685977786779404,
-0.025786759331822395,
-0.06260242313146591,
-0.0291550625115633,
0.05221250280737877,
0.049210187047719955,
0.1997506469488144,
-0.16390080749988556,
-0.06331625580787659,
-0.08831710368394852,
0.06743840128183365,
0.1196301132440567,
-0.09270040690898895,
0.020021837204694748,
0.016477182507514954,
0.05408778414130211,
-0.04709239304065704,
-0.06932312250137329,
0.01985304430127144,
-0.05909200757741928,
-0.09192320704460144,
-0.0364355742931366,
0.09075719118118286,
-0.01902111805975437,
0.05346531793475151,
0.00813612062484026,
-0.07138296216726303,
-0.03767133504152298,
-0.1171361654996872,
-0.06357544660568237,
-0.0414666123688221,
0.020481325685977936,
0.0021490994840860367,
-0.11774635314941406,
0.05616786703467369,
-0.022966476157307625,
-0.09292061626911163,
0.09132636338472366,
0.18377752602100372,
-0.08299601078033447,
0.018540576100349426,
0.06041410565376282,
-0.055775970220565796,
-0.18852324783802032,
-0.0426984578371048,
0.053510475903749466,
0.06962116807699203,
-0.013897119089961052,
-0.15156559646129608,
0.047231417149305344,
-0.025711631402373314,
0.0162627175450325,
-0.014416852965950966,
-0.24703972041606903,
-0.11833139508962631,
-0.003083133604377508,
0.072543203830719,
0.048267245292663574,
-0.09263744205236435,
-0.04960327968001366,
-0.06671229004859924,
-0.025773942470550537,
0.043276309967041016,
0.08010008186101913,
0.10719089955091476,
-0.03955037519335747,
0.017834732308983803,
0.04685383290052414,
-0.026948831975460052,
0.04291065037250519,
-0.03957942873239517,
0.11354217678308487,
-0.00643755029886961,
-0.016034677624702454,
0.03595571592450142,
-0.055721987038850784,
0.15669134259223938,
-0.1863052397966385,
0.11935614049434662,
-0.17758919298648834,
-0.0378175713121891,
-0.010837112553417683,
-0.017759235575795174,
-0.03626527264714241,
-0.04723555967211723,
-0.12368188053369522,
0.045498497784137726,
0.05690281465649605,
-0.02848738431930542,
0.03342922776937485,
-0.002117230324074626,
-0.06113114580512047,
0.08930689096450806,
0.07233580946922302,
0.04885700345039368,
-0.12478531152009964,
0.02802908420562744,
0.018010420724749565,
0.0885966569185257,
-0.18434281647205353,
0.022756999358534813,
0.10390997678041458,
0.022341223433613777,
0.09686607122421265,
0.011626003310084343,
-0.08830700069665909,
0.02441599778831005,
0.06746163219213486,
-0.05993802845478058,
-0.10172602534294128,
-0.013331201858818531,
0.010588474571704865,
-0.08734387159347534,
0.03885508328676224,
0.08943109959363937,
-0.06663717329502106,
-0.01936577819287777,
-0.0059590269811451435,
0.014049417339265347,
-0.07823298871517181,
0.1777644157409668,
0.019122019410133362,
0.08333364874124527,
-0.056509099900722504,
0.08161812275648117,
0.09552659839391708,
-0.07287599891424179,
0.028285833075642586,
0.14568597078323364,
-0.0837244838476181,
-0.021935945376753807,
0.11557365953922272,
0.14487217366695404,
-0.012831238098442554,
-0.04636311158537865,
-0.10506986081600189,
-0.07508993148803711,
0.0129086347296834,
0.022258246317505836,
0.07233264297246933,
0.07275372743606567,
-0.03286485746502876,
-0.006441221572458744,
-0.11803890764713287,
0.09508288651704788,
0.0836087241768837,
0.05239211022853851,
-0.14908190071582794,
0.13371844589710236,
0.03074834495782852,
0.0689023956656456,
0.00312375882640481,
0.041877631098032,
-0.11325492709875107,
0.03515153378248215,
-0.003197177778929472,
0.049596384167671204,
0.02419508807361126,
0.04988199099898338,
-0.037527669221162796,
0.05021083354949951,
-0.02751854807138443,
0.043727047741413116,
-0.043387118726968765,
-0.02420561946928501,
-0.034821026027202606,
0.02999294176697731,
-0.047078635543584824,
-0.020952560007572174,
0.014953727833926678,
-0.08192052692174911,
0.09118626266717911,
-0.06434939056634903,
-0.012249735184013844,
0.0011282479390501976,
0.03447386622428894,
0.05395421385765076,
0.00168329244479537,
0.05344266816973686,
-0.018457463011145592,
0.002967460546642542,
0.02421703189611435,
0.00598698016256094,
-0.0166595671325922,
-0.0010167709551751614,
0.10732374340295792,
-0.1369914710521698,
-0.07807335257530212,
-0.10076719522476196,
-0.06475132703781128,
-0.06131112575531006,
0.08752622455358505,
0.0759330466389656,
0.09083130955696106,
0.09619265794754028,
-0.0410291850566864,
0.013427878729999065,
-0.15081290900707245,
-0.03549240902066231,
0.05532175675034523,
-0.015468357130885124,
-0.13446027040481567,
-0.049784671515226364,
0.06311620026826859,
-0.026605794206261635,
0.11284057796001434,
0.002403859281912446,
0.012427386827766895,
-0.017919141799211502,
-0.048548780381679535,
-0.07214102149009705,
0.00399074936285615,
0.1761600524187088,
-0.10379743576049805,
0.0036703646183013916,
-0.007530563045293093,
0.005722720175981522,
0.01770690642297268,
0.1868162602186203,
0.12130331993103027,
0.16617421805858612,
0.03206793963909149,
0.06623253971338272,
-0.04724007472395897,
-0.027879957109689713,
-0.09565605968236923,
0.07266177982091904,
-0.027547914534807205,
0.030490899458527565,
-0.04979807510972023,
0.18826065957546234,
0.08310974389314651,
-0.1314050555229187,
0.1067531481385231,
-0.0008435228955931962,
-0.0873878225684166,
-0.038988709449768066,
-0.07257742434740067,
-0.04952123016119003,
-0.11525988578796387,
0.006195286754518747,
-0.0998905599117279,
-0.0030590782407671213,
0.0590045265853405,
0.02233981527388096,
-0.02265014871954918,
0.12733852863311768,
-0.04438666254281998,
-0.04786524176597595,
0.04516816511750221,
0.03784382343292236,
0.005978990811854601,
0.09373003244400024,
0.02029610425233841,
0.058362096548080444,
-0.07728967815637589,
0.07588838040828705,
0.03007001057267189,
-0.016352079808712006,
0.013364131562411785,
0.0449262373149395,
-0.015552341938018799,
-0.03850822150707245,
-0.020845795050263405,
0.08185960352420807,
0.17126484215259552,
0.035474829375743866,
-0.030722081661224365,
-0.05844901129603386,
0.20615284144878387,
-0.058502957224845886,
-0.05493541061878204,
-0.11461261659860611,
0.1560683250427246,
0.04200168699026108,
0.017486318945884705,
0.02317034639418125,
-0.08100900053977966,
-0.016602128744125366,
0.24160124361515045,
0.06596499681472778,
-0.045530080795288086,
-0.025459738448262215,
0.006653675809502602,
-0.004094691481441259,
-0.04367629066109657,
0.1464923918247223,
0.00648842565715313,
0.2003660798072815,
0.0028091799467802048,
0.005227000918239355,
-0.04291681945323944,
-0.04882633313536644,
0.00578042957931757,
0.19458559155464172,
-0.024568790569901466,
0.024675482884049416,
-0.10121024399995804,
-0.005930433515459299,
0.0034088159445673227,
-0.16540324687957764,
0.12309350818395615,
-0.14082071185112,
-0.0747184157371521,
0.013729900121688843,
0.06995774060487747,
-0.04402446001768112,
0.0582454577088356,
-0.01973172090947628,
0.07194805890321732,
0.025990759953856468,
-0.023797396570444107,
-0.09295473247766495,
-0.14465630054473877,
0.05145733430981636,
-0.0243088211864233,
0.12112953513860703,
0.011388091370463371,
0.09177181869745255,
0.08972779661417007,
0.00518985278904438,
-0.08559349924325943,
0.06722203642129898,
0.023477422073483467,
-0.006110933609306812,
0.04840518534183502,
0.13507120311260223,
-0.04501114785671234,
0.1566072404384613,
0.012774970382452011,
-0.022751890122890472,
-0.02105412445962429,
-0.011013932526111603,
-0.004304706584662199,
-0.16239199042320251,
0.015396510250866413,
-0.06309995800256729,
0.13575811684131622,
0.19672317802906036,
-0.04549727961421013,
-0.004492092877626419,
-0.04843619465827942,
0.07581037282943726,
-0.00637218588963151,
0.08771383762359619,
0.00353122316300869,
-0.16538377106189728,
0.007671186700463295,
-0.006661264691501856,
0.010826697573065758,
-0.19548656046390533,
-0.05436699092388153,
-0.0429023802280426,
-0.03166569396853447,
-0.10403791069984436,
0.1494446098804474,
0.08411125838756561,
0.025907019153237343,
-0.035766761749982834,
-0.1695692092180252,
-0.024401184171438217,
0.03856028616428375,
-0.11310208588838577,
-0.11476600915193558
] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commits: it works best with tokenized git commits.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used single-task training on Git Commit Message Generation dataset.
## Intended uses & limitations
The model could be used to generate a commit message for git commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate git commit message using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_commit_generation"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_commit_generation", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
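The pipeline returns one dictionary per input, with the generated commit message stored under the `summary_text` key (the standard Transformers pipeline output format), so the message can be read out like this:
```python
# Each input yields a dict; the generated commit message is under "summary_text".
result = pipeline([tokenized_code])
print(result[0]["summary_text"])
```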
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/single%20task/commit%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_small_commit_generation
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized git commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used single-task training on the Git Commit Message Generation dataset.
Intended uses & limitations
---------------------------
The model could be used to generate a git commit message for the commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Evaluation results
------------------
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
50,
114
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #has_space #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ … 768-dimensional embedding vector omitted … ] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized git commit changes.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
## Intended uses & limitations
The model could be used to generate a git commit message for the commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_multitask"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_multitask", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
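Note that the example input above is already whitespace-tokenized (spaces around `.` and `/`). If you start from a raw diff, a rough pre-tokenization along these lines may help; this is only an illustrative sketch, not the exact tokenizer used to prepare the training data:
```python
import re

def rough_tokenize(diff_text: str) -> str:
    # Put spaces around punctuation so the input resembles the tokenized
    # commit changes the model was trained on (approximation only).
    spaced = re.sub(r"([^\w\s])", r" \1 ", diff_text)
    return re.sub(r"\s+", " ", spaced).strip()

print(rough_tokenize("Binary files /dev/null and b/src/plugins/gateway/lib/joscar.jar differ"))
# Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ
```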
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/pre-training/commit%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
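For reference, the inverse square root schedule keeps the learning rate constant during warmup and then decays it proportionally to 1/sqrt(step). A minimal sketch, assuming the usual T5 warmup of 10,000 steps (the warmup length is not stated in this card):
```python
def inverse_sqrt_lr(step: int, warmup_steps: int = 10_000) -> float:
    # Constant at 1/sqrt(warmup_steps) while step <= warmup_steps,
    # then decaying as 1/sqrt(step).
    return 1.0 / max(step, warmup_steps) ** 0.5
```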
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_small_commit_generation_multitask
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized git commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets.
Intended uses & limitations
---------------------------
The model could be used to generate a git commit message for the commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
Evaluation results
------------------
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
145
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 360,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ … 768-dimensional embedding vector omitted … ] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized git commit changes.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the git commit message generation task for Java commit changes.
## Intended uses & limitations
The model could be used to generate a git commit message for the commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_multitask_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_multitask_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/multitask/fine-tuning/commit%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 8,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing commit changes.
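To repeat this fine-tuning step on your own commit data, a minimal sketch with the Hugging Face `Seq2SeqTrainer` could look like the following. The dataset and its `diff`/`message` fields are hypothetical placeholders, and the card's actual TPU Pod V2-8 setup is not replicated here:
```python
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

model_name = "SEBIS/code_trans_t5_small_commit_generation_multitask"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

def preprocess(batch):
    # Truncate commit changes to the 512-token context used during training.
    inputs = tokenizer(batch["diff"], max_length=512, truncation=True)
    inputs["labels"] = tokenizer(batch["message"], max_length=64,
                                 truncation=True)["input_ids"]
    return inputs

args = Seq2SeqTrainingArguments(output_dir="commit-gen-finetune",
                                per_device_train_batch_size=8,
                                max_steps=8_000)
# train_set = raw_dataset.map(preprocess, batched=True)   # hypothetical dataset
# trainer = Seq2SeqTrainer(model=model, args=args, train_dataset=train_set,
#                          data_collator=DataCollatorForSeq2Seq(tokenizer, model=model))
# trainer.train()
```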
## Evaluation results
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_small_commit_generation_multitask_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized git commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used multi-task training on 13 supervised tasks in the software development domain and 7 unsupervised datasets. It was then fine-tuned on the git commit message generation task for Java commit changes.
Intended uses & limitations
---------------------------
The model could be used to generate a git commit message for the commit changes or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes. However, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using the Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Multi-task Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 8,000 steps in total, using sequence length 512 (batch size 256) and only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, the different models achieve the following results (in BLEU score):
Test results :
>
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
>
>
>
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
88,
111
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Multi-task Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 8,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ 768-dimensional embedding vector omitted ] |
null | null |
transformers
|
# CodeTrans model for git commit message generation
Pretrained model on git commits using the t5 small model architecture. It was first released in
[this repository](https://github.com/agemagician/CodeTrans). This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
## Model description
This CodeTrans model is based on the `t5-small` model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the git commit message generation task for the Java commit changes.
## Intended uses & limitations
The model could be used to generate a commit message for a set of git commit changes, or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes; however, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead, SummarizationPipeline
pipeline = SummarizationPipeline(
model=AutoModelWithLMHead.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_transfer_learning_finetune"),
tokenizer=AutoTokenizer.from_pretrained("SEBIS/code_trans_t5_small_commit_generation_transfer_learning_finetune", skip_special_tokens=True),
device=0
)
tokenized_code = "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"
pipeline([tokenized_code])
```
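Since tokenized changes work best, it can help to roughly pre-tokenize a raw diff before passing it to the pipeline. The sketch below is a minimal, hypothetical normalization (the exact tokenizer used to build the training data is not documented here), reusing the `pipeline` object created above:
```python
import re

def rough_tokenize(diff_text):
    # Hypothetical pre-processing: put spaces around punctuation and
    # collapse whitespace so the input resembles the tokenized examples.
    spaced = re.sub(r"([^\w\s])", r" \1 ", diff_text)
    return " ".join(spaced.split())

raw_change = "new file mode 100644\nindex 000000000..892fda21b"
pipeline([rough_tokenize(raw_change)])
```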
Run this example in [colab notebook](https://github.com/agemagician/CodeTrans/blob/main/prediction/transfer%20learning%20fine-tuning/commit%20generation/small_model.ipynb).
## Training data
The supervised training tasks datasets can be downloaded on [Link](https://www.dropbox.com/sh/488bq2of10r4wvw/AACs5CGIQuwtsD7j_Ls_JAORa/finetuning_dataset?dl=0&subfolder_nav_tracking=1)
## Training procedure
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
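For readers who want to mirror this setup, the sketch below shows one way to configure the `Adafactor` implementation shipped with `transformers`; with `relative_step=True` it derives an inverse square root learning rate schedule internally, matching the description above. Everything beyond the stated optimizer and schedule is an assumption.
```python
from transformers import T5ForConditionalGeneration
from transformers.optimization import Adafactor

model = T5ForConditionalGeneration.from_pretrained("t5-small")

# lr=None + relative_step=True: Adafactor computes an inverse square root
# learning rate from the current step count instead of using a fixed value.
optimizer = Adafactor(
    model.parameters(),
    lr=None,
    scale_parameter=True,
    relative_step=True,
    warmup_init=True,  # assumption: warm up from a small initial rate
)
```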
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
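A single fine-tuning step on one (change, message) pair might look like the self-contained sketch below. This is illustrative only: the TPU setup, batching, and the actual dataset loading are not reproduced, and the example pair is made up.
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration
from transformers.optimization import Adafactor

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")
optimizer = Adafactor(model.parameters(), lr=None, relative_step=True, warmup_init=True)

# Hypothetical (commit change, commit message) training pair.
change = "new file mode 100644 index 000000000 . . 892fda21b Binary files differ"
message = "add binary jar file"

inputs = tokenizer(change, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(message, return_tensors="pt", truncation=True, max_length=512).input_ids

loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
optimizer.step()
optimizer.zero_grad()
```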
## Evaluation results
For the git commit message generation task, different models achieve the following results (in BLEU score):
Test results :
| Language / Model | Java |
| -------------------- | :------------: |
| CodeTrans-ST-Small | 39.61 |
| CodeTrans-ST-Base | 38.67 |
| CodeTrans-TF-Small | 44.22 |
| CodeTrans-TF-Base | 44.17 |
| CodeTrans-TF-Large | **44.41** |
| CodeTrans-MT-Small | 36.17 |
| CodeTrans-MT-Base | 39.25 |
| CodeTrans-MT-Large | 41.18 |
| CodeTrans-MT-TF-Small | 43.96 |
| CodeTrans-MT-TF-Base | 44.19 |
| CodeTrans-MT-TF-Large | 44.34 |
| State of the art | 32.81 |
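To score a model's predictions against reference messages in the same way, a standard BLEU toolkit such as `sacrebleu` can be used. A minimal sketch with made-up strings follows; the exact BLEU configuration behind the numbers above is not stated here, so treat this as an approximation.
```python
import sacrebleu

# Hypothetical model outputs and reference commit messages.
predictions = ["add joscar jar to gateway plugin"]
references = [["added joscar library to the gateway plugin"]]

score = sacrebleu.corpus_bleu(predictions, references).score
print(f"BLEU: {score:.2f}")
```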
> Created by [Ahmed Elnaggar](https://twitter.com/Elnaggar_AI) | [LinkedIn](https://www.linkedin.com/in/prof-ahmed-elnaggar/) and Wei Ding | [LinkedIn](https://www.linkedin.com/in/wei-ding-92561270/)
|
{"tags": ["summarization"], "widget": [{"text": "new file mode 100644 index 000000000 . . 892fda21b Binary files / dev / null and b / src / plugins / gateway / lib / joscar . jar differ"}]}
|
summarization
|
SEBIS/code_trans_t5_small_commit_generation_transfer_learning_finetune
|
[
"transformers",
"pytorch",
"jax",
"t5",
"feature-extraction",
"summarization",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:04+00:00
|
[] |
[] |
TAGS
#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us
|
CodeTrans model for git commit message generation
=================================================
Pretrained model on git commits using the t5 small model architecture. It was first released in
this repository. This model is trained on tokenized git commit changes: it works best with tokenized commit changes.
Model description
-----------------
This CodeTrans model is based on the 't5-small' model. It has its own SentencePiece vocabulary model. It used transfer-learning pre-training on 7 unsupervised datasets in the software development domain. It was then fine-tuned on the git commit message generation task for the Java commit changes.
Intended uses & limitations
---------------------------
The model could be used to generate a commit message for a set of git commit changes, or be fine-tuned on other relevant tasks. It can be used on unparsed and untokenized commit changes; however, if the changes are tokenized, the performance should be better.
### How to use
Here is how to use this model to generate a git commit message using Transformers SummarizationPipeline:
Run this example in colab notebook.
Training data
-------------
The supervised training tasks datasets can be downloaded on Link
Training procedure
------------------
### Transfer-learning Pretraining
The model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).
It has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.
The optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.
### Fine-tuning
This model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset containing commit changes.
Evaluation results
------------------
For the git commit message generation task, different models achieve the following results (in BLEU score):
Test results :
> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn
|
[
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
"TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n",
"### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------",
"### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.",
"### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[
46,
61,
87,
110
] |
[
"passage: TAGS\n#transformers #pytorch #jax #t5 #feature-extraction #summarization #endpoints_compatible #text-generation-inference #region-us \n### How to use\n\n\nHere is how to use this model to generate git commit message using Transformers SummarizationPipeline:\n\n\nRun this example in colab notebook.\n\n\nTraining data\n-------------\n\n\nThe supervised training tasks datasets can be downloaded on Link\n\n\nTraining procedure\n------------------### Transfer-learning Pretraining\n\n\nThe model was trained on a single TPU Pod V3-8 for 500,000 steps in total, using sequence length 512 (batch size 4096).\nIt has a total of approximately 220M parameters and was trained using the encoder-decoder architecture.\nThe optimizer used is AdaFactor with inverse square root learning rate schedule for pre-training.### Fine-tuning\n\n\nThis model was then fine-tuned on a single TPU Pod V2-8 for 10,000 steps in total, using sequence length 512 (batch size 256), using only the dataset only containing commit changes.\n\n\nEvaluation results\n------------------\n\n\nFor the git commit message generation task, different models achieves the following results on different programming languages (in BLEU score):\n\n\nTest results :\n\n\n\n\n> \n> Created by Ahmed Elnaggar | LinkedIn and Wei Ding | LinkedIn\n> \n> \n>"
] |
[ 768-dimensional embedding vector omitted ] |