Dataset columns: sha (null), last_modified (null), library_name (string, 154 classes), text (string, 1 to 900k chars), metadata (string, 2 to 348k chars), pipeline_tag (string, 45 classes), id (string, 5 to 122 chars), tags (list, 1 to 1.84k items), created_at (string, 25 chars), arxiv (list, 0 to 201 items), languages (list, 0 to 1.83k items), tags_str (string, 17 to 9.34k chars), text_str (string, 0 to 389k chars), text_lists (list, 0 to 722 items), processed_texts (list, 1 to 723 items), tokens_length (list, 1 to 723 items), input_texts (list, 1 to 61 items), embeddings (list, 768 floats)
null | null |
transformers
|
# T5-Efficient-BASE-FF6000 (Deep-Narrow version)
T5-Efficient-BASE-FF6000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-base-ff6000** - is of model type **Base** with the following variations:
- **ff** is **6000**
It has **336.18** million parameters and thus requires *ca.* **1344.71 MB** of memory in full precision (*fp32*)
or **672.36 MB** of memory in half precision (*fp16* or *bf16*).
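The memory figures follow directly from the parameter count: roughly 4 bytes per parameter in fp32 and 2 bytes in fp16/bf16. A minimal sketch of the arithmetic (parameter count taken from above; activations, optimizer states, and framework overhead are not included):

```python
# Rough checkpoint-size estimate from the reported parameter count.
num_params = 336.18e6  # parameters of t5-efficient-base-ff6000
bytes_per_param = {"fp32": 4, "fp16/bf16": 2}

for dtype, nbytes in bytes_per_param.items():
    # The card reports MB as 10^6 bytes.
    print(f"{dtype}: ~{num_params * nbytes / 1e6:.2f} MB")
# fp32: ~1344.72 MB, fp16/bf16: ~672.36 MB (matches the figures above up to rounding)
```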
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
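If you want to verify these values for a given checkpoint, they can be read from the model config; a small sketch, assuming `transformers` is installed and the checkpoint can be downloaded:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-base-ff6000")

print(config.num_layers, config.num_decoder_layers)  # el / dl (here both 12)
print(config.d_ff)       # ff  -> 6000 for this variant
print(config.d_model)    # dm  -> 768 (Base)
print(config.d_kv)       # kv  -> 64 (Base)
print(config.num_heads)  # nh  -> 12 (Base)
```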
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
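In span-based MLM (T5's span-corruption objective), contiguous spans of the input are replaced by sentinel tokens and the decoder learns to reproduce the dropped spans. A minimal illustration of one such input/target pair (assuming `transformers`, `sentencepiece`, and `torch` are available; the sentence is a made-up example, not taken from C4):

```python
from transformers import T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/t5-efficient-base-ff6000")

# Dropped spans are marked with sentinel tokens (<extra_id_0>, <extra_id_1>, ...)
# in the input; the target lists the spans behind the same sentinels.
input_text = "The <extra_id_0> walks in <extra_id_1> park"
target_text = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
labels = tokenizer(target_text, return_tensors="pt").input_ids
# During pre-training, (input_ids, labels) pairs like this are fed to the model.
```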
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal loading sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
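Before plugging the checkpoint into one of the example scripts above, it can be loaded like any other T5 checkpoint. A minimal loading sketch (the summarization prompt and target below are placeholder examples, not a recommended training setup):

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "google/t5-efficient-base-ff6000"
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# One teacher-forced training step on a toy example; a real fine-tuning run
# would iterate over a dataset with an optimizer (see the scripts above).
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()
```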
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-ff6000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-FF6000 (Deep-Narrow version)
==============================================
T5-Efficient-BASE-FF6000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
---------------------------
This model checkpoint - t5-efficient-base-ff6000 - is of model type Base with the following variations:
* ff is 6000
It has 336.18 million parameters and thus requires *ca.* 1344.71 MB of memory in full precision (*fp32*)
or 672.36 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-FF9000 (Deep-Narrow version)
T5-Efficient-BASE-FF9000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-base-ff9000** - is of model type **Base** with the following variations:
- **ff** is **9000**
It has **449.42** million parameters and thus requires *ca.* **1797.7 MB** of memory in full precision (*fp32*)
or **898.85 MB** of memory in half precision (*fp16* or *bf16*).
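As with the ff6000 variant, these figures are just the parameter count times 4 (fp32) or 2 (fp16/bf16) bytes per parameter; a small sketch of the arithmetic:

```python
# Rough checkpoint-size estimate from the reported parameter count.
num_params = 449.42e6  # parameters of t5-efficient-base-ff9000
print(f"fp32: ~{num_params * 4 / 1e6:.2f} MB")       # ~1797.68 MB
print(f"fp16/bf16: ~{num_params * 2 / 1e6:.2f} MB")  # ~898.84 MB
# Matches the figures above up to rounding.
```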
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
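As before, the variation can be confirmed from the model config; a small sketch, assuming `transformers` is installed:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-base-ff9000")
print(config.d_ff)     # ff -> 9000 for this variant
print(config.d_model)  # dm -> 768 (unchanged from Base)
```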
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-ff9000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-FF9000 (Deep-Narrow version)
==============================================
T5-Efficient-BASE-FF9000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
---------------------------
This model checkpoint - t5-efficient-base-ff9000 - is of model type Base with the following variations:
* ff is 9000
It has 449.42 million parameters and thus requires *ca.* 1797.7 MB of memory in full precision (*fp32*)
or 898.85 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-KV128 (Deep-Narrow version)
T5-Efficient-BASE-KV128 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-base-kv128** - is of model type **Base** with the following variations:
- **kv** is **128**
It has **307.87** million parameters and thus requires *ca.* **1231.47 MB** of memory in full precision (*fp32*)
or **615.73 MB** of memory in half precision (*fp16* or *bf16*).
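For reference, these figures follow directly from the parameter count, assuming the usual 4 bytes per parameter in *fp32* and 2 bytes per parameter in *fp16*/*bf16* (this back-of-the-envelope derivation is ours, not part of the original release):

$$307.87 \times 10^{6}\ \text{params} \times 4\ \tfrac{\text{bytes}}{\text{param}} \approx 1231.47\ \text{MB}, \qquad 307.87 \times 10^{6} \times 2\ \tfrac{\text{bytes}}{\text{param}} \approx 615.73\ \text{MB}$$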
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is also given after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
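Independently of the framework, the first step in any of these examples is to instantiate the pretrained checkpoint. The following is a minimal PyTorch sketch (not part of the original card) of loading **t5-efficient-base-kv128** and computing a fine-tuning loss on a toy summarization pair; it assumes the checkpoint ships with the standard T5 tokenizer files and that `transformers` and `torch` are installed:

```python
# Minimal sketch: load the pretrained checkpoint and run one training-style
# forward/backward pass. The toy input/target pair is illustrative only.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-kv128"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

# T5ForConditionalGeneration returns the cross-entropy loss when labels are given.
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels,
)
outputs.loss.backward()
print(float(outputs.loss))
```

Remember that the checkpoint is pretrained-only: without fine-tuning on a downstream task, generated outputs will not be meaningful.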
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend the reader to go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-kv128
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-KV128 (Deep-Narrow version)
=============================================
T5-Efficient-BASE-KV128 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base-kv128 - is of model type Base with the following variations:
* kv is 128
It has 307.87 million parameters and thus requires *ca.* 1231.47 MB of memory in full precision (*fp32*)
or 615.73 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend the reader to go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-KV16 (Deep-Narrow version)
T5-Efficient-BASE-KV16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-base-kv16** - is of model type **Base** with the following variations:
- **kv** is **16**
It has **159.23** million parameters and thus requires *ca.* **636.92 MB** of memory in full precision (*fp32*)
or **318.46 MB** of memory in half precision (*fp16* or *bf16*).
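As for the other checkpoints in this series, these figures follow directly from the parameter count, assuming 4 bytes per parameter in *fp32* and 2 bytes per parameter in *fp16*/*bf16* (a derivation of ours, not part of the original release):

$$159.23 \times 10^{6}\ \text{params} \times 4\ \tfrac{\text{bytes}}{\text{param}} \approx 636.92\ \text{MB}, \qquad 159.23 \times 10^{6} \times 2\ \tfrac{\text{bytes}}{\text{param}} \approx 318.46\ \text{MB}$$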
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is also given after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
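Before running any of these examples, the checkpoint has to be instantiated. A minimal PyTorch sketch (not part of the original card) for loading **t5-efficient-base-kv16** prior to fine-tuning, assuming the standard T5 tokenizer files are included with the checkpoint:

```python
# Minimal sketch: instantiate the pretrained-only checkpoint for fine-tuning.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-kv16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Sanity check against the parameter count stated above (roughly 159M).
print(f"{model.num_parameters() / 1e6:.2f}M parameters")
```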
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend the reader to go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-kv16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-KV16 (Deep-Narrow version)
============================================
T5-Efficient-BASE-KV16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base-kv16 - is of model type Base with the following variations:
* kv is 16
It has 159.23 million parameters and thus requires *ca.* 636.92 MB of memory in full precision (*fp32*)
or 318.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend the reader to go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-KV256 (Deep-Narrow version)
T5-Efficient-BASE-KV256 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-base-kv256** - is of model type **Base** with the following variations:
- **kv** is **256**
It has **477.74** million parameters and thus requires *ca.* **1910.94 MB** of memory in full precision (*fp32*)
or **955.47 MB** of memory in half precision (*fp16* or *bf16*).
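For a quick back-of-the-envelope check of these figures, the parameter count can be converted to a memory estimate directly (a minimal sketch, assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and 1 MB = 10⁶ bytes; small deviations come from rounding the parameter count):

```python
# Back-of-the-envelope memory estimate from the parameter count quoted above.
# Assumptions: 4 bytes/param (fp32), 2 bytes/param (fp16/bf16), 1 MB = 1e6 bytes.
num_params = 477.74e6  # t5-efficient-base-kv256 (rounded)

fp32_mb = num_params * 4 / 1e6
fp16_mb = num_params * 2 / 1e6

print(f"fp32: ~{fp32_mb:.2f} MB")        # ~1910.96 MB (vs. ~1910.94 MB above)
print(f"fp16/bf16: ~{fp16_mb:.2f} MB")   # ~955.48 MB
```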
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
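As a schematic illustration of the span-corruption objective (a toy sketch of the input/target format only, not the actual C4 preprocessing pipeline), contiguous spans of the input are replaced by sentinel tokens and the model is trained to reconstruct the dropped spans:

```python
# Schematic example of T5-style span corruption (masked spans replaced by sentinels).
# This only illustrates the input/target format; pretraining used C4 with the
# original T5 preprocessing, not this toy snippet.
original = "Thank you for inviting me to your party last week ."

# Suppose the spans "for inviting" and "last" are sampled for masking:
model_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

print(model_input)
print(target)
```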
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading and training sketch is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
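Independent of these example scripts, a minimal sketch of loading the checkpoint and running a single supervised training step with PyTorch could look as follows (the text/summary pair is a hypothetical placeholder; in practice you would iterate over a real dataset):

```python
# Minimal fine-tuning sketch (PyTorch). The example text/summary pair is a placeholder.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-base-kv256"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

inputs = tokenizer(
    "summarize: The quick brown fox jumped over the lazy dog near the river bank.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumped over a dog.", return_tensors="pt").input_ids

model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

outputs = model(**inputs, labels=labels)  # seq2seq cross-entropy loss
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```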
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-kv256
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-KV256 (Deep-Narrow version)
=============================================
T5-Efficient-BASE-KV256 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base-kv256 - is of model type Base with the following variations:
* kv is 256
It has 477.74 million parameters and thus requires *ca.* 1910.94 MB of memory in full precision (*fp32*)
or 955.47 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder- and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-KV32 (Deep-Narrow version)
T5-Efficient-BASE-KV32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-base-kv32** - is of model type **Base** with the following variations:
- **kv** is **32**
It has **180.46** million parameters and thus requires *ca.* **721.86 MB** of memory in full precision (*fp32*)
or **360.93 MB** of memory in half precision (*fp16* or *bf16*).
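If you prefer to verify the parameter count and memory footprint from the checkpoint itself, a small sketch along these lines should work (assuming 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16*; exact numbers may differ slightly from the rounded values quoted above):

```python
# Count parameters of the loaded checkpoint and derive a rough memory footprint.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-kv32")
num_params = sum(p.numel() for p in model.parameters())

print(f"parameters: {num_params / 1e6:.2f} M")                 # ~180.46 M
print(f"fp32 footprint: ~{num_params * 4 / 1e6:.0f} MB")       # ~722 MB
print(f"fp16/bf16 footprint: ~{num_params * 2 / 1e6:.0f} MB")  # ~361 MB
```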
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short inference sketch for a fine-tuned checkpoint follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
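Once the checkpoint has been fine-tuned with one of the scripts above, inference could look roughly like this (the model path is a hypothetical placeholder; the pretrained-only checkpoint itself will not produce useful summaries out of the box):

```python
# Inference sketch for a *fine-tuned* version of this checkpoint.
# "path/to/your-finetuned-model" is a hypothetical local path produced by
# one of the example fine-tuning scripts linked above.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_dir = "path/to/your-finetuned-model"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = T5ForConditionalGeneration.from_pretrained(model_dir)

inputs = tokenizer("summarize: <your document here>", return_tensors="pt")
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```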
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-kv32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-KV32 (Deep-Narrow version)
============================================
T5-Efficient-BASE-KV32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base-kv32 - is of model type Base with the following variations:
* kv is 32
It has 180.46 million parameters and thus requires *ca.* 721.86 MB of memory in full precision (*fp32*)
or 360.93 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder- and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NH16 (Deep-Narrow version)
T5-Efficient-BASE-NH16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nh16** - is of model type **Base** with the following variations:
- **nh** is **16**
It has **251.24** million parameters and thus requires *ca.* **1004.97 MB** of memory in full precision (*fp32*)
or **502.49 MB** of memory in half precision (*fp16* or *bf16*).
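The two memory figures above follow directly from the parameter count; a minimal back-of-the-envelope sketch in Python (weights only, ignoring activations, gradients and optimizer state, with the parameter count taken from this card):

```python
# Rough memory estimate for the checkpoint weights only.
num_params = 251.24e6  # parameters reported for t5-efficient-base-nh16

print(f"fp32: {num_params * 4 / 1e6:.0f} MB")       # 4 bytes/param -> ~1005 MB, matching the figure above
print(f"fp16/bf16: {num_params * 2 / 1e6:.0f} MB")   # 2 bytes/param -> ~502 MB
```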
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
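These abbreviations can be checked against the Hugging Face configuration of the checkpoint; a minimal sketch, assuming `transformers` is installed and using the model id `google/t5-efficient-base-nh16` given in this card's metadata:

```python
from transformers import AutoConfig

# Load only the configuration (no weights) to inspect the architecture.
config = AutoConfig.from_pretrained("google/t5-efficient-base-nh16")

print(config.num_layers)          # el: encoder depth
print(config.num_decoder_layers)  # dl: decoder depth
print(config.d_model)             # dm: embedding dimension
print(config.d_kv)                # kv: key/value projection dimension
print(config.num_heads)           # nh: number of attention heads (16 for this checkpoint)
print(config.d_ff)                # ff: feed-forward dimension
```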
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is also given after the framework-specific lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
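The loading sketch referenced above is shown here. It is a minimal, illustrative PyTorch example rather than one of the official training scripts; the input texts and the task prefix are placeholders:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nh16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Toy seq2seq pair; the checkpoint itself is pretrained-only and needs fine-tuning.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # loss to minimize during fine-tuning
print(float(loss))
```

In a real fine-tuning run, this forward pass would sit inside a training loop such as the ones implemented in the example scripts linked above.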
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nh16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NH16 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NH16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nh16 - is of model type Base with the following variations:
* nh is 16
It has 251.24 million parameters and thus requires *ca.* 1004.97 MB of memory in full precision (*fp32*)
or 502.49 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NH24 (Deep-Narrow version)
T5-Efficient-BASE-NH24 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nh24** - is of model type **Base** with the following variations:
- **nh** is **24**
It has **307.87** million parameters and thus requires *ca.* **1231.47 MB** of memory in full precision (*fp32*)
or **615.73 MB** of memory in half precision (*fp16* or *bf16*).
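If the half-precision footprint quoted above is the target, recent versions of `transformers` can load the weights directly in `bfloat16`; a minimal sketch (the `torch_dtype` argument is assumed to be available in the installed version):

```python
import torch
from transformers import T5ForConditionalGeneration

# Load the weights in bfloat16, roughly halving the fp32 memory footprint.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-base-nh24",
    torch_dtype=torch.bfloat16,
)
print(sum(p.numel() for p in model.parameters()))  # ~307.87 million parameters
```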
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal single-step training sketch is also given after the framework-specific lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
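As a complement to the example scripts above, the single-step training sketch referenced earlier is shown here. It is illustrative only: the toy data, the learning rate and the optimizer choice are assumptions, not recommendations from the paper:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nh24"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One illustrative optimization step on a toy summarization pair.
batch = tokenizer(["summarize: studies have shown that owning a dog is good for you"],
                  return_tensors="pt")
labels = tokenizer(["owning a dog is good for you"], return_tensors="pt").input_ids

loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```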
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nh24
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NH24 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NH24 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nh24 - is of model type Base with the following variations:
* nh is 24
It has 307.87 million parameters and thus requires *ca.* 1231.47 MB of memory in full precision (*fp32*)
or 615.73 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NH32 (Deep-Narrow version)
T5-Efficient-BASE-NH32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nh32** - is of model type **Base** with the following variations:
- **nh** is **32**
It has **364.49** million parameters and thus requires *ca.* **1457.96 MB** of memory in full precision (*fp32*)
or **728.98 MB** of memory in half precision (*fp16* or *bf16*).
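These figures follow from simple byte counting; a minimal sketch of the arithmetic (assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and MB taken as 10^6 bytes):

```python
# Rough sanity check of the memory figures quoted above; it ignores activation
# memory and framework overhead and only counts the stored parameters.
params = 364.49e6            # parameter count of t5-efficient-base-nh32
fp32_mb = params * 4 / 1e6   # 4 bytes per parameter -> ~1457.96 MB
fp16_mb = params * 2 / 1e6   # 2 bytes per parameter -> ~728.98 MB
print(round(fp32_mb, 2), round(fp16_mb, 2))
```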
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
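For readers unfamiliar with the objective, here is a hand-constructed sketch of what span corruption looks like, assuming the standard T5 sentinel-token convention (illustrative only, not the exact C4 pre-training pipeline):

```python
# Random spans of the input are collapsed into sentinel tokens ...
original = "Thank you for inviting me to your party last week ."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."

# ... and the decoder is trained to reproduce the dropped spans, delimited by
# the same sentinels and closed by a final one.
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```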
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal PyTorch sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
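Independently of the example scripts above, a minimal PyTorch sketch of how a single fine-tuning step looks for this checkpoint (the toy summarization pair is made up for illustration):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Pretrained-only checkpoint from this card; it still needs task-specific fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base-nh32")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nh32")

inputs = tokenizer("summarize: The paper studies deep and narrow T5 variants.",
                   return_tensors="pt")
labels = tokenizer("Deep-narrow T5 variants are studied.",
                   return_tensors="pt").input_ids

# One forward pass; a fine-tuning loop would minimize this loss with an optimizer.
loss = model(**inputs, labels=labels).loss
loss.backward()
```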
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nh32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NH32 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NH32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nh32 - is of model type Base with the following variations:
* nh is 32
It has 364.49 million parameters and thus requires *ca.* 1457.96 MB of memory in full precision (*fp32*)
or 728.98 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NH8 (Deep-Narrow version)
T5-Efficient-BASE-NH8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nh8** - is of model type **Base** with the following variations:
- **nh** is **8**
It has **194.62** million parameters and thus requires *ca.* **778.48 MB** of memory in full precision (*fp32*)
or **389.24 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
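The abbreviations above map directly onto fields of the Transformers `T5Config`; a small sketch to inspect them for this checkpoint (the expected values follow the table and the *nh = 8* variation, assuming the checkpoint was exported with the usual field names):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-base-nh8")
print(config.num_layers, config.num_decoder_layers)  # nl (el / dl)
print(config.d_model, config.d_kv, config.d_ff)      # dm, kv, ff
print(config.num_heads)                              # nh, expected 8 here
```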
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a short sketch of the encoder-decoder adaptation follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
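The adaptation mentioned in the classification notes above mostly concerns the labels: with an encoder-decoder model the target is text rather than a class index. A minimal sketch of what that looks like (the SST-2-style prompt and label strings are illustrative assumptions, not a fixed format):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base-nh8")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nh8")

# The class label is encoded as target text instead of an integer id.
enc = tokenizer("sst2 sentence: a gripping, well-acted thriller.", return_tensors="pt")
label_ids = tokenizer("positive", return_tensors="pt").input_ids

loss = model(**enc, labels=label_ids).loss  # minimized during fine-tuning
```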
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nh8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NH8 (Deep-Narrow version)
===========================================
T5-Efficient-BASE-NH8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nh8 - is of model type Base with the following variations:
* nh is 8
It has 194.62 million parameters and thus requires *ca.* 778.48 MB of memory in full precision (*fp32*)
or 389.24 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL16 (Deep-Narrow version)
T5-Efficient-BASE-NL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl16** - is of model type **Base** with the following variations:
- **nl** is **16**
It has **289.02** million parameters and thus requires *ca.* **1156.07 MB** of memory in full precision (*fp32*)
or **578.03 MB** of memory in half precision (*fp16* or *bf16*).
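As a quick back-of-the-envelope check of these figures (a sketch, assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and 1 MB = 10^6 bytes, which is the convention the numbers above appear to follow):

```python
# Rough memory estimate for the weights only (no activations or optimizer state).
params = 289.02e6            # reported parameter count of t5-efficient-base-nl16

fp32_mb = params * 4 / 1e6   # 4 bytes per parameter in full precision
fp16_mb = params * 2 / 1e6   # 2 bytes per parameter in half precision

print(f"fp32: ~{fp32_mb:.2f} MB, fp16/bf16: ~{fp16_mb:.2f} MB")
# fp32: ~1156.08 MB, fp16/bf16: ~578.04 MB (tiny differences to the card are rounding)
```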
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
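The span-based MLM objective (span corruption) replaces randomly sampled spans of the input with sentinel tokens and trains the model to generate the dropped spans. A minimal illustration of the input/target format, with hand-picked spans for readability (during pre-training the spans are sampled randomly):

```python
# T5-style span corruption, illustrated with the tokenizer's sentinel tokens <extra_id_N>.
original     = "the quick brown fox jumps over the lazy dog"

# Suppose the spans "quick brown" and "lazy" were sampled for masking:
model_input  = "the <extra_id_0> fox jumps over the <extra_id_1> dog"
model_target = "<extra_id_0> quick brown <extra_id_1> lazy <extra_id_2>"

print(model_input, "->", model_target)
```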
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal loading sketch is also given after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
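Independently of those scripts, the checkpoint loads like any other T5 model in Transformers. Below is a minimal sketch of a single seq2seq training step (the summarization pair is made up purely for illustration):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# One illustrative training step; real fine-tuning should use the example scripts above.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()
print(float(loss))
```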
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend reading the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** carefully to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL16 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl16 - is of model type Base with the following variations:
* nl is 16
It has 289.02 million parameters and thus requires *ca.* 1156.07 MB of memory in full precision (*fp32*)
or 578.03 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend reading the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers carefully to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL2 (Deep-Narrow version)
T5-Efficient-BASE-NL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl2** - is of model type **Base** with the following variations:
- **nl** is **2**
It has **57.72** million parameters and thus requires *ca.* **230.88 MB** of memory in full precision (*fp32*)
or **115.44 MB** of memory in half precision (*fp16* or *bf16*).
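To actually realize the half-precision footprint quoted above, the weights can be loaded directly in *fp16* or *bf16*. A minimal sketch (the choice of `bfloat16` is an assumption for illustration, not something the checkpoint prescribes):

```python
import torch
from transformers import T5ForConditionalGeneration

# Loading in half precision roughly halves the weight memory quoted above.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-base-nl2",
    torch_dtype=torch.bfloat16,
)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.2f}M parameters")  # ~57.7M
```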
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a sketch of the text-classification adaptation mentioned in the notes follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
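The adaptation mentioned in the text-classification notes above usually amounts to casting classification as text-to-text, i.e. generating the label as a short target string instead of attaching a classification head. A minimal sketch (the SST-2-style prompt and label strings are illustrative assumptions):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Classification as text-to-text: the class label is simply the target sequence.
inputs = tokenizer("sst2 sentence: the movie was an absolute delight", return_tensors="pt")
labels = tokenizer("positive", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # optimize this instead of a classification-head loss
print(float(loss))
```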
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend reading the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** carefully to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL2 (Deep-Narrow version)
===========================================
T5-Efficient-BASE-NL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl2 - is of model type Base with the following variations:
* nl is 2
It has 57.72 million parameters and thus requires *ca.* 230.88 MB of memory in full precision (*fp32*)
or 115.44 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend reading the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers carefully to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL24 (Deep-Narrow version)
T5-Efficient-BASE-NL24 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl24** - is of model type **Base** with the following variations:
- **nl** is **24**
It has **421.19** million parameters and thus requires *ca.* **1684.75 MB** of memory in full precision (*fp32*)
or **842.37 MB** of memory in half precision (*fp16* or *bf16*).
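The memory figures above follow from the parameter count under the usual assumption of 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16* (weights only, excluding gradients, optimizer state and activations). A minimal back-of-the-envelope check:

```python
# Rough weights-only memory estimate for t5-efficient-base-nl24.
# Assumption: 4 bytes/param (fp32), 2 bytes/param (fp16/bf16); the tiny
# difference vs. the card's numbers comes from rounding the parameter count.
num_params = 421.19e6

print(f"fp32     : {num_params * 4 / 1e6:.2f} MB")   # ~1684.76 MB
print(f"fp16/bf16: {num_params * 2 / 1e6:.2f} MB")   # ~842.38 MB
```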
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
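As an illustration of the span-corruption objective (and assuming the checkpoint ships the standard T5 sentinel-token tokenizer), corrupted spans in the input are replaced by sentinels such as `<extra_id_0>` and the target reconstructs exactly those spans. The snippet below is a hand-constructed sketch of the input/target format only; the actual pre-training pipeline (span lengths, corruption rate, sequence packing) is not reproduced here.

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("google/t5-efficient-base-nl24")

# Original sentence: "The cute dog walks in the green park"
corrupted_input = "The <extra_id_0> walks in <extra_id_1> park"
span_targets = "<extra_id_0> cute dog <extra_id_1> the green <extra_id_2>"

input_ids = tok(corrupted_input, return_tensors="pt").input_ids
labels = tok(span_targets, return_tensors="pt").input_ids
print(input_ids.shape, labels.shape)
```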
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal loading sketch is given after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
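As a common starting point for the examples above, the checkpoint loads like any other T5 model in Transformers (assuming the usual tokenizer files are present in the repository). The following is only a minimal sketch of a single seq2seq training step; the dummy text pair, learning rate and batching are placeholders, not the fine-tuning recipe used in the paper.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl24"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Dummy summarization-style pair; replace with a real dataset.
enc = tokenizer(["summarize: The quick brown fox jumps over the lazy dog."],
                return_tensors="pt", padding=True)
labels = tokenizer(["A fox jumps over a dog."],
                   return_tensors="pt", padding=True).input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(input_ids=enc.input_ids,
             attention_mask=enc.attention_mask,
             labels=labels).loss
loss.backward()
optimizer.step()
print(float(loss))
```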
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl24
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL24 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL24 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl24 - is of model type Base with the following variations:
* nl is 24
It has 421.19 million parameters and thus requires *ca.* 1684.75 MB of memory in full precision (*fp32*)
or 842.37 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL32 (Deep-Narrow version)
T5-Efficient-BASE-NL32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl32** - is of model type **Base** with the following variations:
- **nl** is **32**
It has **553.36** million parameters and thus requires *ca.* **2213.43 MB** of memory in full precision (*fp32*)
or **1106.71 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
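The architecture values above can also be read directly from the checkpoint's configuration. A small sketch (attribute names follow the `T5Config` class in Transformers):

```python
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained("google/t5-efficient-base-nl32")

print("nl (encoder/decoder blocks):", cfg.num_layers, "/", cfg.num_decoder_layers)  # 32 / 32
print("dm (model dimension):       ", cfg.d_model)    # 768 (Base width)
print("ff (feed-forward dimension):", cfg.d_ff)       # 3072
print("kv (key/value dimension):   ", cfg.d_kv)       # 64
print("nh (attention heads):       ", cfg.num_heads)  # 12
```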
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (see the adaptation sketch after the list for the text-classification case):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
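Regarding the repeated note on adapting the text-classification examples: since T5 has no classification head, the usual approach is to express the label as target text and train with the regular seq2seq loss. A minimal hand-written sketch (the task prefix and label strings below are illustrative choices, not a convention fixed by this checkpoint):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

text = "sst2 sentence: the movie was an absolute delight"
label = "positive"  # classification label expressed as target text

enc = tokenizer(text, return_tensors="pt")
labels = tokenizer(label, return_tensors="pt").input_ids

loss = model(input_ids=enc.input_ids,
             attention_mask=enc.attention_mask,
             labels=labels).loss
print(float(loss))
```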
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL32 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl32 - is of model type Base with the following variations:
* nl is 32
It has 553.36 million parameters and thus requires *ca.* 2213.43 MB of memory in full precision (*fp32*)
or 1106.71 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL36 (Deep-Narrow version)
T5-Efficient-BASE-NL36 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl36** - is of model type **Base** with the following variations:
- **nl** is **36**
It has **619.44** million parameters and thus requires *ca.* **2477.77 MB** of memory in full precision (*fp32*)
or **1238.88 MB** of memory in half precision (*fp16* or *bf16*).
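The stated parameter count can be checked after loading the checkpoint; a quick sketch (the exact number may differ marginally from the rounded figure above):

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nl36")

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.2f}M parameters")           # roughly 619M
print(f"fp32 weights: {num_params * 4 / 1e6:.2f} MB")  # roughly 2478 MB
```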
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the encoder and the decoder depth correspond to *nl*.
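As a quick sanity check of the figures above, the checkpoint can be loaded with the Transformers library and its parameters counted directly. This is a minimal sketch, assuming `transformers` and `torch` are installed (the call downloads the PyTorch weights):

```python
from transformers import T5ForConditionalGeneration

# Load the pretrained-only checkpoint (PyTorch weights).
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nl36")

# Count parameters and derive the approximate memory footprint quoted above.
num_params = sum(p.numel() for p in model.parameters())
print(f"parameters:  {num_params / 1e6:.2f} M")      # ~619 M
print(f"fp32 memory: {num_params * 4 / 1e6:.2f} MB")  # ~2478 MB
print(f"fp16 memory: {num_params * 2 / 1e6:.2f} MB")  # ~1239 MB
```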
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
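For readers unfamiliar with the objective, T5-style span corruption replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans. The sketch below only illustrates the input/target format of one such training pair (the sentences are made up; the exact corruption procedure is described in the original T5 paper), assuming the checkpoint ships the standard T5 tokenizer files:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base-nl36")

# Corrupted input: each masked span is replaced by a sentinel token <extra_id_0>, <extra_id_1>, ...
input_text = "Thank you <extra_id_0> me to your party <extra_id_1> week."
# Target: the dropped spans, each preceded by the sentinel token that replaced it.
target_text = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

inputs = tokenizer(input_text, return_tensors="pt")
labels = tokenizer(target_text, return_tensors="pt").input_ids
print(inputs.input_ids.shape, labels.shape)
```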
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal PyTorch sketch follows these lists):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
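If you prefer a self-contained starting point over the full example scripts, the sketch below performs a single fine-tuning step for summarization in plain PyTorch. It is only an illustration, not the official recipe; the `summarize:` task prefix, the learning rate and the toy document/summary pair are assumptions made for the example:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-base-nl36"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

document = "summarize: The tower is 324 metres tall, about the same height as an 81-storey building."
summary = "The tower is about as tall as an 81-storey building."

inputs = tokenizer(document, return_tensors="pt", truncation=True)
labels = tokenizer(summary, return_tensors="pt", truncation=True).input_ids

# One optimization step: the model computes the cross-entropy loss itself when labels are passed.
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"loss: {loss.item():.3f}")
```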
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl36
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL36 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL36 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl36 - is of model type Base with the following variations:
* nl is 36
It has 619.44 million parameters and thus requires *ca.* 2477.77 MB of memory in full precision (*fp32*)
or 1238.88 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the encoder and the decoder depth correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL4 (Deep-Narrow version)
T5-Efficient-BASE-NL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl4** - is of model type **Base** with the following variations:
- **nl** is **4**
It has **90.76** million parameters and thus requires *ca.* **363.05 MB** of memory in full precision (*fp32*)
or **181.52 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the encoder and the decoder depth correspond to *nl*.
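The reported parameter count can be reproduced to a good approximation with a back-of-the-envelope calculation. The sketch below ignores layer norms and the relative position bias, so it slightly underestimates the exact number; the Base dimensions (*dm* = 768, *ff* = 3072) come from the table above, and the vocabulary size of 32128 is the standard T5 value:

```python
d_model, d_ff, vocab_size, nl = 768, 3072, 32128, 4

embedding = vocab_size * d_model                      # shared input/output embedding matrix
encoder_block = 4 * d_model**2 + 2 * d_model * d_ff   # self-attention + feed-forward
decoder_block = 8 * d_model**2 + 2 * d_model * d_ff   # self-attention + cross-attention + feed-forward

total = embedding + nl * (encoder_block + decoder_block)
print(f"~{total / 1e6:.2f} M parameters")  # ~90.73 M, close to the reported 90.76 M
```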
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
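Because the checkpoint was only trained with this span-corruption objective, the raw model can merely fill in sentinel spans; it does not solve downstream tasks out of the box. A minimal sketch of that behaviour, assuming the standard T5 tokenizer files are available (the generated completion is not expected to be useful without fine-tuning):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-base-nl4"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Ask the model to fill in the two masked spans.
input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park.", return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```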
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us
|
T5-Efficient-BASE-NL4 (Deep-Narrow version)
===========================================
T5-Efficient-BASE-NL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl4 - is of model type Base with the following variations:
* nl is 4
It has 90.76 million parameters and thus requires *ca.* 363.05 MB of memory in full precision (*fp32*)
or 181.52 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the encoder and the decoder depth correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
75
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #text-generation-inference #region-us \n"
] |
[
-0.05570411682128906,
0.12282758951187134,
-0.005034130997955799,
0.08672954142093658,
0.06898552924394608,
-0.011954421177506447,
0.1395179033279419,
0.1621740311384201,
-0.0670320987701416,
-0.03256862238049507,
0.14385086297988892,
0.16604116559028625,
0.032107751816511154,
0.099956214427948,
-0.0730796679854393,
-0.20465558767318726,
0.049706459045410156,
0.01590156927704811,
-0.06527746468782425,
0.10919924825429916,
0.11073058098554611,
-0.03938174620270729,
0.07847720384597778,
-0.039514027535915375,
-0.10183439403772354,
0.025842783972620964,
0.04374360665678978,
-0.10783270001411438,
0.09400172531604767,
0.06060252711176872,
0.024000072851777077,
0.07796210050582886,
-0.009618537500500679,
-0.10489112883806229,
0.031828805804252625,
0.05335342884063721,
-0.07550960034132004,
0.10573862493038177,
0.08413807302713394,
0.020345916971564293,
0.04621202498674393,
0.03509976714849472,
-0.03911476954817772,
0.05877332389354706,
-0.10045237839221954,
-0.15002314746379852,
-0.08060825616121292,
0.069659024477005,
0.041695769876241684,
0.08840835839509964,
0.036654699593782425,
0.16261403262615204,
-0.06959223747253418,
0.08145783841609955,
0.16198232769966125,
-0.36947792768478394,
-0.0027636727318167686,
0.025417599827051163,
-0.010110324248671532,
0.08880250155925751,
-0.026761390268802643,
0.024338532239198685,
0.05924758315086365,
0.012043720111250877,
0.06407403945922852,
-0.05773497372865677,
-0.2867133319377899,
0.04683360084891319,
-0.07425887882709503,
-0.0681416466832161,
0.3276005685329437,
0.011956805363297462,
0.027790002524852753,
0.022181646898388863,
-0.11815889924764633,
-0.0006244563846848905,
0.010343744419515133,
0.0005634057451970875,
0.03453032672405243,
0.08833464980125427,
0.04846115782856941,
-0.06883382797241211,
-0.1426284909248352,
-0.032682064920663834,
-0.22130867838859558,
-0.011272410862147808,
0.00565978791564703,
0.08330198377370834,
-0.15057235956192017,
0.06372974812984467,
0.05195802450180054,
-0.14092710614204407,
0.05449538305401802,
-0.059571024030447006,
0.07395626604557037,
0.029512939974665642,
-0.03099535033106804,
-0.0556771419942379,
0.12498443573713303,
0.16858932375907898,
0.02996494248509407,
-0.004873601719737053,
-0.10336856544017792,
0.09128722548484802,
-0.0357963852584362,
0.0284517053514719,
-0.06937769055366516,
0.0009763659327290952,
0.13987500965595245,
-0.06481536477804184,
0.06863997876644135,
-0.06524720042943954,
-0.15415649116039276,
-0.052902355790138245,
0.03019748069345951,
0.11550955474376678,
0.08701970428228378,
0.04508454352617264,
-0.03243367001414299,
-0.041083551943302155,
0.0675392746925354,
-0.09223248809576035,
-0.029587063938379288,
0.029510390013456345,
-0.02717173472046852,
0.0453907772898674,
0.037643104791641235,
0.029602261260151863,
-0.11876709759235382,
-0.030176084488630295,
-0.07090640068054199,
-0.03221173956990242,
-0.0016904843505471945,
-0.06733360141515732,
0.09069954603910446,
-0.04382643848657608,
0.0012819600524380803,
-0.15084129571914673,
-0.1899523138999939,
0.036709461361169815,
0.03216804563999176,
-0.011734962463378906,
-0.10622259974479675,
0.01750107854604721,
-0.05360542982816696,
0.027879822999238968,
-0.05899248272180557,
0.03745562583208084,
-0.07308074086904526,
0.0791182592511177,
-0.10898207128047943,
0.038618527352809906,
-0.1747450828552246,
0.050581980496644974,
-0.15469790995121002,
-0.016742568463087082,
-0.005721233319491148,
0.0011331752175465226,
-0.04396280273795128,
0.13895659148693085,
-0.09165682643651962,
-0.018479634076356888,
-0.012800591066479683,
-0.021717168390750885,
0.002469486789777875,
0.13217882812023163,
-0.199951633810997,
-0.046126898378133774,
0.14011776447296143,
-0.0923740565776825,
-0.21153153479099274,
0.09347816556692123,
0.019892442971467972,
0.04977783188223839,
0.05831635743379593,
0.21679480373859406,
0.057242024689912796,
0.024616090580821037,
0.0789419561624527,
0.11873488873243332,
-0.08576957136392593,
-0.1488705426454544,
0.08600755035877228,
-0.06758228689432144,
-0.07225816696882248,
0.02306186780333519,
0.01743476837873459,
0.06348901987075806,
-0.011859234422445297,
-0.08625810593366623,
-0.04664988070726395,
-0.033854298293590546,
0.038426708430051804,
-0.020378364250063896,
0.07613706588745117,
-0.06286407262086868,
0.008113310672342777,
-0.054408393800258636,
0.051983095705509186,
0.015503056347370148,
0.04026037082076073,
-0.03140820562839508,
0.08287984877824783,
-0.009266623295843601,
0.06555290520191193,
-0.13648349046707153,
-0.0309186652302742,
0.03064938448369503,
0.04953187704086304,
0.0402371808886528,
0.07455120980739594,
0.0345231294631958,
-0.035468824207782745,
0.015509014017879963,
-0.010629232972860336,
0.08448012918233871,
0.016018550843000412,
-0.04109097272157669,
-0.1583581566810608,
0.05033468082547188,
-0.017749104648828506,
0.025973495095968246,
-0.03679458424448967,
0.02534607984125614,
0.008571547456085682,
0.07755140960216522,
-0.0008130799978971481,
0.09803840517997742,
0.02502051740884781,
-0.031355082988739014,
-0.04712055251002312,
-0.019204160198569298,
0.08812778443098068,
0.03367113322019577,
-0.060932304710149765,
0.17975027859210968,
-0.1138605922460556,
0.2805148661136627,
0.160171777009964,
-0.17627672851085663,
0.08846516907215118,
-0.011358972638845444,
-0.022707968950271606,
-0.028180019930005074,
0.05388337001204491,
0.011986793018877506,
0.008809306658804417,
-0.0036244168877601624,
0.15214000642299652,
-0.11950327455997467,
-0.020460357889533043,
-0.0170882735401392,
-0.04434298351407051,
0.023135807365179062,
0.07981380820274353,
0.12498633563518524,
-0.21477828919887543,
0.15863870084285736,
0.32449209690093994,
0.0016439036699011922,
0.12228608131408691,
-0.10682942718267441,
-0.06353677809238434,
0.036234088242053986,
-0.003913433291018009,
-0.024936173111200333,
-0.00656896224245429,
-0.03933307155966759,
0.04796266183257103,
0.11112105846405029,
0.029884420335292816,
0.03880082443356514,
-0.11484035104513168,
-0.0506577268242836,
-0.006949932314455509,
-0.05471048504114151,
-0.014661346562206745,
0.054812345653772354,
0.015269694849848747,
0.1359664499759674,
-0.03281427174806595,
-0.07899575680494308,
0.12288303673267365,
0.0010208992753177881,
-0.10396036505699158,
0.17385101318359375,
-0.18913260102272034,
-0.2583014965057373,
-0.11486788839101791,
-0.05207446217536926,
-0.0781259536743164,
-0.023762274533510208,
0.10172223299741745,
-0.06483521312475204,
-0.03305550292134285,
-0.05217665806412697,
-0.047416865825653076,
-0.020764926448464394,
0.007962789386510849,
-0.023941662162542343,
0.049400921911001205,
0.018720747902989388,
-0.14233148097991943,
-0.010740024968981743,
-0.00206738174892962,
-0.04914933815598488,
0.08165809512138367,
-0.05681855231523514,
0.06686026602983475,
0.16272549331188202,
0.02332107536494732,
0.024849172681570053,
-0.03688204661011696,
0.12501376867294312,
-0.003573574125766754,
0.021501833572983742,
0.2531941533088684,
-0.018275870010256767,
0.05697769299149513,
0.13157403469085693,
0.018079835921525955,
-0.048802122473716736,
0.004683513194322586,
-0.040672071278095245,
-0.03611711040139198,
-0.2932734787464142,
-0.08497098833322525,
-0.09851180762052536,
0.13384483754634857,
0.04416899010539055,
0.08536102622747421,
0.13607187569141388,
0.03893570974469185,
-0.025576181709766388,
0.05880200117826462,
0.004363357089459896,
0.052305836230516434,
0.23157402873039246,
-0.0009328394662588835,
0.09261489659547806,
-0.1009460911154747,
-0.01624683104455471,
0.14794620871543884,
0.04620811715722084,
0.10566814243793488,
0.04406449943780899,
0.19140304625034332,
0.04715624079108238,
0.17922645807266235,
0.05770609527826309,
0.12508824467658997,
0.04144202172756195,
0.0022819554433226585,
-0.048923518508672714,
-0.07890604436397552,
-0.025620222091674805,
0.013397100381553173,
-0.07113829255104065,
-0.06432990729808807,
-0.04789054021239281,
-0.013092461973428726,
0.06623168289661407,
0.16647613048553467,
0.0537080354988575,
-0.17543193697929382,
0.013044450432062149,
0.029875701293349266,
0.0022038116585463285,
-0.04937604442238808,
0.08536721020936966,
0.011178065091371536,
-0.09527018666267395,
0.024005603045225143,
-0.027623632922768593,
0.1270289421081543,
0.00233447109349072,
0.06595228612422943,
-0.03943086043000221,
0.008047153241932392,
0.022804439067840576,
0.12337320297956467,
-0.3111293613910675,
0.17482581734657288,
0.013890242204070091,
-0.04445768892765045,
-0.09811215102672577,
-0.004814379382878542,
0.0508960597217083,
0.1174883022904396,
0.07892464846372604,
0.008121246472001076,
-0.006799246650189161,
0.04198957234621048,
-0.13014322519302368,
0.06602685153484344,
0.028936702758073807,
-0.01628023386001587,
-0.06072122976183891,
-0.05760903283953667,
-0.006256370805203915,
0.024293040856719017,
0.09179990738630295,
-0.06399688869714737,
-0.09458586573600769,
0.07638496160507202,
0.059861816465854645,
-0.033772390335798264,
-0.03881452605128288,
-0.03521130234003067,
-0.043592557311058044,
0.17240381240844727,
-0.027513276785612106,
-0.06854669004678726,
-0.1031888872385025,
-0.05124082416296005,
0.07727695256471634,
-0.06879234313964844,
0.04835452139377594,
-0.06520906090736389,
-0.011632749810814857,
-0.026798103004693985,
-0.24599693715572357,
0.1431661695241928,
-0.10387803614139557,
-0.08024198561906815,
-0.028748132288455963,
0.12782049179077148,
-0.13622449338436127,
0.034987855702638626,
0.00685861287638545,
-0.005023123696446419,
-0.08322577178478241,
-0.08446717262268066,
-0.0009934785775840282,
-0.01565943844616413,
-0.011509710922837257,
-0.0007127832504920661,
-0.08958468586206436,
-0.0479840449988842,
0.01428944244980812,
-0.030629640445113182,
0.2840288579463959,
0.20170719921588898,
-0.07759648561477661,
0.1929580420255661,
0.19298067688941956,
-0.05785050615668297,
-0.2746768295764923,
-0.10785812884569168,
-0.1308738887310028,
-0.06095614656805992,
0.02482675015926361,
-0.19080661237239838,
0.06766372174024582,
0.06515968590974808,
-0.06334258615970612,
0.11768530309200287,
-0.2804555296897888,
-0.07906436920166016,
0.18210196495056152,
0.00698821060359478,
0.23600812256336212,
-0.13701999187469482,
-0.09183479100465775,
-0.054273925721645355,
-0.14733482897281647,
0.1798621267080307,
-0.2108396738767624,
0.0635811910033226,
-0.028483780100941658,
-0.02052517794072628,
0.004613215569406748,
-0.015760617330670357,
0.07334162294864655,
-0.04201570153236389,
0.06566096097230911,
-0.12429068237543106,
0.039458051323890686,
0.13555492460727692,
-0.00584886409342289,
0.06893850117921829,
-0.16460810601711273,
0.0654645711183548,
-0.08294710516929626,
0.021399367600679398,
-0.06964141875505447,
0.05700312927365303,
-0.005546221975237131,
-0.09105651080608368,
-0.02480972185730934,
-0.04520290344953537,
0.02193109132349491,
-0.017169104889035225,
0.14894141256809235,
0.05772387981414795,
0.06643081456422806,
0.17773862183094025,
0.1077333614230156,
-0.10202119499444962,
0.02990851178765297,
-0.09203239530324936,
-0.0699254646897316,
0.07899381220340729,
-0.2083473950624466,
0.041257306933403015,
0.10253090411424637,
-0.03003925085067749,
0.05923517420887947,
0.05417567119002342,
-0.018431570380926132,
-0.05750332772731781,
0.10915028303861618,
-0.19066955149173737,
-0.0016436516307294369,
-0.04653415456414223,
0.07329300791025162,
0.01351103000342846,
0.038329485803842545,
0.1365024745464325,
-0.0062185730785131454,
-0.027538534253835678,
0.041138242930173874,
0.029382098466157913,
-0.035766903311014175,
0.060946252197027206,
0.07483623921871185,
0.0016359075671061873,
-0.14144286513328552,
0.173092320561409,
0.053047578781843185,
-0.04594302922487259,
0.025969885289669037,
0.18284769356250763,
-0.1244397982954979,
-0.12907923758029938,
0.05886175483465195,
0.11469587683677673,
-0.09422916918992996,
-0.06956297159194946,
-0.06987392902374268,
-0.10129812359809875,
0.06951490044593811,
0.11035279184579849,
0.054424092173576355,
0.06670401990413666,
-0.016755783930420876,
-0.09880576282739639,
0.031289879232645035,
0.04876509681344032,
-0.04356060549616814,
-0.0008795495377853513,
-0.10109391063451767,
0.024883583188056946,
-0.02076825685799122,
0.12098028510808945,
-0.04325319826602936,
-0.01316527184098959,
-0.07263671606779099,
0.004254651255905628,
-0.1699688583612442,
-0.025108525529503822,
-0.020984090864658356,
-0.022584736347198486,
-0.012796435505151749,
-0.02617758698761463,
-0.054184358566999435,
0.01572447642683983,
-0.1169048473238945,
-0.038089774549007416,
-0.025690700858831406,
0.06945615261793137,
-0.1181618794798851,
-0.041407354176044464,
0.055282220244407654,
-0.000641151680611074,
0.16800306737422943,
0.07072217762470245,
-0.06573931872844696,
0.10509854555130005,
-0.16099543869495392,
-0.11527203768491745,
0.06121814250946045,
0.06261979788541794,
0.0345909520983696,
0.04923246428370476,
-0.015753118321299553,
0.08422551304101944,
-0.05746765807271004,
0.016137193888425827,
0.013127660378813744,
-0.11238182336091995,
-0.03499456122517586,
-0.04737129434943199,
-0.08807849138975143,
-0.049709171056747437,
-0.059628844261169434,
0.06050391122698784,
0.03213513270020485,
0.13215863704681396,
-0.024584809318184853,
0.05731859803199768,
-0.12072695046663284,
0.008046028204262257,
-0.02392396703362465,
-0.14991939067840576,
-0.17588452994823456,
-0.0445500984787941,
0.0005256273434497416,
-0.05333923548460007,
0.13374261558055878,
0.004476295318454504,
-0.1123279482126236,
0.0348467119038105,
0.10014140605926514,
0.016375776380300522,
0.015840744599699974,
0.24402152001857758,
0.05682697147130966,
-0.019305327907204628,
-0.07185734063386917,
0.016831720247864723,
0.02415654994547367,
0.025942185893654823,
0.11701535433530807,
0.08957409113645554,
0.0685979351401329,
0.07196531444787979,
0.0657234936952591,
-0.0462668351829052,
-0.038644518703222275,
-0.03935271501541138,
0.056529756635427475,
0.12234323471784592,
0.008564425632357597,
0.06300852447748184,
0.1874171942472458,
0.008037508465349674,
0.008482879027724266,
-0.02462194114923477,
-0.021467847749590874,
-0.1542261391878128,
-0.16365589201450348,
-0.07757946848869324,
-0.10224281996488571,
-0.03500665724277496,
-0.08750391006469727,
0.036674100905656815,
0.05264698714017868,
0.06698419153690338,
-0.07758472859859467,
0.012724372558295727,
0.03560100495815277,
-0.10753808170557022,
0.05140981823205948,
-0.035544995218515396,
-0.008911787532269955,
-0.035189103335142136,
-0.04339638724923134,
-0.018236534669995308,
-0.0015892150113359094,
-0.037340838462114334,
0.02542048692703247,
0.03558829426765442,
0.056515272706747055,
-0.13415499031543732,
-0.09856957942247391,
-0.015451623126864433,
0.00030238943872973323,
-0.007271090056747198,
0.16309064626693726,
0.057698123157024384,
-0.002164888195693493,
0.10125632584095001,
0.2078016698360443,
-0.04780400171875954,
-0.12318052351474762,
-0.054041869938373566,
0.1128292828798294,
-0.008157117292284966,
0.027095161378383636,
0.014891940169036388,
-0.00944147165864706,
-0.05440564826130867,
0.22700592875480652,
0.3027529716491699,
-0.14668981730937958,
0.01941136084496975,
-0.008022304624319077,
0.011104089207947254,
0.017621733248233795,
0.14713054895401,
0.11342642456293106,
0.14660701155662537,
-0.07228822261095047,
-0.009538468904793262,
-0.036421943455934525,
0.016099020838737488,
-0.16327962279319763,
0.08671069890260696,
0.040196552872657776,
-0.09609886258840561,
0.0209068451076746,
0.08349285274744034,
-0.10606203228235245,
0.12082625180482864,
-0.11412054300308228,
-0.13213174045085907,
-0.09275808185338974,
-0.04053623229265213,
0.12055382877588272,
0.06831949204206467,
0.019226176664233208,
-0.04991500452160835,
-0.0024300843942910433,
0.03420693799853325,
-0.014701534993946552,
-0.20846503973007202,
-0.03270589932799339,
0.09585238993167877,
-0.11072534322738647,
0.09742837399244308,
0.008501806296408176,
0.0861034244298935,
0.0835358127951622,
0.06714050471782684,
-0.10976028442382812,
0.07394906133413315,
0.02099701575934887,
-0.01609017886221409,
0.005726792849600315,
-0.088374063372612,
0.00543031794950366,
-0.05168148875236511,
0.07484707981348038,
-0.03537111356854439,
0.004332484677433968,
0.07922820001840591,
-0.03273746371269226,
-0.022600093856453896,
-0.0021388104651123285,
-0.060846418142318726,
0.06569106876850128,
0.04145483672618866,
-0.04808998852968216,
-0.029950670897960663,
-0.08855261653661728,
-0.035723794251680374,
0.03993440419435501,
-0.1921888142824173,
-0.057377249002456665,
-0.0029520217794924974,
-0.03955553472042084,
0.03593825176358223,
0.03997822105884552,
-0.1218360960483551,
-0.01419318001717329,
-0.11239119619131088,
-0.008159349672496319,
-0.18428082764148712,
0.0609917938709259,
0.10473144799470901,
-0.017214074730873108,
0.025681229308247566,
0.03486568108201027,
0.018481282517313957,
0.03192055597901344,
-0.11023510247468948,
-0.07595492899417877
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL40 (Deep-Narrow version)
T5-Efficient-BASE-NL40 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl40** - is of model type **Base** with the following variations:
- **nl** is **40**
It has **685.53** million parameters and thus requires *ca.* **2742.11 MB** of memory in full precision (*fp32*)
or **1371.05 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal code sketch is included after the examples):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
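As a minimal sketch of what fine-tuning looks like in code (PyTorch and the Transformers library, assuming the checkpoint ships the standard T5 tokenizer), the snippet below runs a single training step; the input/target strings and the learning rate are purely illustrative, and the linked example scripts remain the recommended starting point:

```python
# A single illustrative fine-tuning step; real training loops, data loading and
# evaluation are handled by the linked example scripts.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl40"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5 casts every task as text-to-text, e.g. summarization:
inputs = tokenizer(
    "summarize: The quick brown fox jumped over the lazy dog near the river bank.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumped over a dog.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)  # illustrative lr
loss = model(**inputs, labels=labels).loss  # decoder inputs are shifted internally
loss.backward()
optimizer.step()
```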
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl40
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-BASE-NL40 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL40 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl40 - is of model type Base with the following variations:
* nl is 40
It has 685.53 million parameters and thus requires *ca.* 2742.11 MB of memory in full precision (*fp32*)
or 1371.05 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL48 (Deep-Narrow version)
T5-Efficient-BASE-NL48 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl48** - is of model type **Base** with the following variations:
- **nl** is **48**
It has **817.7** million parameters and thus requires *ca.* **3270.79 MB** of memory in full precision (*fp32*)
or **1635.39 MB** of memory in half precision (*fp16* or *bf16*).
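If you want to verify the reported size yourself, a small sketch that loads the checkpoint and counts its parameters (note that this downloads the full fp32 weights, roughly 3.3 GB) might look as follows:

```python
# Count the parameters of the loaded checkpoint to verify the reported size.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nl48")
n_params = sum(p.numel() for p in model.parameters())

print(f"{n_params / 1e6:.2f} M parameters")       # ~817.7 M
print(f"{n_params * 4 / 1e6:.2f} MB in fp32")     # ~3270.8 MB
```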
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal code sketch is included after the examples):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
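For orientation, the sketch below shows the span-corruption (sentinel-token) input/target format that the MLM pre-training objective uses; the same `model(input_ids, labels=...)` call also yields the loss you optimize when fine-tuning. The example strings are illustrative only:

```python
# Span-corruption (sentinel-token) format used by the T5 MLM objective; masked
# spans in the input are replaced by <extra_id_0>, <extra_id_1>, ... and the
# target reconstructs them. The strings below are illustrative only.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-base-nl48"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt")
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>",
                   return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
print(float(loss))
```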
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl48
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-BASE-NL48 (Deep-Narrow version)
============================================
T5-Efficient-BASE-NL48 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-base-nl48 - is of model type Base with the following variations:
* nl is 48
It has 817.7 million parameters and thus requires *ca.* 3270.79 MB of memory in full precision (*fp32*)
or 1635.39 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-BASE-NL8 (Deep-Narrow version)
T5-Efficient-BASE-NL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-base-nl8** - is of model type **Base** with the following variations:
- **nl** is **8**
It has **156.85** million parameters and thus requires *ca.* **627.39 MB** of memory in full precision (*fp32*)
or **313.69 MB** of memory in half precision (*fp16* or *bf16*).
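To actually realize the half-precision footprint quoted above, the weights can be loaded directly in *fp16*; a minimal sketch (assuming a recent Transformers version that supports the `torch_dtype` argument) is:

```python
# Load the weights directly in half precision (~313 MB instead of ~627 MB).
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-base-nl8",
    torch_dtype=torch.float16,
)
print(model.dtype)  # torch.float16
```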
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
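For illustration, the span-based objective replaces contiguous spans of the input with sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, ...) and trains the model to reconstruct the dropped spans. A minimal sketch, assuming `transformers` is installed:

```python
# Minimal sketch of the span-corruption (MLM) objective used during pre-training.
# Sentinel tokens mark the corrupted spans in the input; the target lists the
# dropped spans, each preceded by its sentinel token.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base-nl8")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nl8")

inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt")
labels = tokenizer(
    "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt"
).input_ids

loss = model(**inputs, labels=labels).loss  # standard denoising cross-entropy
```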
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal sketch of a single training step is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
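As referenced above, here is a minimal sketch of a single supervised training step, assuming a PyTorch environment (the source/target pair is purely illustrative; the example scripts above implement the full training loops):

```python
# Minimal sketch of one supervised fine-tuning step (summarization-style).
# The source/target pair is made up purely for illustration.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base-nl8")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base-nl8")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

source = "summarize: The quick brown fox jumped over the lazy dog near the river."
target = "A fox jumped over a dog."

batch = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

loss = model(**batch, labels=labels).loss  # cross-entropy over the target tokens
loss.backward()
optimizer.step()
optimizer.zero_grad()
```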
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base-nl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-BASE-NL8 (Deep-Narrow version)
===========================================
T5-Efficient-BASE-NL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base-nl8 - is of model type Base with the following variations:
* nl is 8
It has 156.85 million parameters and thus requires *ca.* 627.39 MB of memory in full precision (*fp32*)
or 313.69 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-BASE (Deep-Narrow version)
T5-Efficient-BASE is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-base** - is of model type **Base** with no variations.
It has **222.93** million parameters and thus requires *ca.* **891.73 MB** of memory in full precision (*fp32*)
or **445.86 MB** of memory in half precision (*fp16* or *bf16*).
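The hyper-parameters behind these figures can be read from the checkpoint configuration. A minimal sketch, assuming `transformers` is installed (the attribute names follow the abbreviations used in the tables below):

```python
# Minimal sketch: inspect the architecture hyper-parameters of this checkpoint.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-base")
print(config.num_layers)          # nl / el: encoder depth (12 for Base)
print(config.num_decoder_layers)  # dl: decoder depth (12 for Base)
print(config.d_model)             # dm: 768
print(config.d_ff)                # ff: 3072
print(config.d_kv)                # kv: 64
print(config.num_heads)           # nh: 12
```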
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a sketch of the encoder-decoder classification adaptation is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
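As referenced above, adapting a text-classification example to an encoder-decoder model essentially means verbalizing the class label as target text. A minimal sketch of that idea (the task prefix and label strings are illustrative assumptions):

```python
# Minimal sketch: classification with an encoder-decoder model by predicting the
# label as text. The prefix and label strings below are illustrative only.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-base")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-base")

text = "sst2 sentence: the movie was a delightful surprise"
label_text = "positive"  # text target instead of an integer class id

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label_text, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # trained as ordinary text-to-text generation
```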
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-base
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-BASE (Deep-Narrow version)
=======================================
T5-Efficient-BASE is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-base - is of model type Base with no variations.
It has 222.93 million parameters and thus requires *ca.* 891.73 MB of memory in full precision (*fp32*)
or 445.86 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL12 (Deep-Narrow version)
T5-Efficient-LARGE-DL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl12** - is of model type **Large** with the following variations:
- **dl** is **12**
It has **536.34** million parameters and thus requires *ca.* **2145.37 MB** of memory in full precision (*fp32*)
or **1072.69 MB** of memory in half precision (*fp16* or *bf16*).
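The *dl* variation is visible directly in the checkpoint configuration, where the decoder is half as deep as the encoder. A minimal sketch, assuming `transformers` is installed:

```python
# Minimal sketch: the DL12 variation keeps the Large encoder depth (24 blocks)
# but uses a 12-block decoder.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-large-dl12")
print(config.num_layers)          # el = 24 (Large encoder depth)
print(config.num_decoder_layers)  # dl = 12 (the variation of this checkpoint)
print(config.d_model, config.d_ff, config.num_heads)  # 1024, 4096, 16
```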
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is given after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
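Independently of the example scripts above, the checkpoint can be loaded with the standard `transformers` seq2seq classes. The sketch below is a minimal loading and forward-pass example (a generic pattern, not the training setup used in the paper), assuming `transformers` and PyTorch are installed.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-dl12"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Example seq2seq training step on a single made-up pair. Since this checkpoint
# is pretrained-only, generations are not meaningful before fine-tuning.
inputs = tokenizer(
    "summarize: studies have shown that owning a dog is good for you",
    return_tensors="pt",
)
labels = tokenizer("owning a dog is good for you", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
print(float(loss))
```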
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL12 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-DL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl12 - is of model type Large with the following variations:
* dl is 12
It has 536.34 million parameters and thus requires *ca.* 2145.37 MB of memory in full precision (*fp32*)
or 1072.69 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL16 (Deep-Narrow version)
T5-Efficient-LARGE-DL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl16** - is of model type **Large** with the following variations:
- **dl** is **16**
It has **603.47** million parameters and thus requires *ca.* **2413.88 MB** of memory in full precision (*fp32*)
or **1206.94 MB** of memory in half precision (*fp16* or *bf16*).
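As a quick plausibility check, these memory figures follow directly from the parameter count (4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, counting 10^6 bytes per MB). The short sketch below reproduces them using the rounded parameter count stated on this card.

```python
# Back-of-the-envelope memory estimate from the parameter count stated above.
# fp32 stores 4 bytes per parameter, fp16/bf16 store 2 bytes per parameter.
num_params = 603.47e6  # t5-efficient-large-dl16 (rounded value from this card)

print(f"fp32: {num_params * 4 / 1e6:.2f} MB")  # ~2413.88 MB
print(f"fp16: {num_params * 2 / 1e6:.2f} MB")  # ~1206.94 MB
```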
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
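For illustration, the span-based MLM (span-corruption) objective replaces contiguous token spans in the input with sentinel tokens and trains the model to emit the dropped spans in order. The following is a minimal, hand-constructed sketch of one input/target pair; the sentence is made up, and `<extra_id_0>`, `<extra_id_1>`, ... are the sentinel tokens used by the T5 tokenizer.

```python
# Hand-constructed example of a span-corruption training pair (not taken from C4).
original = "The quick brown fox jumps over the lazy dog"

# Two spans ("quick brown" and "over") are dropped from the input and replaced
# by sentinel tokens; the target lists the dropped spans, each preceded by its
# sentinel, and is closed by one final sentinel.
corrupted_input = "The <extra_id_0> fox jumps <extra_id_1> the lazy dog"
target = "<extra_id_0> quick brown <extra_id_1> over <extra_id_2>"

print("input :", corrupted_input)
print("target:", target)
```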
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is given after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
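Independently of the example scripts above, the checkpoint can be loaded with the standard `transformers` seq2seq classes. The sketch below is a minimal loading and forward-pass example (a generic pattern, not the training setup used in the paper), assuming `transformers` and PyTorch are installed.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-dl16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Example seq2seq training step on a single made-up pair. Since this checkpoint
# is pretrained-only, generations are not meaningful before fine-tuning.
inputs = tokenizer(
    "summarize: studies have shown that owning a dog is good for you",
    return_tensors="pt",
)
labels = tokenizer("owning a dog is good for you", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
print(float(loss))
```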
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL16 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-DL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl16 - is of model type Large with the following variations:
* dl is 16
It has 603.47 million parameters and thus requires *ca.* 2413.88 MB of memory in full precision (*fp32*)
or 1206.94 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL2 (Deep-Narrow version)
T5-Efficient-LARGE-DL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl2** - is of model type **Large** with the following variations:
- **dl** is **2**
It has **368.53** million parameters and thus requires *ca.* **1474.11 MB** of memory in full precision (*fp32*)
or **737.05 MB** of memory in half precision (*fp16* or *bf16*).
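As a quick plausibility check, these memory figures follow directly from the parameter count (4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, counting 10^6 bytes per MB). The short sketch below reproduces them; it only uses the rounded parameter count stated on this card, so the last digit may differ slightly.

```python
# Back-of-the-envelope memory estimate from the parameter count stated above.
# fp32 stores 4 bytes per parameter, fp16/bf16 store 2 bytes per parameter.
num_params = 368.53e6  # t5-efficient-large-dl2 (rounded value from this card)

print(f"fp32: {num_params * 4 / 1e6:.2f} MB")  # ~1474.12 MB
print(f"fp16: {num_params * 2 / 1e6:.2f} MB")  # ~737.06 MB
```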
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformer block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
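For illustration, the span-based MLM (span-corruption) objective replaces contiguous token spans in the input with sentinel tokens and trains the model to emit the dropped spans in order. The following is a minimal, hand-constructed sketch of one input/target pair; the sentence is made up, and `<extra_id_0>`, `<extra_id_1>`, ... are the sentinel tokens used by the T5 tokenizer.

```python
# Hand-constructed example of a span-corruption training pair (not taken from C4).
original = "The quick brown fox jumps over the lazy dog"

# Two spans ("quick brown" and "over") are dropped from the input and replaced
# by sentinel tokens; the target lists the dropped spans, each preceded by its
# sentinel, and is closed by one final sentinel.
corrupted_input = "The <extra_id_0> fox jumps <extra_id_1> the lazy dog"
target = "<extra_id_0> quick brown <extra_id_1> over <extra_id_2>"

print("input :", corrupted_input)
print("target:", target)
```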
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is given after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
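Independently of which example script you pick, data preparation for an encoder-decoder model boils down to tokenizing an input text and a target text. Below is a minimal sketch for a summarization-style pair; the `article`/`summary` strings are placeholders for your own data, and the `summarize:` prefix follows the original T5 convention:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-dl2")

# Hypothetical example pair; replace with fields from your own dataset.
article = "summarize: The quick brown fox jumped over the lazy dog."
summary = "A fox jumped over a dog."

inputs = tokenizer(article, max_length=512, truncation=True, return_tensors="pt")
# T5 uses the same tokenizer for source and target, so the summary can be tokenized directly.
labels = tokenizer(summary, max_length=64, truncation=True, return_tensors="pt")

# During fine-tuning the target ids are passed to the model as `labels`.
inputs["labels"] = labels["input_ids"]
```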
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL2 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-DL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl2 - is of model type Large with the following variations:
* dl is 2
It has 368.53 million parameters and thus requires *ca.* 1474.11 MB of memory in full precision (*fp32*)
or 737.05 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL32 (Deep-Narrow version)
T5-Efficient-LARGE-DL32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl32** - is of model type **Large** with the following variations:
- **dl** is **32**
It has **871.98** million parameters and thus requires *ca.* **3487.91 MB** of memory in full precision (*fp32*)
or **1743.96 MB** of memory in half precision (*fp16* or *bf16*).
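These memory figures follow directly from the parameter count: every parameter occupies 4 bytes in fp32 and 2 bytes in fp16/bf16. A back-of-the-envelope check (plain Python, no downloads required; small differences are due to rounding of the reported parameter count):

```python
n_params = 871.98e6  # parameter count reported above

fp32_mb = n_params * 4 / 1e6  # 4 bytes per parameter
fp16_mb = n_params * 2 / 1e6  # 2 bytes per parameter

print(f"fp32: ~{fp32_mb:.2f} MB, fp16/bf16: ~{fp16_mb:.2f} MB")
# -> fp32: ~3487.92 MB, fp16/bf16: ~1743.96 MB
```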
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
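With span-based MLM (also called *span corruption*), contiguous spans of the input are replaced by sentinel tokens and the model is trained to generate the dropped spans. The snippet below only illustrates what such an input/target pair looks like with the T5 sentinel tokens; the example sentence and the masked spans are chosen by hand for illustration rather than sampled as during pre-training:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-dl32")

# Original sentence: "Thank you for inviting me to your party last week."
# Two spans are dropped from the input and replaced by sentinel tokens;
# the target enumerates each sentinel followed by the span it replaced.
input_text = "Thank you <extra_id_0> me to your party <extra_id_1> week."
target_text = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

batch = tokenizer(input_text, return_tensors="pt")
batch["labels"] = tokenizer(target_text, return_tensors="pt").input_ids
```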
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal end-to-end training sketch is included after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
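For orientation, the plumbing those scripts set up looks roughly like the sketch below. This is a heavily abridged, hypothetical example: the toy in-memory dataset, the output directory and the hyperparameters are placeholders, and a real run should use one of the linked scripts or a proper dataset.

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    T5ForConditionalGeneration,
)

model_id = "google/t5-efficient-large-dl32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Toy in-memory dataset; in practice you would load e.g. a summarization dataset instead.
raw = Dataset.from_dict({
    "text": ["summarize: A very small toy document used only to illustrate the plumbing."],
    "summary": ["A toy document."],
})

def tokenize(batch):
    model_inputs = tokenizer(batch["text"], max_length=512, truncation=True)
    model_inputs["labels"] = tokenizer(batch["summary"], max_length=64, truncation=True)["input_ids"]
    return model_inputs

train_dataset = raw.map(tokenize, batched=True, remove_columns=["text", "summary"])

args = Seq2SeqTrainingArguments(
    output_dir="t5-efficient-large-dl32-finetuned",  # placeholder output directory
    learning_rate=1e-4,                              # placeholder hyperparameters
    per_device_train_batch_size=1,
    num_train_epochs=1,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```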
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL32 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-DL32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl32 - is of model type Large with the following variations:
* dl is 32
It has 871.98 million parameters and thus requires *ca.* 3487.91 MB of memory in full precision (*fp32*)
or 1743.96 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL4 (Deep-Narrow version)
T5-Efficient-LARGE-DL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl4** - is of model type **Large** with the following variations:
- **dl** is **4**
It has **402.09** million parameters and thus requires *ca.* **1608.36 MB** of memory in full precision (*fp32*)
or **804.18 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal preprocessing sketch for the classification case is included after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
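As a minimal sketch of how the checkpoint can be loaded before running any of the examples above (the task prefix and texts are placeholders, not part of the official scripts):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Pretrained-only checkpoint; it still has to be fine-tuned for a downstream task.
model_id = "google/t5-efficient-large-dl4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# T5-style models are usually fine-tuned with a task prefix, e.g. for summarization.
inputs = tokenizer("summarize: " + "Some long article text ...", return_tensors="pt")
labels = tokenizer("A short summary.", return_tensors="pt").input_ids

# A single forward pass returns the seq2seq cross-entropy loss used for fine-tuning.
loss = model(**inputs, labels=labels).loss
loss.backward()
```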
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL4 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-DL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl4 - is of model type Large with the following variations:
* dl is 4
It has 402.09 million parameters and thus requires *ca.* 1608.36 MB of memory in full precision (*fp32*)
or 804.18 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL6 (Deep-Narrow version)
T5-Efficient-LARGE-DL6 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl6** - is of model type **Large** with the following variations:
- **dl** is **6**
It has **435.65** million parameters and thus requires *ca.* **1742.61 MB** of memory in full precision (*fp32*)
or **871.31 MB** of memory in half precision (*fp16* or *bf16*).
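These figures follow directly from the parameter count (assuming roughly 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16*, counting 1 MB as 10^6 bytes, and ignoring activations or optimizer state); a quick back-of-the-envelope check:

```python
# Rough sanity check of the memory figures quoted above.
num_params = 435.65e6  # parameters of t5-efficient-large-dl6

fp32_mb = num_params * 4 / 1e6  # ~1742.6 MB in full precision
fp16_mb = num_params * 2 / 1e6  # ~871.3 MB in half precision
print(f"fp32: {fp32_mb:.2f} MB, fp16/bf16: {fp16_mb:.2f} MB")
```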
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl6
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL6 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-DL6 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl6 - is of model type Large with the following variations:
* dl is 6
It has 435.65 million parameters and thus requires *ca.* 1742.61 MB of memory in full precision (*fp32*)
or 871.31 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DL8 (Deep-Narrow version)
T5-Efficient-LARGE-DL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dl8** - is of model type **Large** with the following variations:
- **dl** is **8**
It has **469.22** million parameters and thus requires *ca.* **1876.87 MB** of memory in full precision (*fp32*)
or **938.43 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
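To illustrate the objective (this is only a sketch of what a span-corrupted training pair looks like, using T5's sentinel tokens, not the actual pre-training code):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-dl8")

# Dropped-out spans in the input are replaced by sentinel tokens; the target
# reconstructs the dropped spans, each preceded by its sentinel token.
corrupted_input = "The <extra_id_0> walks in <extra_id_1> park"
target = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(corrupted_input, return_tensors="pt").input_ids
labels = tokenizer(target, return_tensors="pt").input_ids
```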
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DL8 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-DL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dl8 - is of model type Large with the following variations:
* dl is 8
It has 469.22 million parameters and thus requires *ca.* 1876.87 MB of memory in full precision (*fp32*)
or 938.43 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend the reader to go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DM128 (Deep-Narrow version)
T5-Efficient-LARGE-DM128 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dm128** - is of model type **Large** with the following variations:
- **dm** is **128**
It has **92.27** million parameters and thus requires *ca.* **369.06 MB** of memory in full precision (*fp32*)
or **184.53 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
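To make the abbreviations concrete, the following hedged sketch maps them onto the corresponding `T5Config` fields in `transformers` and builds a randomly initialized model with roughly this checkpoint's shape. It does not load the released weights, and the released checkpoint's exact configuration may differ in minor details, so treat it as illustrative rather than a reconstruction:

```python
# Hedged sketch: the abbreviations above expressed as T5Config fields.
# Builds a randomly initialized model shaped roughly like t5-efficient-large-dm128.
from transformers import T5Config, T5ForConditionalGeneration

config = T5Config(
    num_layers=24,          # el -- encoder depth
    num_decoder_layers=24,  # dl -- decoder depth
    d_model=128,            # dm -- embedding/hidden dimension (this card's variation)
    d_kv=64,                # kv -- key/value projection dimension
    num_heads=16,           # nh -- number of attention heads
    d_ff=4096,              # ff -- feed-forward dimension
)
model = T5ForConditionalGeneration(config)
print(f"{model.num_parameters() / 1e6:.1f}M parameters")
```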
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dm128
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DM128 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-DM128 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dm128 - is of model type Large with the following variations:
* dm is 128
It has 92.27 million parameters and thus requires *ca.* 369.06 MB of memory in full precision (*fp32*)
or 184.53 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend the reader to go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DM2000 (Deep-Narrow version)
T5-Efficient-LARGE-DM2000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dm2000** - is of model type **Large** with the following variations:
- **dm** is **2000**
It has **1475.39** million parameters and thus requires *ca.* **5901.57 MB** of memory in full precision (*fp32*)
or **2950.78 MB** of memory in half precision (*fp16* or *bf16*).
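If you want to verify the quoted parameter count yourself, loading the ported checkpoint and calling `num_parameters()` should give a figure close to the one above (note that this downloads roughly 5.9 GB of *fp32* weights):

```python
# Verifying the quoted parameter count (downloads ~5.9 GB of fp32 weights).
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-dm2000")
print(f"{model.num_parameters() / 1e6:.2f}M parameters")  # expected to be close to 1475.39M
```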
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model; a short inference sketch for the fine-tuned model is shown after the list below:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
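Once fine-tuned, inference works like any other encoder-decoder model in `transformers`. The sketch below assumes a hypothetical summarization fine-tune saved at `path/to/your-finetuned-checkpoint` (a placeholder path) and uses the usual `generate()` API:

```python
# Post-fine-tuning inference sketch; "path/to/your-finetuned-checkpoint" is a
# placeholder, and the "summarize:" prefix assumes a summarization fine-tune.
from transformers import AutoTokenizer, T5ForConditionalGeneration

checkpoint = "path/to/your-finetuned-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

batch = tokenizer("summarize: Some long input document ...", return_tensors="pt")
output_ids = model.generate(**batch, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```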
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dm2000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DM2000 (Deep-Narrow version)
===============================================
T5-Efficient-LARGE-DM2000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dm2000 - is of model type Large with the following variations:
* dm is 2000
It has 1475.39 million parameters and thus requires *ca.* 5901.57 MB of memory in full precision (*fp32*)
or 2950.78 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DM256 (Deep-Narrow version)
T5-Efficient-LARGE-DM256 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dm256** - is of model type **Large** with the following variations:
- **dm** is **256**
It has **184.47** million parameters and thus requires *ca.* **737.9 MB** of memory in full precision (*fp32*)
or **368.95 MB** of memory in half precision (*fp16* or *bf16*).
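The memory figures above follow directly from the parameter count: roughly 4 bytes per parameter in full precision and 2 bytes in half precision. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the memory figures quoted above.
params = 184.47e6             # reported parameter count
fp32_mb = params * 4 / 1e6    # 4 bytes/parameter -> ~737.9 MB
fp16_mb = params * 2 / 1e6    # 2 bytes/parameter -> ~368.9 MB
print(round(fp32_mb, 1), round(fp16_mb, 1))  # 737.9 368.9
```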
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal training-step sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
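Regardless of the framework, the core fine-tuning step is a standard seq2seq forward pass with labels. Below is a minimal sketch with made-up toy inputs; in practice, use one of the example scripts listed above with a proper dataset and optimizer.

```python
# Minimal sketch of one seq2seq training step (made-up toy example, not a full training loop).
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-dm256")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-dm256")

inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)  # returns the cross-entropy loss when labels are given
outputs.loss.backward()                   # plug into your own optimizer / Trainer loop
```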
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dm256
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DM256 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-DM256 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dm256 - is of model type Large with the following variations:
* dm is 256
It has 184.47 million parameters and thus requires *ca.* 737.9 MB of memory in full precision (*fp32*)
or 368.95 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DM512 (Deep-Narrow version)
T5-Efficient-LARGE-DM512 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dm512** - is of model type **Large** with the following variations:
- **dm** is **512**
It has **368.89** million parameters and thus requires *ca.* **1475.56 MB** of memory in full precision (*fp32*)
or **737.78 MB** of memory in half precision (*fp16* or *bf16*).
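The half-precision figure can be exploited directly at load time via the `torch_dtype` argument; note that training purely in fp16/bf16 may require additional care (e.g. mixed-precision training). A minimal, hedged sketch:

```python
# Hedged sketch: loading the checkpoint in bf16 to roughly halve the memory footprint.
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large-dm512", torch_dtype=torch.bfloat16
)
print(model.dtype)  # torch.bfloat16
```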
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dm512
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DM512 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-DM512 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dm512 - is of model type Large with the following variations:
* dm is 512
It has 368.89 million parameters and thus requires *ca.* 1475.56 MB of memory in full precision (*fp32*)
or 737.78 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-DM768 (Deep-Narrow version)
T5-Efficient-LARGE-DM768 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-dm768** - is of model type **Large** with the following variations:
- **dm** is **768**
It has **553.31** million parameters and thus requires *ca.* **2213.23 MB** of memory in full precision (*fp32*)
or **1106.62 MB** of memory in half precision (*fp16* or *bf16*).
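These figures follow directly from storing 4 bytes per parameter in full precision and 2 bytes per parameter in half precision. A rough back-of-the-envelope sketch (weights only, with MB taken as 10^6 bytes; gradients, optimizer state and activations come on top during fine-tuning):

```python
# Rough memory-footprint arithmetic behind the numbers above (weights only).
params = 553.31e6            # parameter count of t5-efficient-large-dm768
print(params * 4 / 1e6)      # ≈ 2213 MB in full precision (fp32)
print(params * 2 / 1e6)      # ≈ 1107 MB in half precision (fp16 / bf16)
```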
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model (a minimal sketch of this adaptation follows the list below).
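As mentioned in the classification notes above, the encoder-decoder adaptation boils down to casting the class label as a text target and training with the usual seq2seq loss. The following is a minimal sketch of that idea; the prompt format and label mapping are illustrative and not taken from the example scripts:

```python
# Sketch: turn a classification example into the text-to-text format expected
# by an encoder-decoder model such as T5.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-dm768"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

label_text = {0: "negative", 1: "positive"}  # illustrative label mapping

text, label = "A thoroughly enjoyable read.", 1
inputs = tokenizer("classify sentiment: " + text, return_tensors="pt")
labels = tokenizer(label_text[label], return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # standard seq2seq training loss
```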
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-dm768
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-DM768 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-DM768 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-dm768 - is of model type Large with the following variations:
* dm is 768
It has 553.31 million parameters and thus requires *ca.* 2213.23 MB of memory in full precision (*fp32*)
or 1106.62 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-EL12 (Deep-Narrow version)
T5-Efficient-LARGE-EL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-el12** - is of model type **Large** with the following variations:
- **el** is **12**
It has **586.69** million parameters and thus requires *ca.* **2346.78 MB** of memory in full precision (*fp32*)
or **1173.39 MB** of memory in half precision (*fp16* or *bf16*).
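Because only the encoder depth deviates from the standard *Large* configuration, the asymmetry can be checked from the model configuration alone, without downloading the weights. A small sketch (attribute names are those of the `transformers` `T5Config`; the expected values follow from the variation listed above):

```python
# Inspect the encoder/decoder depths of this checkpoint from its config only.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-large-el12")
print(config.num_layers)          # encoder depth (el) -> 12 for this variant
print(config.num_decoder_layers)  # decoder depth (dl) -> 24, as in the standard Large model
print(config.d_model, config.d_ff, config.num_heads)  # 1024, 4096, 16
```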
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-el12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-EL12 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-EL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-el12 - is of model type Large with the following variations:
* el is 12
It has 586.69 million parameters and thus requires *ca.* 2346.78 MB of memory in full precision (*fp32*)
or 1173.39 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-EL2 (Deep-Narrow version)
T5-Efficient-LARGE-EL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-el2** - is of model type **Large** with the following variations:
- **el** is **2**
It has **460.84** million parameters and thus requires *ca.* **1843.34 MB** of memory in full precision (*fp32*)
or **921.67 MB** of memory in half precision (*fp16* or *bf16*).
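These figures follow directly from the parameter count at roughly 4 bytes per parameter in full precision and 2 bytes in half precision. A small illustrative check (not from the paper; decimal megabytes are assumed):

```python
# Back-of-the-envelope check of the memory figures quoted above (illustrative).
n_params = 460.84e6               # parameters reported for this checkpoint

fp32_mb = n_params * 4 / 1e6      # 4 bytes per parameter, decimal megabytes
fp16_mb = n_params * 2 / 1e6      # 2 bytes per parameter

print(f"fp32: ~{fp32_mb:.0f} MB, fp16/bf16: ~{fp16_mb:.0f} MB")
# ~1843 MB and ~922 MB, matching the numbers above.
```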
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal single-step training sketch is shown after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
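For orientation, the following is a minimal, illustrative sketch of a single seq2seq training step on this checkpoint; it is not taken from the linked examples, and the toy document/summary pair is made up.

```python
# Minimal single-step fine-tuning sketch (illustrative, not an official recipe).
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-el2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
model.train()
loss = model(**inputs, labels=labels).loss   # seq2seq cross-entropy loss
loss.backward()
optimizer.step()
print(float(loss))
```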
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-el2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-EL2 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-EL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-el2 - is of model type Large with the following variations:
* el is 2
It has 460.84 million parameters and thus requires *ca.* 1843.34 MB of memory in full precision (*fp32*)
or 921.67 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-EL4 (Deep-Narrow version)
T5-Efficient-LARGE-EL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-el4** - is of model type **Large** with the following variations:
- **el** is **4**
It has **486.01** million parameters and thus requires *ca.* **1944.03 MB** of memory in full precision (*fp32*)
or **972.01 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
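To make the objective concrete, span corruption replaces contiguous token spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans. A schematic illustration (a toy example, not the actual T5/C4 preprocessing pipeline):

```python
# Schematic of T5-style span corruption (toy example for illustration only).
original = "Thank you for inviting me to your party last week ."

# Suppose the spans "for inviting" and "last" are selected for corruption.
encoder_input  = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
decoder_target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

# The encoder sees `encoder_input`; the decoder is trained to emit
# `decoder_target`, i.e. each sentinel token followed by the span it replaced.
print(encoder_input)
print(decoder_target)
```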
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a brief half-precision loading sketch is shown after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
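Related to the memory figures above, the checkpoint can also be loaded directly in half precision before fine-tuning or evaluation. A minimal sketch, assuming `transformers` and a recent `torch` are installed:

```python
# Half-precision loading sketch (illustrative); roughly halves the memory
# footprint quoted in this card.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large-el4",
    torch_dtype=torch.bfloat16,   # or torch.float16 on hardware without bf16
)
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-el4")

print(sum(p.numel() for p in model.parameters()))   # ~486 million parameters
```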
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-el4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-EL4 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-EL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-el4 - is of model type Large with the following variations:
* el is 4
It has 486.01 million parameters and thus requires *ca.* 1944.03 MB of memory in full precision (*fp32*)
or 972.01 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-EL6 (Deep-Narrow version)
T5-Efficient-LARGE-EL6 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-el6** - is of model type **Large** with the following variations:
- **el** is **6**
It has **511.18** million parameters and thus requires *ca.* **2044.72 MB** of memory in full precision (*fp32*)
or **1022.36 MB** of memory in half precision (*fp16* or *bf16*).
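These memory figures follow directly from the parameter count, assuming 4 bytes per parameter in full precision, 2 bytes in half precision, and 1 MB = 10^6 bytes. A quick sanity check:

```python
# Quick check of the quoted memory figures (assumes 1 MB = 10^6 bytes).
params = 511.18e6            # parameters reported above
fp32_mb = params * 4 / 1e6   # 4 bytes per parameter in full precision
fp16_mb = params * 2 / 1e6   # 2 bytes per parameter in half precision
print(round(fp32_mb, 2), round(fp16_mb, 2))  # ~2044.72 MB and ~1022.36 MB
```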
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
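For illustration, the sketch below shows how these abbreviations map onto the `T5Config` fields in Transformers for this checkpoint. The *dm*, *ff*, *kv* and *nh* values are taken from the *Large* row of the table above, *el* is the overridden encoder depth, and the decoder depth of 24 is inferred from the rule above; it is not claimed to be a byte-for-byte copy of the released `config.json`.

```python
# Illustrative mapping of the abbreviations onto Transformers' T5Config for
# t5-efficient-large-el6 (Large defaults with el overridden to 6).
from transformers import T5Config

config = T5Config(
    d_model=1024,           # dm: embedding / hidden size of the Large row
    d_ff=4096,              # ff: feed-forward dimension of the Large row
    d_kv=64,                # kv: key/value projection dimension
    num_heads=16,           # nh: number of attention heads
    num_layers=6,           # el: encoder depth (the only overridden dimension)
    num_decoder_layers=24,  # dl: decoder depth stays at the Large default
)
print(config.num_layers, config.num_decoder_layers)
```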
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short sketch of adapting a classification example to an encoder-decoder model follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
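As noted above, the text-classification examples need a small adaptation for an encoder-decoder model. One common approach is to verbalize the class labels as short target strings and train with the regular seq2seq loss, as in the sketch below (the task prefix, label names and model id are illustrative assumptions):

```python
# Casting a classification task as text-to-text: the label becomes a short
# target string instead of an integer class id.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "google/t5-efficient-large-el6"  # assumed to match this card
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "sst2 sentence: the movie was a complete delight"
label_text = "positive"  # verbalized label rather than a class index

inputs = tokenizer(text, return_tensors="pt", truncation=True)
labels = tokenizer(label_text, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # regular seq2seq loss
print(float(loss))
```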
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-el6
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-EL6 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-EL6 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-el6 - is of model type Large with the following variations:
* el is 6
It has 511.18 million parameters and thus requires *ca.* 2044.72 MB of memory in full precision (*fp32*)
or 1022.36 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-EL8 (Deep-Narrow version)
T5-Efficient-LARGE-EL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-el8** - is of model type **Large** with the following variations:
- **el** is **8**
It has **536.35** million parameters and thus requires *ca.* **2145.4 MB** of memory in full precision (*fp32*)
or **1072.7 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a quick loading check follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
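Before adapting any of the examples above, it can be useful to verify that the checkpoint loads and matches the parameter count quoted earlier (*ca.* 536.35 million). A minimal check, assuming the model id below matches this card:

```python
# Load the checkpoint and compare its size against the figure quoted above.
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-efficient-large-el8")
print(f"{model.num_parameters() / 1e6:.2f}M parameters")  # expected ~536.35M
```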
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-el8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-EL8 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-EL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
--------------------------
This model checkpoint - t5-efficient-large-el8 - is of model type Large with the following variations:
* el is 8
It has 536.35 million parameters and thus requires *ca.* 2145.4 MB of memory in full precision (*fp32*)
or 1072.7 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader carefully read the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-KV128 (Deep-Narrow version)
T5-Efficient-LARGE-KV128 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-large-kv128** - is of model type **Large** with the following variations:
- **kv** is **128**
It has **1039.71** million parameters and thus requires *ca.* **4158.86 MB** of memory in full precision (*fp32*)
or **2079.43 MB** of memory in half precision (*fp16* or *bf16*).
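These memory figures follow directly from the parameter count at 4 bytes per parameter in fp32 and 2 bytes in fp16/bf16 (weights only, excluding activations, gradients and optimizer state). A quick back-of-the-envelope check, using the rounded parameter count from above:
```python
# Back-of-the-envelope estimate of checkpoint weight memory.
# Uses the rounded 1039.71M figure, so the result differs slightly
# from the card's numbers, which use the exact parameter count.
params = 1039.71e6

fp32_mb = params * 4 / 1e6  # ~4158.8 MB in full precision
fp16_mb = params * 2 / 1e6  # ~2079.4 MB in half precision
print(f"fp32: {fp32_mb:.1f} MB, fp16/bf16: {fp16_mb:.1f} MB")
```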
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
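These abbreviations correspond to fields of the `T5Config` in Transformers. The sketch below shows how the dimensions implied by the tables above would be expressed for this checkpoint; the values are read off the tables and the *kv128* variation, not from the checkpoint's actual `config.json`, which remains the authoritative source.
```python
# Sketch only: architecture dimensions implied by the tables above for
# t5-efficient-large-kv128 (Large backbone with kv widened to 128).
from transformers import T5Config

config = T5Config(
    num_layers=24,          # nl / el: encoder depth of the Large architecture
    num_decoder_layers=24,  # dl: decoder depth
    d_model=1024,           # dm: hidden / embedding size
    d_kv=128,               # kv: key/value projection size (the "kv128" variation)
    num_heads=16,           # nh: number of attention heads
    d_ff=4096,              # ff: feed-forward (intermediate) size
)
print(config.d_kv)  # 128
```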
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
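For intuition, the span-based MLM objective drops contiguous spans from the input, replaces each with a sentinel token, and trains the model to emit the dropped spans behind the same sentinels. A schematic example (the sentence itself is arbitrary):
```python
# Schematic illustration of T5's span-corruption objective, not actual
# pre-training code: sentinels mark the corrupted spans in the input,
# and the target lists the dropped spans after the matching sentinels.
original = "Thank you for inviting me to your party last week."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week."
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```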
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a short usage sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
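Out of the box, the pretrained-only checkpoint can only perform the span-filling task it was trained on, which is why fine-tuning is required for the downstream tasks above. A small sketch of that default behaviour (the input sentence is arbitrary):
```python
# Sketch: the pretrained-only checkpoint fills in sentinel spans; it does not
# summarize, answer questions or classify until it has been fine-tuned.
from transformers import AutoTokenizer, T5ForConditionalGeneration

name = "google/t5-efficient-large-kv128"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

input_ids = tokenizer(
    "The <extra_id_0> walks in <extra_id_1> park.", return_tensors="pt"
).input_ids
generated = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(generated[0], skip_special_tokens=False))
```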
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader carefully read the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-kv128
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-KV128 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-KV128 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
--------------------------
This model checkpoint - t5-efficient-large-kv128 - is of model type Large with the following variations:
* kv is 128
It has 1039.71 million parameters and thus requires *ca.* 4158.86 MB of memory in full precision (*fp32*)
or 2079.43 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader carefully read the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-KV16 (Deep-Narrow version)
T5-Efficient-LARGE-KV16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-large-kv16** - is of model type **Large** with the following variations:
- **kv** is **16**
It has **511.23** million parameters and thus requires *ca.* **2044.93 MB** of memory in full precision (*fp32*)
or **1022.46 MB** of memory in half precision (*fp16* or *bf16*).
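The reduced footprint relative to the standard Large model comes almost entirely from the attention projections, whose size scales with *nh·kv*. A rough sketch of that effect, ignoring relative-position biases, layer norms and the feed-forward blocks:
```python
# Rough sketch: each T5 self-attention block holds four projection matrices
# (Q, K, V, O), each of size d_model x (num_heads * d_kv), without biases.
# Shrinking kv from the standard 64 down to 16 cuts these matrices by 4x.
d_model, num_heads = 1024, 16  # Large backbone dimensions from the table above

def attention_params(d_kv: int) -> int:
    inner = num_heads * d_kv
    return 4 * d_model * inner  # Q, K, V and output projections

print(attention_params(64))  # 4,194,304 parameters per attention block (standard Large)
print(attention_params(16))  # 1,048,576 parameters per attention block (this kv16 variant)
```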
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a sketch of the classification framing follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
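The adaptation mentioned for the text-classification examples is needed because an encoder-decoder model predicts label *strings* rather than class logits. A minimal sketch of that text-to-text framing; the task prefix and label verbalizations here are arbitrary choices, not part of the checkpoint:
```python
# Sketch: casting a classification example into T5's text-to-text format.
from transformers import AutoTokenizer, T5ForConditionalGeneration

name = "google/t5-efficient-large-kv16"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

text = "this movie was surprisingly good"
enc = tokenizer("sst2 sentence: " + text, return_tensors="pt")
labels = tokenizer("positive", return_tensors="pt").input_ids  # label as text

loss = model(**enc, labels=labels).loss  # ordinary seq2seq loss drives fine-tuning
```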
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader carefully read the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-kv16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-KV16 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-KV16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-kv16 - is of model type Large with the following variations:
* kv is 16
It has 511.23 million parameters and thus requires *ca.* 2044.93 MB of memory in full precision (*fp32*)
or 1022.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained on English text and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-KV256 (Deep-Narrow version)
T5-Efficient-LARGE-KV256 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-kv256** - is of model type **Large** with the following variations:
- **kv** is **256**
It has **1643.69** million parameters and thus requires *ca.* **6574.78 MB** of memory in full precision (*fp32*)
or **3287.39 MB** of memory in half precision (*fp16* or *bf16*).
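As a quick sanity check, these figures follow from simple arithmetic (decimal megabytes, 4 bytes per parameter in full precision, 2 bytes in half precision). The minimal Python sketch below reproduces them approximately:

```python
# Back-of-envelope memory estimate, using the decimal-MB convention of the figures above.
n_params = 1643.69e6          # parameters reported for t5-efficient-large-kv256

bytes_fp32 = n_params * 4     # 4 bytes per parameter in full precision
bytes_fp16 = n_params * 2     # 2 bytes per parameter in half precision

print(f"fp32: {bytes_fp32 / 1e6:.2f} MB")   # ~6574.8 MB
print(f"fp16: {bytes_fp16 / 1e6:.2f} MB")   # ~3287.4 MB
```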
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
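These abbreviations map onto fields of the `T5Config` class in Transformers. The following minimal sketch assumes the checkpoint's config file is hosted under the model id used throughout this card:

```python
from transformers import T5Config

# Inspect how the table abbreviations correspond to config fields.
config = T5Config.from_pretrained("google/t5-efficient-large-kv256")
print(config.num_layers, config.num_decoder_layers)  # el / dl (together: nl)
print(config.d_model)                                # dm
print(config.d_kv)                                   # kv -> 256 for this checkpoint
print(config.num_heads)                              # nh
print(config.d_ff)                                   # ff
```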
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
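The snippet below is purely illustrative of the span-corruption format behind this objective; the actual pretraining was done with the original T5 codebase on C4, not with this code. The example sentence is the one commonly used in the T5 paper, and the sentinel token names follow the `<extra_id_N>` convention of the T5 tokenizer.

```python
# Illustration of span-based MLM (span corruption): contiguous spans of the input
# are dropped and replaced by sentinel tokens; the target lists the dropped spans.
original = "Thank you for inviting me to your party last week ."

corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
target          = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```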
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained on English text and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model; a minimal sketch of this adaptation follows below.
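One common way to do the adaptation mentioned in the classification notes is to cast classification as text-to-text, i.e. to train the model to generate a short label string instead of attaching a classification head. A minimal, hedged sketch follows; the task prefix and example texts are made up for illustration:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-kv256")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-kv256")

# Made-up two-class example: the label is expressed as a short target string.
text = "sst2 sentence: this movie was a delight to watch"   # hypothetical task prefix + input
label_text = "positive"                                      # "positive" / "negative" target words

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label_text, return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss   # optimize this loss instead of a classifier head
```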
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-kv256
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-KV256 (Deep-Narrow version)
==============================================
T5-Efficient-LARGE-KV256 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-kv256 - is of model type Large with the following variations:
* kv is 256
It has 1643.69 million parameters and thus requires *ca.* 6574.78 MB of memory in full precision (*fp32*)
or 3287.39 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained on English text and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-KV32 (Deep-Narrow version)
T5-Efficient-LARGE-KV32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-kv32** - is of model type **Large** with the following variations:
- **kv** is **32**
It has **586.73** million parameters and thus requires *ca.* **2346.92 MB** of memory in full precision (*fp32*)
or **1173.46 MB** of memory in half precision (*fp16* or *bf16*).
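If you want to verify the parameter count yourself, the following minimal sketch (assuming the checkpoint loads with the standard Transformers T5 class) prints it:

```python
from transformers import T5ForConditionalGeneration

# Load the checkpoint and count its parameters.
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-kv32")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f}M parameters")  # should be close to the 586.73M stated above
```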
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained on English text and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
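For the summarization examples linked above, preprocessing for T5 usually amounts to prepending a task prefix and tokenizing the target text with the same tokenizer. A minimal, hedged sketch with made-up texts:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-kv32")

# T5 frames every task as text-to-text; summarization inputs carry a task prefix.
document = "The quick brown fox jumped over the lazy dog near the river bank."  # made-up example
summary = "A fox jumped over a dog."                                            # made-up example

model_inputs = tokenizer("summarize: " + document, max_length=512, truncation=True)
# For T5 the same tokenizer is used for targets; the resulting ids become the labels.
model_inputs["labels"] = tokenizer(summary, max_length=64, truncation=True).input_ids
```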
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are likely of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-kv32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-KV32 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-KV32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-large-kv32 - is of model type Large with the following variations:
* kv is 32
It has 586.73 million parameters and thus requires *ca.* 2346.92 MB of memory in full precision (*fp32*)
or 1173.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH12 (Deep-Narrow version)
T5-Efficient-LARGE-NH12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-large-nh12** - is of model type **Large** with the following variations:
- **nh** is **12**
It has **662.23** million parameters and thus requires *ca.* **2648.91 MB** of memory in full precision (*fp32*)
or **1324.45 MB** of memory in half precision (*fp16* or *bf16*).
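These memory figures follow directly from the parameter count. A quick back-of-the-envelope check (a small sketch assuming 4 bytes per parameter in fp32, 2 bytes in fp16/bf16, and decimal megabytes, i.e. 10^6 bytes):

```python
# Back-of-the-envelope check of the memory figures quoted above.
params = 662.23e6  # parameters of t5-efficient-large-nh12

MB = 1e6  # the numbers above use decimal megabytes
print(f"fp32: {params * 4 / MB:.2f} MB")  # ~2648.9 MB
print(f"fp16: {params * 2 / MB:.2f} MB")  # ~1324.5 MB
```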
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
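The abbreviations above map onto fields of the checkpoint's configuration in Transformers, so a specific variation can be verified without downloading any weights. A small sketch (the attribute names follow the `T5Config` class):

```python
# Inspect the architecture hyper-parameters behind the abbreviations above.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-large-nh12")

print("nl (encoder):", config.num_layers)          # encoder depth
print("dl (decoder):", config.num_decoder_layers)  # decoder depth
print("dm          :", config.d_model)             # embedding dimension
print("kv          :", config.d_kv)                # key/value projection size
print("nh          :", config.num_heads)           # attention heads (12 for this checkpoint)
print("ff          :", config.d_ff)                # feed-forward dimension
```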
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
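To make the objective concrete: span-based MLM replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the masked spans in the target. The snippet below is a hand-constructed illustration of that input/target format (the actual corruption during pre-training is performed automatically by the original pipeline, not by this code):

```python
# Hand-built illustration of T5's span-corruption (MLM) format:
# masked spans in the input become sentinel tokens, and the target
# lists each sentinel followed by the span it hides.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-nh12")

corrupted_input = "The <extra_id_0> walks in <extra_id_1> park"
target = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(corrupted_input, return_tensors="pt").input_ids
labels = tokenizer(target, return_tensors="pt").input_ids
print(input_ids.shape, labels.shape)
```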
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH12 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NH12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-large-nh12 - is of model type Large with the following variations:
* nh is 12
It has 662.23 million parameters and thus requires *ca.* 2648.91 MB of memory in full precision (*fp32*)
or 1324.45 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH2 (Deep-Narrow version)
T5-Efficient-LARGE-NH2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-large-nh2** - is of model type **Large** with the following variations:
- **nh** is **2**
It has **473.48** million parameters and thus requires *ca.* **1893.93 MB** of memory in full precision (*fp32*)
or **946.96 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
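The text-classification examples above need the adaptation mentioned in the notes because this is an encoder-decoder model: rather than attaching a classification head, a common approach is to cast the class labels into short target strings and fine-tune with the regular seq2seq loss. A minimal sketch of that idea (the prompt wording and label strings are arbitrary illustrative choices, not part of the linked examples):

```python
# Sketch: sentiment classification cast as text-to-text generation.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nh2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

texts = ["a delightful, well-acted film", "painfully dull from start to finish"]
labels = ["positive", "negative"]  # class names become the target strings

enc = tokenizer([f"classify sentiment: {t}" for t in texts],
                padding=True, return_tensors="pt")
targets = tokenizer(labels, padding=True, return_tensors="pt").input_ids
targets[targets == tokenizer.pad_token_id] = -100  # ignore padding in the loss

loss = model(**enc, labels=targets).loss
loss.backward()  # from here, plug into an optimizer or the Trainer API
print(f"loss: {loss.item():.3f}")
```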
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH2 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NH2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-large-nh2 - is of model type Large with the following variations:
* nh is 2
It has 473.48 million parameters and thus requires *ca.* 1893.93 MB of memory in full precision (*fp32*)
or 946.96 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short code sketch also follows the list below):
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
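The examples above all reduce to the same seq2seq training signal. As a minimal sketch (not part of the original card), the snippet below loads this checkpoint and computes the loss for a single input/target pair; it assumes the repository bundles the standard T5 SentencePiece tokenizer, and the "summarize:" prefix and toy sentence pair are purely illustrative.

```python
# Minimal sketch: one forward pass with labels, i.e. the loss a fine-tuning loop would
# backpropagate. Checkpoint ID taken from this card; tokenizer availability is assumed.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-large-nh2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

inputs = tokenizer("summarize: The quick brown fox jumped over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumped over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
print(float(loss))
```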
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH24 (Deep-Narrow version)
T5-Efficient-LARGE-NH24 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-large-nh24** - is of model type **Large** with the following variations:
- **nh** is **24**
It has **888.72** million parameters and thus requires *ca.* **3554.88 MB** of memory in full precision (*fp32*)
or **1777.44 MB** of memory in half precision (*fp16* or *bf16*).
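For readers who want to sanity-check these figures, the memory estimates follow directly from the parameter count at 4 bytes per parameter in fp32 and 2 bytes in fp16/bf16 (weights only, ignoring activations and optimizer state), using 1 MB = 10^6 bytes:

```python
# Back-of-the-envelope check of the memory figures quoted above (weights only).
num_params = 888.72e6        # parameters of t5-efficient-large-nh24
print(num_params * 4 / 1e6)  # fp32: 4 bytes/param      -> 3554.88 MB
print(num_params * 2 / 1e6)  # fp16/bf16: 2 bytes/param -> 1777.44 MB
```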
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
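As a rough illustration of that objective (a sketch assuming the checkpoint ships the standard T5 tokenizer, and following the generic T5 span-corruption format rather than anything specific to this checkpoint): contiguous spans of the input are replaced with sentinel tokens, and the target reconstructs the dropped spans.

```python
# Span-corruption (MLM) input/target format used during pre-training; the sentence is a
# toy example, not taken from C4.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-large-nh24"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt").input_ids
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss  # the quantity minimized during pre-training
```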
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short code sketch also follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
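The adaptation mentioned in the notes above usually just means casting classification into the text-to-text format. A minimal sketch follows; the task prefix and label words are illustrative assumptions, not prescribed by the paper or the example scripts.

```python
# Classification with an encoder-decoder model: the label is predicted as a short target
# string instead of through a classification head.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-large-nh24"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

text = "sst2 sentence: the movie was a delight from start to finish"
target = "positive"  # label words such as "positive" / "negative"

batch = tokenizer(text, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss  # fine-tuning minimizes this loss
```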
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh24
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH24 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NH24 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-large-nh24 - is of model type Large with the following variations:
* nh is 24
It has 888.72 million parameters and thus requires *ca.* 3554.88 MB of memory in full precision (*fp32*)
or 1777.44 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH32 (Deep-Narrow version)
T5-Efficient-LARGE-NH32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-large-nh32** - is of model type **Large** with the following variations:
- **nh** is **32**
It has **1039.72** million parameters and thus requires *ca.* **4158.86 MB** of memory in full precision (*fp32*)
or **2079.43 MB** of memory in half precision (*fp16* or *bf16*).
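If the half-precision footprint quoted above is the target, the checkpoint can be loaded directly in fp16. A minimal sketch (the `torch_dtype` argument is available in recent `transformers` releases):

```python
# Load the weights in half precision and verify the parameter count quoted above.
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large-nh32", torch_dtype=torch.float16
)
print(f"{model.num_parameters() / 1e6:.2f}M parameters")  # ca. 1039.72M
```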
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short code sketch also follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
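All of the linked scripts repeat a standard seq2seq training step. A minimal sketch of one such step (the optimizer choice, learning rate, and the toy article/summary pair are illustrative assumptions):

```python
# One fine-tuning step; the example scripts wrap this in a full Trainer / Flax training loop.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-large-nh32"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

batch = tokenizer("summarize: Researchers released a family of T5 checkpoints scaled along depth.",
                  return_tensors="pt")
labels = tokenizer("New T5 checkpoints were released.", return_tensors="pt").input_ids

loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```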
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH32 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NH32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-large-nh32 - is of model type Large with the following variations:
* nh is 32
It has 1039.72 million parameters and thus requires *ca.* 4158.86 MB of memory in full precision (*fp32*)
or 2079.43 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH4 (Deep-Narrow version)
T5-Efficient-LARGE-NH4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nh4** - is of model type **Large** with the following variations:
- **nh** is **4**
It has **511.23** million parameters and thus requires *ca.* **2044.92 MB** of memory in full precision (*fp32*)
or **1022.46 MB** of memory in half precision (*fp16* or *bf16*).
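As a quick sanity check, the parameter count above can be reproduced by loading the checkpoint and counting its parameters. This is a minimal sketch, assuming the checkpoint is available on the Hugging Face Hub under the name used in this card:

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nh4")

# Count parameters and estimate memory at 4 bytes (fp32) and 2 bytes (fp16/bf16) per parameter.
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.2f}M")
print(f"fp32: {n_params * 4 / 1024**2:.2f} MB | fp16/bf16: {n_params * 2 / 1024**2:.2f} MB")
```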
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal fine-tuning sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
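The sketch below distills what these examples have in common: load the tokenizer and model, build a seq2seq batch, and take an optimizer step. It is a minimal illustration only; the toy input/target pair and the learning rate are placeholders, not taken from the linked examples.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-large-nh4")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nh4")

# Toy seq2seq pair (placeholder data); the real examples iterate over a full dataset.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
outputs = model(**inputs, labels=labels)  # forward pass returns the seq2seq cross-entropy loss
outputs.loss.backward()
optimizer.step()
```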
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH4 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NH4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nh4 - is of model type Large with the following variations:
* nh is 4
It has 511.23 million parameters and thus requires *ca.* 2044.92 MB of memory in full precision (*fp32*)
or 1022.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH8-NL32 (Deep-Narrow version)
T5-Efficient-LARGE-NH8-NL32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nh8-nl32** - is of model type **Large** with the following variations:
- **nh** is **8**
- **nl** is **32**
It has **771.34** million parameters and thus requires *ca.* **3085.35 MB** of memory in full precision (*fp32*)
or **1542.68 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
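To illustrate the objective: contiguous spans of the input are replaced by sentinel tokens, and the decoder learns to reconstruct the dropped spans. The toy function below is a heavily simplified sketch of this idea (a single span, whitespace tokenization); the actual pretraining pipeline samples many spans over SentencePiece tokens.

```python
def toy_span_corruption(text: str, start: int, length: int):
    """Mask one contiguous span of whitespace tokens with T5-style sentinel tokens."""
    tokens = text.split()
    span = tokens[start:start + length]
    # Encoder input: the span is replaced by the first sentinel token.
    source = " ".join(tokens[:start] + ["<extra_id_0>"] + tokens[start + length:])
    # Decoder target: sentinel, the original span, then a closing sentinel.
    target = " ".join(["<extra_id_0>"] + span + ["<extra_id_1>"])
    return source, target

src, tgt = toy_span_corruption("Thank you for inviting me to your party last week", start=3, length=2)
print(src)  # Thank you for <extra_id_0> to your party last week
print(tgt)  # <extra_id_0> inviting me <extra_id_1>
```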
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short inference sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
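Once the checkpoint has been fine-tuned with one of the examples above, it can be used for inference via `generate`. The snippet below is a minimal sketch; `path/to/finetuned-checkpoint` is a placeholder for wherever your fine-tuned weights were saved, and the `summarize:` prompt assumes a summarization-style fine-tuning.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Placeholder path: point this at your own fine-tuned weights
# (the raw pretrained checkpoint has not been trained on any downstream task).
checkpoint = "path/to/finetuned-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

inputs = tokenizer("summarize: " + "Long article text goes here ...", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```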
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh8-nl32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH8-NL32 (Deep-Narrow version)
=================================================
T5-Efficient-LARGE-NH8-NL32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nh8-nl32 - is of model type Large with the following variations:
* nh is 8
* nl is 32
It has 771.34 million parameters and thus requires *ca.* 3085.35 MB of memory in full precision (*fp32*)
or 1542.68 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NH8 (Deep-Narrow version)
T5-Efficient-LARGE-NH8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nh8** - is of model type **Large** with the following variations:
- **nh** is **8**
It has **586.73** million parameters and thus requires *ca.* **2346.92 MB** of memory in full precision (*fp32*)
or **1173.46 MB** of memory in half precision (*fp16* or *bf16*).
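If memory is a concern, the checkpoint can be loaded directly in half precision. The snippet below is a minimal sketch (it assumes `transformers` and `torch` are installed; `torch_dtype` controls the precision of the loaded weights, and the quoted figures cover the weights only):

```python
# Minimal sketch: load the checkpoint in half precision to roughly halve the
# weight memory quoted above (weights only; activations and optimizer state
# are not included in those figures).
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nh8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.float16)

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.2f} million parameters")  # should be close to 586.73
```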
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
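For illustration, the span-corruption objective replaces contiguous spans of the input with sentinel tokens and asks the model to reconstruct the dropped spans. A simplified, hand-constructed example of the input/target format:

```python
# Simplified illustration of span-based MLM (span corruption): spans of the
# input are replaced by sentinel tokens, and the target lists the dropped
# spans in order, each introduced by its sentinel. The span choices here are
# hand-picked purely for illustration.
original_text   = "Thank you for inviting me to your party last week ."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
target_text     = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```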
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal sketch of a single fine-tuning step is also given after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
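As a rough orientation (not a replacement for the example scripts linked above), a single seq2seq fine-tuning step with this checkpoint could look as follows; the task prefix, example texts and hyperparameters are placeholders:

```python
# Minimal sketch of one seq2seq fine-tuning step (illustrative only).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nh8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)   # cross-entropy loss over the target
outputs.loss.backward()
print(float(outputs.loss))
```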
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nh8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NH8 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NH8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nh8 - is of model type Large with the following variations:
* nh is 8
It has 586.73 million parameters and thus requires *ca.* 2346.92 MB of memory in full precision (*fp32*)
or 1173.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL10 (Deep-Narrow version)
T5-Efficient-LARGE-NL10 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl10** - is of model type **Large** with the following variations:
- **nl** is **10**
It has **326.58** million parameters and thus requires *ca.* **1306.31 MB** of memory in full precision (*fp32*)
or **653.16 MB** of memory in half precision (*fp16* or *bf16*).
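The depth reduction can also be verified directly from the model configuration; a quick, network-dependent sketch (assumes the `transformers` library and access to the Hugging Face Hub):

```python
# Quick sanity check of the nl = 10 variation via the model config.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-large-nl10")
print(config.num_layers, config.num_decoder_layers)   # expected: 10 10
print(config.d_model, config.num_heads, config.d_ff)  # Large dims: 1024 16 4096
```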
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a bare-bones training-loop sketch is also given after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
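For orientation only, a bare-bones fine-tuning loop might look like the sketch below; the toy data, learning rate and single-example batching are placeholders, and the linked example scripts above remain the recommended starting point:

```python
# Bare-bones fine-tuning loop sketch (illustrative; no batching, evaluation,
# scheduling or checkpointing -- use the example scripts above for real runs).
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nl10"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

pairs = [("summarize: a very long article ...", "a short summary")]  # toy data

model.train()
for source, target in pairs:
    batch = tokenizer(source, return_tensors="pt")
    batch["labels"] = tokenizer(target, return_tensors="pt").input_ids
    loss = model(**batch).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```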
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl10
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL10 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL10 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl10 - is of model type Large with the following variations:
* nl is 10
It has 326.58 million parameters and thus requires *ca.* 1306.31 MB of memory in full precision (*fp32*)
or 653.16 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
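The snippet below is a minimal sketch of what such a fine-tuning script starts from; it assumes the checkpoint is published under the id 'google/t5-efficient-large-nl10' (in line with the naming scheme of this model family) and shows a single toy training step rather than a full training loop:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Assumed checkpoint id, following the naming scheme of this model family.
model_id = "google/t5-efficient-large-nl10"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# One toy training step: T5 is an encoder-decoder model, so inputs and labels
# are tokenized separately and the loss is computed over the decoder targets.
inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog near the river bank.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()  # a fine-tuning loop would follow this with an optimizer step
```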
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL12 (Deep-Narrow version)
T5-Efficient-LARGE-NL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl12** - is of model type **Large** with the following variations:
- **nl** is **12**
It has **385.31** million parameters and thus requires *ca.* **1541.25 MB** of memory in full precision (*fp32*)
or **770.63 MB** of memory in half precision (*fp16* or *bf16*).
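These memory figures follow directly from the parameter count, assuming 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16* (with 1 MB counted as 10^6 bytes, which is how the numbers above appear to be computed). A quick back-of-the-envelope check:

```python
params = 385.31e6              # parameter count reported above

fp32_mb = params * 4 / 1e6     # 4 bytes per parameter in full precision
fp16_mb = params * 2 / 1e6     # 2 bytes per parameter in half precision

print(f"{fp32_mb:.2f} MB (fp32), {fp16_mb:.2f} MB (fp16/bf16)")
# -> 1541.24 MB (fp32), 770.62 MB (fp16/bf16), matching the figures above up to rounding
```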
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
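In span-based MLM (span corruption), contiguous spans of the input are replaced by sentinel tokens and the model is trained to generate exactly the dropped-out spans. The strings below illustrate the format schematically (following the example in the original T5 paper, not the actual C4 preprocessing code):

```python
# Each dropped span is replaced by a sentinel token (<extra_id_0>, <extra_id_1>, ...);
# the target lists the removed spans, each introduced by its sentinel, and the final
# sentinel marks the end of the target sequence.
original_text   = "Thank you for inviting me to your party last week ."

corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
target          = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```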
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples showing how to fine-tune the model (a condensed Trainer-based sketch follows the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
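For orientation, the PyTorch summarization example above roughly boils down to the following condensed sketch; the toy dataset stands in for a real corpus, and details such as the `text_target` tokenizer argument depend on the installed Transformers version:

```python
from datasets import Dataset
from transformers import (AutoTokenizer, DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments, T5ForConditionalGeneration)

model_id = "google/t5-efficient-large-nl12"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Tiny toy dataset standing in for a real summarization corpus.
raw = Dataset.from_dict({
    "document": ["The quick brown fox jumps over the lazy dog near the river bank."],
    "summary": ["A fox jumps over a dog."],
})

def preprocess(batch):
    model_inputs = tokenizer(
        ["summarize: " + doc for doc in batch["document"]],
        max_length=512, truncation=True,
    )
    labels = tokenizer(text_target=batch["summary"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="t5-efficient-large-nl12-summarization",
        per_device_train_batch_size=1,
        num_train_epochs=1,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```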
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL12 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl12 - is of model type Large with the following variations:
* nl is 12
It has 385.31 million parameters and thus requires *ca.* 1541.25 MB of memory in full precision (*fp32*)
or 770.63 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples showing how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL16 (Deep-Narrow version)
T5-Efficient-LARGE-NL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl16** - is of model type **Large** with the following variations:
- **nl** is **16**
It has **502.78** million parameters and thus requires *ca.* **2011.13 MB** of memory in full precision (*fp32*)
or **1005.57 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples showing how to fine-tune the model (a short half-precision loading sketch follows the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
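As a minimal sketch of loading this checkpoint before fine-tuning, the half-precision footprint quoted above can be exploited directly via the `torch_dtype` argument (assuming a recent PyTorch/Transformers version); the checkpoint id below is the one this repository is published under:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nl16"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Loading in bfloat16 needs roughly half the memory of the fp32 figure quoted above.
model = T5ForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.bfloat16)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f}M parameters")  # should be close to the ~503M reported above
```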
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL16 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl16 - is of model type Large with the following variations:
* nl is 16
It has 502.78 million parameters and thus requires *ca.* 2011.13 MB of memory in full precision (*fp32*)
or 1005.57 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl* value, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples showing how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL2 (Deep-Narrow version)
T5-Efficient-LARGE-NL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl2** - is of model type **Large** with the following variations:
- **nl** is **2**
It has **91.64** million parameters and thus requires *ca.* **366.55 MB** of memory in full precision (*fp32*)
or **183.28 MB** of memory in half precision (*fp16* or *bf16*).
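These figures follow directly from the parameter count. As a quick sanity check (an illustrative sketch, not part of the original card; it assumes 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and 1 MB = 10^6 bytes):

```python
# Back-of-the-envelope check of the memory figures quoted above.
n_params = 91.64e6                                # parameters of t5-efficient-large-nl2
print(f"fp32: {n_params * 4 / 1e6:.2f} MB")       # ~366.6 MB (4 bytes per parameter)
print(f"fp16/bf16: {n_params * 2 / 1e6:.2f} MB")  # ~183.3 MB (2 bytes per parameter)
```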
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
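To illustrate the objective (an assumed example in the style of the original T5 paper, not taken from this card), contiguous spans of the input are replaced by sentinel tokens and the target reconstructs the dropped spans:

```python
# Illustrative span-corruption pair (assumed format with T5 sentinel tokens).
original_text   = "Thank you for inviting me to your party last week."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week."
target          = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```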
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch follows the example lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
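As a minimal starting point before running one of the scripts above (an illustrative sketch, assuming the standard `transformers` seq2seq classes apply to this checkpoint; the data pipeline, optimizer and trainer are omitted):

```python
# Load the pretrained checkpoint and compute a seq2seq loss on one text-to-text pair.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nl2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("summarize: The quick brown fox jumped over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumped over a dog.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # cross-entropy loss to minimize during fine-tuning
```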
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL2 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl2 - is of model type Large with the following variations:
* nl is 2
It has 91.64 million parameters and thus requires *ca.* 366.55 MB of memory in full precision (*fp32*)
or 183.28 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL20 (Deep-Narrow version)
T5-Efficient-LARGE-NL20 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl20** - is of model type **Large** with the following variations:
- **nl** is **20**
It has **620.25** million parameters and thus requires *ca.* **2481.02 MB** of memory in full precision (*fp32*)
or **1240.51 MB** of memory in half precision (*fp16* or *bf16*).
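The memory figures follow directly from the parameter count; a quick check (an illustrative sketch, not part of the original card, assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and 1 MB = 10^6 bytes):

```python
# Back-of-the-envelope check of the memory figures quoted above.
n_params = 620.25e6                               # parameters of t5-efficient-large-nl20
print(f"fp32: {n_params * 4 / 1e6:.2f} MB")       # ~2481.0 MB (4 bytes per parameter)
print(f"fp16/bf16: {n_params * 2 / 1e6:.2f} MB")  # ~1240.5 MB (2 bytes per parameter)
```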
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (see the loading sketch after these lists):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
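The sketch below is illustrative only; it assumes the standard `transformers` seq2seq classes work with this checkpoint and omits the data pipeline, optimizer and trainer. It shows how the checkpoint would typically be loaded before fine-tuning:

```python
# Load the pretrained checkpoint and run one text-to-text forward pass with labels.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-large-nl20"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("summarize: Transformers can be scaled in depth or width.", return_tensors="pt")
labels = tokenizer("Depth or width scaling of transformers.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # seq2seq cross-entropy minimized during fine-tuning
```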
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl20
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL20 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL20 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl20 - is of model type Large with the following variations:
* nl is 20
It has 620.25 million parameters and thus requires *ca.* 2481.02 MB of memory in full precision (*fp32*)
or 1240.51 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL32 (Deep-Narrow version)
T5-Efficient-LARGE-NL32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
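For intuition only, the following minimal PyTorch sketch (a toy stand-in, not the actual T5 implementation) shows what stacking *nl* transformer blocks sequentially means; the hyperparameter names and default values are illustrative:

```python
import torch
import torch.nn as nn

class DeepNarrowEncoderSketch(nn.Module):
    """Toy illustration: depth (nl) = number of sequentially stacked transformer blocks."""
    def __init__(self, nl: int = 32, d_model: int = 1024, nh: int = 16, ff: int = 4096):
        super().__init__()
        block = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nh, dim_feedforward=ff, batch_first=True
        )
        self.blocks = nn.TransformerEncoder(block, num_layers=nl)

    def forward(self, word_embeddings: torch.Tensor) -> torch.Tensor:
        # word_embeddings: (batch, sequence_length, d_model); each block
        # processes the output of the previous one.
        return self.blocks(word_embeddings)

# Small nl just for a quick shape check.
hidden = DeepNarrowEncoderSketch(nl=4)(torch.randn(1, 8, 1024))
print(hidden.shape)  # torch.Size([1, 8, 1024])
```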
## Details model architecture
This model checkpoint - **t5-efficient-large-nl32** - is of model type **Large** with the following variations:
- **nl** is **32**
It has **972.67** million parameters and thus requires *ca.* **3890.66 MB** of memory in full precision (*fp32*)
or **1945.33 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
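As a quick sanity check of the figures above, the parameter count and memory estimates can be reproduced by loading the checkpoint with `transformers` (assuming an environment with `transformers` and `torch` installed; the memory figures above appear to use 1 MB = 10^6 bytes):

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nl32")

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.2f} M parameters")        # ~972.67 M
print(f"fp32: ~{num_params * 4 / 1e6:.2f} MB")       # 4 bytes per parameter
print(f"fp16/bf16: ~{num_params * 2 / 1e6:.2f} MB")  # 2 bytes per parameter
```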
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
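To make the objective concrete, here is a hand-constructed span-corruption example in the style of the T5 documentation (not the actual pre-training pipeline, which samples spans from C4); it assumes the checkpoint ships the standard T5 tokenizer:

```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration

tokenizer = T5TokenizerFast.from_pretrained("google/t5-efficient-large-nl32")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nl32")

# Corrupted input: dropped spans are replaced by sentinel tokens <extra_id_0>, <extra_id_1>, ...
input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park.", return_tensors="pt").input_ids
# Target: each sentinel token is followed by the span it replaced.
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss
print(float(loss))  # cross-entropy over the reconstructed spans
```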
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a condensed sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
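For orientation, here is a condensed sketch of what a summarization fine-tuning run with this checkpoint can look like; the dataset, task prefix, and hyperparameters are illustrative placeholders, not values taken from the linked examples or the paper:

```python
from datasets import load_dataset
from transformers import (T5TokenizerFast, T5ForConditionalGeneration,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer, Seq2SeqTrainingArguments)

tokenizer = T5TokenizerFast.from_pretrained("google/t5-efficient-large-nl32")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nl32")

# Any summarization dataset with document/summary columns works; "xsum" is just an example.
raw = load_dataset("xsum", split="train[:1%]")

def preprocess(batch):
    model_inputs = tokenizer(["summarize: " + doc for doc in batch["document"]],
                             max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_ds = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="t5-efficient-large-nl32-summarization",
                                  per_device_train_batch_size=4,
                                  learning_rate=1e-4,
                                  num_train_epochs=1),
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```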
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL32 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl32 - is of model type Large with the following variations:
* nl is 32
It has 972.67 million parameters and thus requires *ca.* 3890.66 MB of memory in full precision (*fp32*)
or 1945.33 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL36 (Deep-Narrow version)
T5-Efficient-LARGE-NL36 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl36** - is of model type **Large** with the following variations:
- **nl** is **36**
It has **1090.14** million parameters and thus requires *ca.* **4360.54 MB** of memory in full precision (*fp32*)
or **2180.27 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
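A minimal sketch (assuming `transformers` and `torch` are installed) of loading this checkpoint directly in half precision, which roughly corresponds to the *bf16* memory figure quoted above:

```python
import torch
from transformers import T5ForConditionalGeneration

# Load the weights in bfloat16; loading in the default fp32 needs about twice the memory.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large-nl36", torch_dtype=torch.bfloat16
)

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.2f} M parameters")                   # ~1090.14 M
print(f"~{num_params * 2 / 1e6:.2f} MB at 2 bytes per weight")   # ~2180 MB (1 MB = 10^6 bytes here)
```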
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
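As a hint of what the *Note*s in the list above mean in practice, one common adaptation (an illustrative sketch with an assumed task prefix and label wording, not the adapted example script itself) is to cast classification as text-to-text and train the model to generate the label word:

```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration

tokenizer = T5TokenizerFast.from_pretrained("google/t5-efficient-large-nl36")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large-nl36")

text = "this movie was an absolute delight"
label = "positive"  # classification labels become short target strings

inputs = tokenizer("sst2 sentence: " + text, return_tensors="pt")
label_ids = tokenizer(label, return_tensors="pt").input_ids

# Training step: ordinary seq2seq cross-entropy on the label string.
loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=label_ids).loss

# Inference (meaningful only after fine-tuning): generate and map back to the label set.
pred_ids = model.generate(**inputs, max_new_tokens=4)
print(tokenizer.decode(pred_ids[0], skip_special_tokens=True))
```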
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl36
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL36 (Deep-Narrow version)
=============================================
T5-Efficient-LARGE-NL36 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl36 - is of model type Large with the following variations:
* nl is 36
It has 1090.14 million parameters and thus requires *ca.* 4360.54 MB of memory in full precision (*fp32*)
or 2180.27 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL4 (Deep-Narrow version)
T5-Efficient-LARGE-NL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl4** - is of model type **Large** with the following variations:
- **nl** is **4**
It has **150.37** million parameters and thus requires *ca.* **601.49 MB** of memory in full precision (*fp32*)
or **300.75 MB** of memory in half precision (*fp16* or *bf16*).
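As a quick, back-of-the-envelope check (not part of the original release notes), the memory figures above follow directly from the parameter count at 4 bytes per parameter in fp32 and 2 bytes per parameter in fp16/bf16, using 1 MB = 10^6 bytes:

```python
# Sanity check of the quoted memory figures (1 MB = 10^6 bytes).
params = 150.37e6            # reported parameter count for t5-efficient-large-nl4
fp32_mb = params * 4 / 1e6   # 4 bytes/param  -> ~601.5 MB
half_mb = params * 2 / 1e6   # 2 bytes/param  -> ~300.7 MB
print(f"{fp32_mb:.2f} MB (fp32), {half_mb:.2f} MB (fp16/bf16)")
```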
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
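For illustration only (this toy example is an assumption, not taken from the actual training data), span-based MLM replaces random spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans in order:

```python
# Hypothetical example of T5-style span corruption used during pre-training.
original = "Thank you for inviting me to your party last week ."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
reconstruction_target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```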
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal training-step sketch is shown after this list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
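The sketch below shows a single supervised training step with the model id listed for this card (`google/t5-efficient-large-nl4`); it is only meant to convey the expected input/label format under these assumptions and does not replace the example scripts above.

```python
# Minimal single-step fine-tuning sketch; real training should use the example scripts.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-large-nl4"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy seq2seq pair; a real run would iterate over a task-specific dataset.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # cross-entropy over the target tokens
loss.backward()
optimizer.step()
optimizer.zero_grad()
```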
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL4 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl4 - is of model type Large with the following variations:
* nl is 4
It has 150.37 million parameters and thus requires *ca.* 601.49 MB of memory in full precision (*fp32*)
or 300.75 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE-NL8 (Deep-Narrow version)
T5-Efficient-LARGE-NL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large-nl8** - is of model type **Large** with the following variations:
- **nl** is **8**
It has **267.84** million parameters and thus requires *ca.* **1071.37 MB** of memory in full precision (*fp32*)
or **535.69 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal half-precision loading sketch is shown after this list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
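Independently of the chosen fine-tuning script, the checkpoint can be loaded in half precision to roughly halve the fp32 memory footprint quoted above; the sketch below is a hedged example using the model id listed for this card (`google/t5-efficient-large-nl8`), not an official recipe.

```python
# Hedged sketch: load the pretrained-only checkpoint in bfloat16 before fine-tuning.
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large-nl8",
    torch_dtype=torch.bfloat16,   # ~535.69 MB of weights instead of ~1071.37 MB
)
print(f"{sum(p.numel() for p in model.parameters()) / 1e6:.2f}M parameters")  # ~267.84M
```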
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large-nl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE-NL8 (Deep-Narrow version)
============================================
T5-Efficient-LARGE-NL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large-nl8 - is of model type Large with the following variations:
* nl is 8
It has 267.84 million parameters and thus requires *ca.* 1071.37 MB of memory in full precision (*fp32*)
or 535.69 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-LARGE (Deep-Narrow version)
T5-Efficient-LARGE is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-large** - is of model type **Large** with no variations.
It has **737.72** million parameters and thus requires *ca.* **2950.9 MB** of memory in full precision (*fp32*)
or **1475.45 MB** of memory in half precision (*fp16* or *bf16*).
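The half-precision figure can be realized directly at load time, for example by passing a `torch_dtype` to `from_pretrained` (a sketch assuming a recent `transformers`/`torch` installation):

```python
import torch
from transformers import T5ForConditionalGeneration

# Load the weights in half precision to roughly halve the memory footprint.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-large", torch_dtype=torch.float16
)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f}M parameters")              # ~737.72M
print(f"~{n_params * 2 / 1e6:.0f} MB of fp16 weights")  # ~1475 MB
```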
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
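For intuition, the span-based MLM (span-corruption) objective drops contiguous spans from the input and replaces each span with a sentinel token; the target is the sequence of dropped spans, each preceded by its sentinel. A small illustration of this input/target format using T5's sentinel tokens (the sentence is made up; the actual pre-training data is C4, as stated above):

```python
from transformers import T5TokenizerFast

tokenizer = T5TokenizerFast.from_pretrained("google/t5-efficient-large")

# Corrupted input: dropped spans replaced by sentinel tokens.
corrupted = "The <extra_id_0> walks in <extra_id_1> park"
# Target: the dropped spans, each introduced by its sentinel token.
target = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(corrupted, return_tensors="pt").input_ids
labels = tokenizer(target, return_tensors="pt").input_ids
```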
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal sketch for adapting text classification to the text-to-text format is given after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
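The adaptation mentioned in the text-classification notes above usually means casting the task as text-to-text: the class label becomes a short target string that the encoder-decoder model learns to generate. A minimal sketch under that assumption (the task prefix, input sentence and label string are illustrative only):

```python
from transformers import T5ForConditionalGeneration, T5TokenizerFast

ckpt = "google/t5-efficient-large"
tokenizer = T5TokenizerFast.from_pretrained(ckpt)
model = T5ForConditionalGeneration.from_pretrained(ckpt)

# Training side: the label is a short target string, not a class index.
enc = tokenizer(
    "classify sentiment: this movie was a complete waste of time",
    return_tensors="pt", truncation=True, max_length=512,
)
labels = tokenizer("negative", return_tensors="pt").input_ids
loss = model(**enc, labels=labels).loss  # standard seq2seq cross-entropy

# Prediction side (after fine-tuning): decode the generated string back
# into a class label.
pred_ids = model.generate(**enc, max_new_tokens=3)
pred_label = tokenizer.decode(pred_ids[0], skip_special_tokens=True)
```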
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-large
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-LARGE (Deep-Narrow version)
========================================
T5-Efficient-LARGE is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-large - is of model type Large with no variations.
It has 737.72 million parameters and thus requires *ca.* 2950.9 MB of memory in full precision (*fp32*)
or 1475.45 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-MINI-NL12 (Deep-Narrow version)
T5-Efficient-MINI-NL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-mini-nl12** - is of model type **Mini** with the following variations:
- **nl** is **12**
It has **69.01** million parameters and thus requires *ca.* **276.05 MB** of memory in full precision (*fp32*)
or **138.03 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
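These abbreviations map directly onto fields of the checkpoint's configuration, so the dimensions listed above can be verified programmatically (field names are those used by `T5Config` in `transformers`):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-mini-nl12")

print(config.num_layers)          # nl / el: encoder depth (12 for this checkpoint)
print(config.num_decoder_layers)  # dl: decoder depth
print(config.d_model)             # dm: embedding dimension
print(config.d_kv)                # kv: key/value projection dimension
print(config.num_heads)           # nh: number of attention heads
print(config.d_ff)                # ff: feed-forward dimension
```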
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal summarization fine-tuning sketch is given after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
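A minimal summarization fine-tuning sketch with `Seq2SeqTrainer` is given below. It assumes a recent `transformers`/`datasets` installation and uses a tiny in-memory dataset purely for illustration; the column names, hyperparameters and output directory are assumptions, and the linked example scripts remain the reference implementation.

```python
from datasets import Dataset
from transformers import (AutoTokenizer, DataCollatorForSeq2Seq,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments,
                          T5ForConditionalGeneration)

ckpt = "google/t5-efficient-mini-nl12"
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = T5ForConditionalGeneration.from_pretrained(ckpt)

# Tiny illustrative dataset; replace with a real summarization corpus.
raw = Dataset.from_dict({
    "document": ["A long article about transformer efficiency ..."],
    "summary": ["Deep-narrow transformers are efficient."],
})

def preprocess(batch):
    model_inputs = tokenizer(
        ["summarize: " + doc for doc in batch["document"]],
        max_length=512, truncation=True,
    )
    # T5 uses the same tokenizer for targets, so summaries can be tokenized directly.
    model_inputs["labels"] = tokenizer(
        batch["summary"], max_length=64, truncation=True
    )["input_ids"]
    return model_inputs

train_ds = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-efficient-mini-nl12-summarization",  # assumed output path
    per_device_train_batch_size=8,
    learning_rate=1e-4,
    num_train_epochs=3,
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```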
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-mini-nl12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-MINI-NL12 (Deep-Narrow version)
============================================
T5-Efficient-MINI-NL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-mini-nl12 - is of model type Mini with the following variations:
* nl is 12
It has 69.01 million parameters and thus requires *ca.* 276.05 MB of memory in full precision (*fp32*)
or 138.03 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-MINI-NL24 (Deep-Narrow version)
T5-Efficient-MINI-NL24 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-mini-nl24** - is of model type **Mini** with the following variations:
- **nl** is **24**
It has **125.69** million parameters and thus requires *ca.* **502.75 MB** of memory in full precision (*fp32*)
or **251.37 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
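The deep-narrow variation can also be read back from the checkpoint's configuration. A minimal sketch (it downloads the config from the Hub; the values in the comments are the ones expected from the *Mini* row of the table above with *nl* scaled to 24, so they are worth verifying against the actual released config):

```python
from transformers import T5Config

config = T5Config.from_pretrained("google/t5-efficient-mini-nl24")

print(config.num_layers)          # el -> expected 24
print(config.num_decoder_layers)  # dl -> expected 24
print(config.d_model)             # dm -> expected 384
print(config.d_ff)                # ff -> expected 1536
print(config.d_kv)                # kv -> expected 32
print(config.num_heads)           # nh -> expected 8
```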
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch follows the examples below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
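All of the linked scripts ultimately load this checkpoint as a standard seq2seq model. A minimal loading sketch (the example strings are made up, and a real fine-tuning run would iterate over a dataset as in the scripts above):

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "google/t5-efficient-mini-nl24"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# T5 is text-to-text: both inputs and labels are plain strings.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss  # backpropagate this in a training loop
```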
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-mini-nl24
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-MINI-NL24 (Deep-Narrow version)
============================================
T5-Efficient-MINI-NL24 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-mini-nl24 - is of model type Mini with the following variations:
* nl is 24
It has 125.69 million parameters and thus requires *ca.* 502.75 MB of memory in full precision (*fp32*)
or 251.37 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
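A minimal sketch of how an equivalent deep-narrow configuration could be built from scratch using the *Mini* hyper-parameters with *nl* = 24. Note the gated-GELU feed-forward is an assumption (the parameter count only comes close to the quoted ~125.69 million if the checkpoint follows the T5.1.1 recipe), so the exact released configuration should be checked:

```python
from transformers import T5Config, T5ForConditionalGeneration

# Mini hyper-parameters (dm=384, ff=1536, kv=32, nh=8) scaled to 24 blocks.
config = T5Config(
    d_model=384,
    d_ff=1536,
    d_kv=32,
    num_heads=8,
    num_layers=24,                    # encoder depth
    num_decoder_layers=24,            # decoder depth
    feed_forward_proj="gated-gelu",   # assumption: T5.1.1-style feed-forward
)
model = T5ForConditionalGeneration(config)

print(sum(p.numel() for p in model.parameters()) / 1e6, "M parameters")
```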
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-MINI-NL6 (Deep-Narrow version)
T5-Efficient-MINI-NL6 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-mini-nl6** - is of model type **Mini** with the following variations:
- **nl** is **6**
It has **40.68** million parameters and thus requires *ca.* **162.7 MB** of memory in full precision (*fp32*)
or **81.35 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
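The half-precision footprint quoted above (~81.35 MB) can be realized simply by loading the weights in *bf16*. A minimal sketch (it downloads the checkpoint and requires `torch`; MB means 10**6 bytes, as in the figures above):

```python
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-mini-nl6", torch_dtype=torch.bfloat16
)

# Weight-only memory of the loaded model.
n_bytes = sum(p.numel() * p.element_size() for p in model.parameters())
print(f"~{n_bytes / 1e6:.2f} MB of weights in bf16")
```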
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model (the sketch after this list illustrates the idea).
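The "slight adaptation" mentioned for the text-classification examples mostly amounts to casting class labels as short target strings, since this is an encoder-decoder (text-to-text) model. A minimal sketch of the idea (the task prefix, example sentence and label strings are made up for illustration):

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "google/t5-efficient-mini-nl6"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Classification as text-to-text: integer class ids become short target strings.
label_map = {0: "negative", 1: "positive"}
text, class_id = "the movie was a delight from start to finish", 1

enc = tokenizer("classify sentiment: " + text, return_tensors="pt")
labels = tokenizer(label_map[class_id], return_tensors="pt").input_ids

loss = model(**enc, labels=labels).loss  # training signal

# At inference time the predicted class is simply the generated string.
pred = tokenizer.decode(model.generate(**enc, max_new_tokens=3)[0],
                        skip_special_tokens=True)
print(pred)
```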
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-mini-nl6
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-MINI-NL6 (Deep-Narrow version)
===========================================
T5-Efficient-MINI-NL6 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-mini-nl6 - is of model type Mini with the following variations:
* nl is 6
It has 40.68 million parameters and thus requires *ca.* 162.7 MB of memory in full precision (*fp32*)
or 81.35 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal manual training step is sketched after the examples below):
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
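As a complement to the linked scripts, a minimal manual training step is sketched below (the input/target pair is made up; a real run would loop over batches of an actual dataset and tune the learning rate):

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "google/t5-efficient-mini-nl6"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One toy optimization step on a single (input, target) pair.
batch = tokenizer(["summarize: a very long article about transformers ..."],
                  return_tensors="pt")
labels = tokenizer(["a short summary"], return_tensors="pt").input_ids

model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```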
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-MINI-NL8 (Deep-Narrow version)
T5-Efficient-MINI-NL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-mini-nl8** - is of model type **Mini** with the following variations:
- **nl** is **8**
It has **50.12** million parameters and thus requires *ca.* **200.49 MB** of memory in full precision (*fp32*)
or **100.24 MB** of memory in half precision (*fp16* or *bf16*).
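These memory figures follow directly from the parameter count (4 bytes per parameter in full precision, 2 bytes in half precision), as the small sanity check below illustrates.

```python
# Sanity check of the quoted memory footprint from the parameter count.
n_params = 50.12e6           # parameters of t5-efficient-mini-nl8

print(f"fp32: {n_params * 4 / 1e6:.2f} MB")  # ~200.48 MB (quoted: 200.49 MB, rounding)
print(f"fp16: {n_params * 2 / 1e6:.2f} MB")  # ~100.24 MB
```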
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
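Putting the tables above together, the *mini-nl8* variant corresponds roughly to the following Transformers configuration. This is only a sketch: the vocabulary size is the standard T5 value and details such as the feed-forward variant or embedding tying are not listed in the tables, so the authoritative values are those stored with the hosted checkpoint.

```python
# Approximate configuration of the Mini architecture with nl = 8,
# derived from the tables above (vocab size is an assumption).
from transformers import T5Config

config = T5Config(
    vocab_size=32128,       # assumption: standard T5 vocabulary
    d_model=384,            # dm
    d_ff=1536,              # ff
    d_kv=32,                # kv
    num_heads=8,            # nh
    num_layers=8,           # nl / el (encoder depth)
    num_decoder_layers=8,   # dl (decoder depth)
)
print(config)
```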
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
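To make the objective concrete, the sketch below illustrates the general idea of span corruption in the text-to-text format: random spans of the input are replaced by sentinel tokens, and the target reproduces the dropped spans. The corruption rate and span length used here are illustrative assumptions, not the exact pre-training hyperparameters.

```python
# Illustrative sketch of span-based MLM (span corruption); the corruption rate
# and span length are assumptions, not the actual pre-training settings.
import random

def corrupt_spans(tokens, corruption_rate=0.15, mean_span_length=3, seed=0):
    """Return an (input, target) pair in the T5 text-to-text format."""
    rng = random.Random(seed)
    n_to_mask = max(1, int(len(tokens) * corruption_rate))
    masked = set()
    while len(masked) < n_to_mask:
        start = rng.randrange(len(tokens))
        for i in range(start, min(len(tokens), start + mean_span_length)):
            masked.add(i)
    inputs, targets, sentinel = [], [], 0
    i = 0
    while i < len(tokens):
        if i in masked:
            # One sentinel per contiguous masked span; the target lists the
            # dropped tokens after their sentinel. (T5 also appends a final
            # sentinel to the target, omitted here for brevity.)
            inputs.append(f"<extra_id_{sentinel}>")
            targets.append(f"<extra_id_{sentinel}>")
            while i < len(tokens) and i in masked:
                targets.append(tokens[i])
                i += 1
            sentinel += 1
        else:
            inputs.append(tokens[i])
            i += 1
    return " ".join(inputs), " ".join(targets)

print(corrupt_spans("the quick brown fox jumps over the lazy dog".split()))
```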
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model (a minimal sketch of this adaptation follows the list).
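The adaptation mentioned in the classification notes above mainly consists of casting class labels as target text, since an encoder-decoder model predicts a sequence rather than a class index. A minimal sketch of the idea is shown below; the prompt format and label verbalizations are illustrative assumptions.

```python
# Minimal sketch: text classification cast as text-to-text for this
# encoder-decoder checkpoint (prompt and label words are illustrative).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-mini-nl8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

examples = [
    ("the movie was wonderful", "positive"),
    ("a complete waste of time", "negative"),
]

inputs = tokenizer(
    ["classify sentiment: " + text for text, _ in examples],
    padding=True, return_tensors="pt",
)
labels = tokenizer([label for _, label in examples], padding=True, return_tensors="pt").input_ids
labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss

loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
```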
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-mini-nl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-MINI-NL8 (Deep-Narrow version)
===========================================
T5-Efficient-MINI-NL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-mini-nl8 - is of model type Mini with the following variations:
* nl is 8
It has 50.12 million parameters and thus requires *ca.* 200.49 MB of memory in full precision (*fp32*)
or 100.24 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-MINI (Deep-Narrow version)
T5-Efficient-MINI is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-mini** - is of model type **Mini** with no variations.
It has **31.23** million parameters and thus requires *ca.* **124.92 MB** of memory in full precision (*fp32*)
or **62.46 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal training-step sketch follows the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
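As a complement to the linked scripts, the sketch below shows one bare-bones fine-tuning step for this checkpoint, framed here as summarization; the example data and hyperparameters are illustrative assumptions, and real training should follow one of the example scripts above.

```python
# One bare-bones fine-tuning step (illustrative data and hyperparameters).
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-mini"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

document = "summarize: The quick brown fox jumped over the lazy dog near the river bank."
summary = "A fox jumped over a dog."

batch = tokenizer(document, return_tensors="pt")
labels = tokenizer(summary, return_tensors="pt").input_ids

model.train()
loss = model(**batch, labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(f"loss: {loss.item():.3f}")
```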
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-mini
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-MINI (Deep-Narrow version)
=======================================
T5-Efficient-MINI is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-mini - is of model type Mini with no variations.
It has 31.23 million parameters and thus requires *ca.* 124.92 MB of memory in full precision (*fp32*)
or 62.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DL12 (Deep-Narrow version)
T5-Efficient-SMALL-DL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dl12** - is of model type **Small** with the following variations:
- **dl** is **12**
It has **85.7** million parameters and thus requires *ca.* **342.82 MB** of memory in full precision (*fp32*)
or **171.41 MB** of memory in half precision (*fp16* or *bf16*).
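As a rough sanity check, these memory figures follow directly from the parameter count under the usual assumption of 4 bytes per parameter in *fp32* and 2 bytes per parameter in *fp16*/*bf16* (with 1 MB taken as 10^6 bytes). The snippet below is only a back-of-the-envelope illustration; it ignores activations, optimizer states and framework overhead.

```python
# Back-of-the-envelope weight-memory estimate for t5-efficient-small-dl12.
# Assumes 4 bytes/parameter (fp32) and 2 bytes/parameter (fp16/bf16);
# activations, optimizer states and framework overhead are not included.
num_params = 85.7e6  # parameter count quoted above

fp32_mb = num_params * 4 / 1e6   # ~342.8 MB in full precision
fp16_mb = num_params * 2 / 1e6   # ~171.4 MB in half precision

print(f"fp32: {fp32_mb:.1f} MB, fp16/bf16: {fp16_mb:.1f} MB")
```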
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
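In span-based MLM, contiguous spans of the input are replaced by sentinel tokens and the target enumerates the dropped spans. The example below is only an illustrative sketch of that input/target format using the standard T5 sentinel tokens; it is not the actual pre-training pipeline used for this checkpoint.

```python
from transformers import AutoTokenizer

# Illustrative sketch of T5-style span corruption (not the original
# pre-training code): corrupted spans in the input become sentinel
# tokens, and the target lists the removed spans in order.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dl12")

input_text = "The <extra_id_0> walks in <extra_id_1> park"
target_text = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

model_inputs = tokenizer(input_text, return_tensors="pt")
labels = tokenizer(target_text, return_tensors="pt").input_ids
print(model_inputs.input_ids.shape, labels.shape)
```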
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
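Whichever example script you choose, the checkpoint itself can be loaded with the standard `transformers` classes. The minimal sketch below only illustrates loading the model and running a single generation step; since this is a pretrained-only checkpoint, the output will not be useful before fine-tuning.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Minimal loading/generation sketch for the pretrained-only checkpoint.
# Outputs are not expected to be meaningful until the model is fine-tuned.
model_name = "google/t5-efficient-small-dl12"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

text = "summarize: T5 casts every NLP problem as a text-to-text task."
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```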
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dl12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DL12 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-DL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dl12 - is of model type Small with the following variations:
* dl is 12
It has 85.7 million parameters and thus requires *ca.* 342.82 MB of memory in full precision (*fp32*)
or 171.41 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper, Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers, to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DL16 (Deep-Narrow version)
T5-Efficient-SMALL-DL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dl16** - is of model type **Small** with the following variations:
- **dl** is **16**
It has **102.49** million parameters and thus requires *ca.* **409.97 MB** of memory in full precision (*fp32*)
or **204.99 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model (a minimal sketch of such an adaptation follows this list).
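The adaptation mentioned in the classification notes above usually amounts to verbalizing the label as a short target string so that the task fits the encoder-decoder interface. The sketch below illustrates that idea with made-up example data; the input format and label strings are hypothetical and not taken from the linked training scripts.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Illustrative sketch (not from the linked examples): text classification
# cast as text-to-text by mapping each label to a short target string.
model_name = "google/t5-efficient-small-dl16"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

text = "sst2 sentence: the movie was surprisingly good"  # hypothetical input format
label = "positive"                                       # hypothetical label verbalization

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label, return_tensors="pt").input_ids

# The usual seq2seq cross-entropy loss is computed against the label tokens.
loss = model(**inputs, labels=labels).loss
print(float(loss))
```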
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dl16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DL16 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-DL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dl16 - is of model type Small with the following variations:
* dl is 16
It has 102.49 million parameters and thus requires *ca.* 409.97 MB of memory in full precision (*fp32*)
or 204.99 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper, Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers, to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DL2 (Deep-Narrow version)
T5-Efficient-SMALL-DL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dl2** - is of model type **Small** with the following variations:
- **dl** is **2**
It has **43.73** million parameters and thus requires *ca.* **174.93 MB** of memory in full precision (*fp32*)
or **87.46 MB** of memory in half precision (*fp16* or *bf16*).
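The memory figures follow directly from the parameter count (4 bytes per parameter in full precision, 2 bytes in half precision). As a minimal sketch, assuming `transformers` and `torch` are installed and the checkpoint can be downloaded, both numbers can be re-derived as follows:

```python
# Minimal sketch (not part of the official examples): load the checkpoint and
# re-derive the parameter count and memory estimates quoted above.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dl2")

num_params = sum(p.numel() for p in model.parameters())
print(f"parameters:  {num_params / 1e6:.2f}M")        # expected: ~43.73M
print(f"fp32 memory: {num_params * 4 / 1e6:.2f} MB")  # 4 bytes per parameter
print(f"fp16 memory: {num_params * 2 / 1e6:.2f} MB")  # 2 bytes per parameter
```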
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
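As an illustration of what this objective looks like at the data level, random spans of the input are replaced by sentinel tokens and the target reconstructs the dropped spans. A minimal sketch (the sentence below is a made-up example, not drawn from C4; it assumes the tokenizer files ship with the checkpoint, as for other T5 models):

```python
# Illustrative span-corruption (MLM) example in the T5 text-to-text format.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dl2")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dl2")

input_text  = "The <extra_id_0> walks in <extra_id_1> park"           # corrupted input
target_text = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"   # dropped spans

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
labels = tokenizer(target_text, return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss  # span-MLM loss
```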
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal sketch of a single fine-tuning step is shown after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
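All of the linked recipes ultimately reduce to a standard seq2seq training loop. A minimal sketch of a single fine-tuning step, assuming `transformers` and `torch` are installed (the task prefix and texts below are illustrative, not taken from the linked scripts):

```python
# Minimal, illustrative fine-tuning step for this pretrained-only checkpoint.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dl2")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dl2")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

article = "summarize: The quick brown fox jumped over the lazy dog near the river bank."
summary = "A fox jumped over a dog."

inputs = tokenizer(article, return_tensors="pt", truncation=True)
labels = tokenizer(summary, return_tensors="pt", truncation=True).input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()      # one gradient step of the seq2seq (text-to-text) objective
optimizer.step()
optimizer.zero_grad()
```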
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DL2 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-DL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dl2 - is of model type Small with the following variations:
* dl is 2
It has 43.73 million parameters and thus requires *ca.* 174.93 MB of memory in full precision (*fp32*)
or 87.46 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DL4 (Deep-Narrow version)
T5-Efficient-SMALL-DL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dl4** - is of model type **Small** with the following variations:
- **dl** is **4**
It has **52.13** million parameters and thus requires *ca.* **208.51 MB** of memory in full precision (*fp32*)
or **104.25 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
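In the Transformers configuration these depths are exposed directly, so the *dl* variation can be checked from the checkpoint itself. A minimal sketch, assuming the checkpoint can be downloaded (attribute names follow `T5Config`):

```python
# Minimal sketch: inspect how the *dl* variation shows up in the configuration.
from transformers import T5Config

config = T5Config.from_pretrained("google/t5-efficient-small-dl4")

print(config.num_layers)          # encoder depth (el) -- expected 6 for Small
print(config.num_decoder_layers)  # decoder depth (dl) -- expected 4 for this variant
print(config.d_model, config.d_ff, config.d_kv, config.num_heads)  # compare with dm/ff/kv/nh above
```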
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a sketch of the encoder-decoder adaptation mentioned in the notes is shown after the lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
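The text-classification note above boils down to casting classification as text-to-text: the label is generated as a short target string instead of being predicted by a classification head. A hypothetical sketch of that adaptation (the task prefix and label strings are illustrative, not part of the linked examples):

```python
# Hypothetical encoder-decoder adaptation of a classification example:
# the sentence is the input text and the class label is the target text.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dl4")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dl4")

text = "sst2 sentence: the movie was a delight to watch"   # illustrative input
label = "positive"                                          # label rendered as text

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss   # optimize this during fine-tuning
loss.backward()
```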
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DL4 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-DL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dl4 - is of model type Small with the following variations:
* dl is 4
It has 52.13 million parameters and thus requires *ca.* 208.51 MB of memory in full precision (*fp32*)
or 104.25 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DL8 (Deep-Narrow version)
T5-Efficient-SMALL-DL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
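As a purely illustrative aside, this stacking pattern can be sketched in a few lines of Python; this is not the actual T5 implementation, just the notion of depth that the definition above refers to:

```python
# Illustrative sketch only: "depth" = number of transformer blocks that the
# sequence of word embeddings is passed through, one after the other.
def run_stack(word_embeddings, blocks):
    hidden = word_embeddings
    for block in blocks:          # depth == len(blocks)
        hidden = block(hidden)    # each block processes the whole sequence
    return hidden

# Six identity "blocks" stand in for a depth-6 encoder (the Small default).
identity_blocks = [lambda hidden: hidden] * 6
print(run_stack(["emb_the", "emb_cat"], identity_blocks))
```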
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dl8** - is of model type **Small** with the following variations:
- **dl** is **8**
It has **68.92** million parameters and thus requires *ca.* **275.66 MB** of memory in full precision (*fp32*)
or **137.83 MB** of memory in half precision (*fp16* or *bf16*).
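As a quick sanity check, these memory figures follow directly from the parameter count (roughly 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16*). A minimal sketch using the rounded 68.92M figure quoted above:

```python
# Back-of-the-envelope memory footprint from the parameter count.
# Assumes 4 bytes/parameter for fp32 and 2 bytes/parameter for fp16/bf16.
num_params = 68.92e6  # ca. 68.92 million parameters, as stated above

fp32_mb = num_params * 4 / 1e6  # bytes -> MB
fp16_mb = num_params * 2 / 1e6

print(f"fp32: ~{fp32_mb:.2f} MB, fp16/bf16: ~{fp16_mb:.2f} MB")
# Matches the figures above up to rounding of the parameter count.
```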
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
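To illustrate what the span-based MLM (span corruption) objective looks like at the data level, here is a toy example; it sketches the general idea only and is not the exact preprocessing pipeline used for this checkpoint:

```python
# Toy illustration of T5-style span corruption (span-based MLM).
original = "Thank you for inviting me to your party last week"

# Suppose the spans "for inviting" and "last" are sampled for corruption.
# Each span is replaced by a sentinel token in the encoder input and has to
# be reconstructed, prefixed by the same sentinel, in the decoder target.
encoder_input  = "Thank you <extra_id_0> me to your party <extra_id_1> week"
decoder_target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

print(encoder_input)
print(decoder_target)
```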
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch also follows this list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
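Regardless of which example script you pick, the checkpoint itself is loaded with the standard `transformers` seq2seq classes. A minimal PyTorch sketch (the task prefix and the input/target strings are placeholders, not part of this checkpoint):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-dl8"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# One supervised seq2seq step, here with a placeholder summarization pair.
# Remember: this is a pretrained-only checkpoint, so the loss is only a
# starting point for fine-tuning, not a usable summarizer yet.
inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
print(loss)
```

From there, any standard seq2seq training loop (for instance, the example scripts linked above) can take over.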
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DL8 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-DL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dl8 - is of model type Small with the following variations:
* dl is 8
It has 68.92 million parameters and thus requires *ca.* 275.66 MB of memory in full precision (*fp32*)
or 137.83 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DM1000 (Deep-Narrow version)
T5-Efficient-SMALL-DM1000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dm1000** - is of model type **Small** with the following variations:
- **dm** is **1000**
It has **121.03** million parameters and thus requires *ca.* **484.11 MB** of memory in full precision (*fp32*)
or **242.05 MB** of memory in half precision (*fp16* or *bf16*).
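To actually obtain the half-precision footprint mentioned above, the weights can be loaded directly in *bf16* (or *fp16*). A minimal PyTorch sketch; whether half precision is numerically adequate depends on your downstream setup:

```python
import torch
from transformers import T5ForConditionalGeneration

model_id = "google/t5-efficient-small-dm1000"

# Load the weights in bfloat16 to roughly halve the memory footprint
# compared to the default fp32 weights.
model = T5ForConditionalGeneration.from_pretrained(model_id, torch_dtype=torch.bfloat16)

num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.2f}M parameters, ~{num_params * 2 / 1e6:.2f} MB in bf16")
```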
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a sketch of the encoder-decoder adaptation mentioned in the classification notes follows this list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
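The recurring note about adapting the classification examples boils down to casting classification as text-to-text, so the decoder generates the label as a short string. A minimal sketch of that idea (the dataset prefix and label word are placeholders, not a prescribed format):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-dm1000"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Instead of adding a classification head, the target label is itself a short
# text sequence that the decoder learns to generate during fine-tuning.
text = "sst2 sentence: this movie was surprisingly good"  # placeholder input
label_word = "positive"                                    # placeholder label

batch = tokenizer(text, return_tensors="pt")
labels = tokenizer(label_word, return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss
print(loss)
```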
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dm1000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DM1000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-DM1000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-dm1000 - is of model type Small with the following variations:
* dm is 1000
It has 121.03 million parameters and thus requires *ca.* 484.11 MB of memory in full precision (*fp32*)
or 242.05 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DM128 (Deep-Narrow version)
T5-Efficient-SMALL-DM128 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dm128** - is of model type **Small** with the following variations:
- **dm** is **128**
It has **15.14** million parameters and thus requires *ca.* **60.57 MB** of memory in full precision (*fp32*)
or **30.28 MB** of memory in half precision (*fp16* or *bf16*).
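These figures follow directly from the parameter count at 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16*. The snippet below is only a back-of-the-envelope sketch of that arithmetic (weights only; actual runtime memory also depends on activations, gradients and optimizer state):

```python
# Rough sanity check of the memory figures quoted above (weights only, 1 MB = 10**6 bytes).
n_params = 15.14e6  # parameter count reported for t5-efficient-small-dm128

print(f"fp32: {n_params * 4 / 1e6:.2f} MB")  # 4 bytes/parameter -> ~60.6 MB
print(f"fp16: {n_params * 2 / 1e6:.2f} MB")  # 2 bytes/parameter -> ~30.3 MB
```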
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
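In the span-based MLM (span-corruption) objective, dropped spans in the input are replaced by sentinel tokens and the target sequence reconstructs them. The following is a minimal sketch of that input/target format using the Transformers API with a toy sentence; it is an illustration of the objective, not the original pre-training pipeline:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dm128")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dm128")

# Corrupted spans are replaced by sentinel tokens (<extra_id_0>, <extra_id_1>, ...);
# the target reconstructs the dropped spans behind the matching sentinels.
input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt").input_ids
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss  # denoising loss for this toy example
```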
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
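As a minimal illustration of what a single fine-tuning step looks like in PyTorch, the sketch below trains on one toy summarization pair. The example pair and the learning rate are made-up assumptions for illustration only; the linked example scripts above are the recommended starting point:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-small-dm128"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy source/target pair; T5 frames every task as text-to-text.
source = "summarize: The paper studies how model shape affects downstream transfer performance."
target = "The paper studies model shape and transfer performance."

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids

loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```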
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dm128
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DM128 (Deep-Narrow version)
==============================================
T5-Efficient-SMALL-DM128 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-dm128 - is of model type Small with the following variations:
* dm is 128
It has 15.14 million parameters and thus requires *ca.* 60.57 MB of memory in full precision (*fp32*)
or 30.28 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DM2000 (Deep-Narrow version)
T5-Efficient-SMALL-DM2000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dm2000** - is of model type **Small** with the following variations:
- **dm** is **2000**
It has **242.04** million parameters and thus requires *ca.* **968.16 MB** of memory in full precision (*fp32*)
or **484.08 MB** of memory in half precision (*fp16* or *bf16*).
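The parameter count and the corresponding weight-memory estimate can also be verified programmatically by loading the checkpoint and counting parameters. This is only a sketch; the numbers cover weights alone, and actual runtime memory will be higher:

```python
import torch
from transformers import T5ForConditionalGeneration

# Loading in half precision keeps the weight footprint at roughly 2 bytes per parameter.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-small-dm2000", torch_dtype=torch.float16
)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f}M parameters")  # should be roughly 242M
print(f"fp32 weights: ~{n_params * 4 / 1e6:.0f} MB, "
      f"fp16 weights: ~{n_params * 2 / 1e6:.0f} MB")
```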
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
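The repeated note about adapting the text-classification examples comes down to casting classification as text-to-text: the class label is emitted as a short target string instead of being predicted by a classification head. Below is a minimal sketch of that idea; the prompt prefix and label verbalization are illustrative assumptions, not a prescribed recipe:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-small-dm2000"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Classification as text-to-text: the class label becomes the target sequence.
text = "sst2 sentence: the movie was a complete waste of time"
label = "negative"  # illustrative verbalization of the class

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label, return_tensors="pt").input_ids

loss = model(input_ids=inputs.input_ids,
             attention_mask=inputs.attention_mask,
             labels=labels).loss  # train with this loss instead of a classification head
```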
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dm2000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DM2000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-DM2000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-dm2000 - is of model type Small with the following variations:
* dm is 2000
It has 242.04 million parameters and thus requires *ca.* 968.16 MB of memory in full precision (*fp32*)
or 484.08 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DM256 (Deep-Narrow version)
T5-Efficient-SMALL-DM256 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dm256** - is of model type **Small** with the following variations:
- **dm** is **256**
It has **30.27** million parameters and thus requires *ca.* **121.07 MB** of memory in full precision (*fp32*)
or **60.54 MB** of memory in half precision (*fp16* or *bf16*).
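The memory figures above follow directly from the parameter count. A quick back-of-the-envelope check (assuming 4 bytes per parameter in full precision, 2 bytes in half precision, and 1 MB = 10⁶ bytes):

```python
# Back-of-the-envelope check of the memory figures quoted above.
n_params = 30.27e6                # parameters of t5-efficient-small-dm256
fp32_mb = n_params * 4 / 1e6      # ~121.1 MB in full precision (fp32)
fp16_mb = n_params * 2 / 1e6      # ~60.5 MB in half precision (fp16/bf16)
print(f"fp32: {fp32_mb:.2f} MB, fp16/bf16: {fp16_mb:.2f} MB")
```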
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
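For illustration, a minimal sketch of what a span-corruption training pair looks like (the corrupted spans below are chosen by hand rather than by the actual pretraining noise schedule):

```python
# Span-based MLM ("span corruption"): random input spans are replaced by
# sentinel tokens and the model learns to emit the dropped spans as the target.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-dm256")

# original text: "The quick brown fox jumps over the lazy dog."
corrupted = "The <extra_id_0> fox jumps over the <extra_id_1> dog."
targets = "<extra_id_0> quick brown <extra_id_1> lazy <extra_id_2>"

input_ids = tokenizer(corrupted, return_tensors="pt").input_ids
labels = tokenizer(targets, return_tensors="pt").input_ids
# (input_ids, labels) form one encoder/decoder training pair for this objective
```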
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal PyTorch training-step sketch is also shown after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
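For orientation, here is a minimal PyTorch sketch of a single fine-tuning step; the toy summarization pair is made up for illustration, and the linked examples above add the full training loop, data loading and evaluation around this:

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

name = "google/t5-efficient-small-dm256"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# toy input/target pair; replace with your own task data
inputs = tokenizer("summarize: This checkpoint is pretrained only and must be fine-tuned.",
                   return_tensors="pt")
labels = tokenizer("Fine-tune the checkpoint before use.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(**inputs, labels=labels).loss   # standard seq2seq cross-entropy loss
loss.backward()
optimizer.step()
```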
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dm256
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DM256 (Deep-Narrow version)
==============================================
T5-Efficient-SMALL-DM256 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-dm256 - is of model type Small with the following variations:
* dm is 256
It has 30.27 million parameters and thus requires *ca.* 121.07 MB of memory in full precision (*fp32*)
or 60.54 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-DM768 (Deep-Narrow version)
T5-Efficient-SMALL-DM768 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-dm768** - is of model type **Small** with the following variations:
- **dm** is **768**
It has **90.77** million parameters and thus requires *ca.* **363.1 MB** of memory in full precision (*fp32*)
or **181.55 MB** of memory in half precision (*fp16* or *bf16*).
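Once the weights are downloaded, the quoted parameter count can be double-checked directly (a minimal sketch):

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-dm768")
print(f"{model.num_parameters() / 1e6:.2f}M parameters")  # expected to be roughly 90.77M
```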
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short TensorFlow loading sketch is also shown after the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
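For orientation, here is a short TensorFlow sketch of loading this checkpoint for fine-tuning (the model tags indicate that TF weights are available; the toy input/target pair is made up, and the linked TensorFlow examples above build the full training loop around this):

```python
from transformers import AutoTokenizer, TFT5ForConditionalGeneration

name = "google/t5-efficient-small-dm768"
tokenizer = AutoTokenizer.from_pretrained(name)
model = TFT5ForConditionalGeneration.from_pretrained(name)

inputs = tokenizer("summarize: A toy input for illustration.", return_tensors="tf")
labels = tokenizer("A toy summary.", return_tensors="tf").input_ids
outputs = model(inputs.input_ids, labels=labels)  # outputs.loss is the seq2seq training loss
```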
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-dm768
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-DM768 (Deep-Narrow version)
==============================================
T5-Efficient-SMALL-DM768 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-dm768 - is of model type Small with the following variations:
* dm is 768
It has 90.77 million parameters and thus requires *ca.* 363.1 MB of memory in full precision (*fp32*)
or 181.55 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL12 (Deep-Narrow version)
T5-Efficient-SMALL-EL12 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-small-el12** - is of model type **Small** with the following variations:
- **el** is **12**
It has **79.41** million parameters and thus requires *ca.* **317.63 MB** of memory in full precision (*fp32*)
or **158.81 MB** of memory in half precision (*fp16* or *bf16*).
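The memory figures above follow directly from the parameter count. A minimal sketch of that arithmetic (assuming 4 bytes per parameter in fp32, 2 bytes in fp16/bf16, and 1 MB = 10^6 bytes):

```python
# Back-of-the-envelope memory estimate from the parameter count stated above.
n_params = 79.41e6  # ~79.41 million parameters (rounded figure from this card)

fp32_mb = n_params * 4 / 1e6  # 4 bytes per parameter in full precision
fp16_mb = n_params * 2 / 1e6  # 2 bytes per parameter in half precision

print(f"fp32: ~{fp32_mb:.2f} MB, fp16/bf16: ~{fp16_mb:.2f} MB")
# -> fp32: ~317.64 MB, fp16/bf16: ~158.82 MB (matches the figures above up to rounding)
```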
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
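For illustration, span-based MLM replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans. A minimal sketch of that input/target format (the sentence and the masked spans are made up; the actual span lengths and corruption rate are chosen by the pretraining pipeline and are not shown here):

```python
from transformers import AutoTokenizer

# If this checkpoint did not ship its own tokenizer files, the standard T5 tokenizer
# (e.g. "t5-small") could be used instead - the vocabulary is the same.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-el12")

# Corrupted input: each masked span is replaced by a sentinel token <extra_id_N>.
input_text = "The <extra_id_0> walks in <extra_id_1> park"
# Target: the dropped spans, each preceded by the sentinel token that replaced it.
target_text = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
labels = tokenizer(target_text, return_tensors="pt").input_ids
```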
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal loading sketch is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
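Independent of the framework you pick, the checkpoint itself loads like any other T5 model. A minimal sketch (the prompt is only illustrative; since this is a pretrained-only checkpoint, downstream-style prompts will not produce useful output until the model has been fine-tuned):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el12"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# The pretrained-only model has seen nothing but the span-corruption objective,
# so the generation below is just a smoke test, not a meaningful summary.
inputs = tokenizer(
    "summarize: studies have shown that owning a dog is good for you",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```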
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to gain a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el12
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL12 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-EL12 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
--------------------------
This model checkpoint - t5-efficient-small-el12 - is of model type Small with the following variations:
* el is 12
It has 79.41 million parameters and thus requires *ca.* 317.63 MB of memory in full precision (*fp32*)
or 158.81 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to gain a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL16-DL1 (Deep-Narrow version)
T5-Efficient-SMALL-EL16-DL1 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Model architecture details
This model checkpoint - **t5-efficient-small-el16-dl1** - is of model type **Small** with the following variations:
- **el** is **16**
- **dl** is **1**
It has **71.01** million parameters and thus requires *ca.* **284.04 MB** of memory in full precision (*fp32*)
or **142.02 MB** of memory in half precision (*fp16* or *bf16*).
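To make the asymmetric encoder/decoder depth concrete, here is a small sketch that loads the checkpoint and reads depth and parameter count back from it (attribute names as exposed by the Transformers T5 config; the exact parameter count may differ slightly from the rounded figure above):

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-el16-dl1")

print(model.config.num_layers)          # encoder depth, expected: 16
print(model.config.num_decoder_layers)  # decoder depth, expected: 1

n_params = model.num_parameters()
print(f"{n_params / 1e6:.2f}M parameters, ~{n_params * 4 / 1e6:.2f} MB in fp32")
```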
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key/value projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
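For reference, the pretraining corpus can be inspected without downloading it in full by streaming it with the `datasets` library. A minimal sketch (the hub identifier is an assumption, since it has changed over time; older `datasets` versions exposed the corpus simply as `c4`):

```python
from datasets import load_dataset

# Stream C4 instead of materializing the full corpus on disk.
c4 = load_dataset("allenai/c4", "en", split="train", streaming=True)

for example in c4.take(2):
    print(example["text"][:100])
```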
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a single training step is sketched after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
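Whichever example script you choose, a single supervised seq2seq training step boils down to feeding tokenized inputs and labels to the model and back-propagating the loss. A minimal sketch with a made-up input/target pair (not a full training loop, and the task prefix is only a convention, not required by the checkpoint):

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el16-dl1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy seq2seq pair; a real run would iterate over a tokenized dataset instead.
batch = tokenizer(
    "summarize: The report describes the quarterly results in detail.",
    return_tensors="pt",
)
labels = tokenizer("Quarterly results summary.", return_tensors="pt").input_ids

loss = model(**batch, labels=labels).loss  # cross-entropy over the target tokens
loss.backward()
optimizer.step()
optimizer.zero_grad()
```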
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to gain a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el16-dl1
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL16-DL1 (Deep-Narrow version)
=================================================
T5-Efficient-SMALL-EL16-DL1 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Model architecture details
--------------------------
This model checkpoint - t5-efficient-small-el16-dl1 - is of model type Small with the following variations:
* el is 16
* dl is 1
It has 71.01 million parameters and thus requires *ca.* 284.04 MB of memory in full precision (*fp32*)
or 142.02 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to gain a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL16-DL2 (Deep-Narrow version)
T5-Efficient-SMALL-EL16-DL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el16-dl2** - is of model type **Small** with the following variations:
- **el** is **16**
- **dl** is **2**
It has **75.21** million parameters and thus requires *ca.* **300.83 MB** of memory in full precision (*fp32*)
or **150.42 MB** of memory in half precision (*fp16* or *bf16*).
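These memory figures follow directly from the parameter count: full precision stores 4 bytes per parameter, half precision 2 bytes. A minimal sketch of that back-of-the-envelope arithmetic (the stated parameter count is already rounded, so the last decimal may differ slightly):

```python
# Back-of-the-envelope memory estimate from the parameter count above.
params = 75.21e6  # rounded parameter count of t5-efficient-small-el16-dl2

fp32_mb = params * 4 / 1e6  # 4 bytes per parameter in full precision
fp16_mb = params * 2 / 1e6  # 2 bytes per parameter in fp16/bf16

print(f"fp32: {fp32_mb:.2f} MB, fp16/bf16: {fp16_mb:.2f} MB")
# -> roughly 300.8 MB and 150.4 MB, matching the figures above
```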
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
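In the Transformers library these dimensions are exposed on the checkpoint's `T5Config`: the encoder depth (*el*) maps to `num_layers` and the decoder depth (*dl*) to `num_decoder_layers`. A minimal sketch that inspects them (assuming `transformers` is installed and the Hugging Face Hub is reachable):

```python
from transformers import T5Config

# Inspect the Deep-Narrow dimensions of this checkpoint.
config = T5Config.from_pretrained("google/t5-efficient-small-el16-dl2")

print("encoder depth (el):", config.num_layers)          # 16, per the checkpoint name
print("decoder depth (dl):", config.num_decoder_layers)  # 2, per the checkpoint name
print("embedding dim (dm):", config.d_model)             # see the Small row of the table above
print("feed-forward dim (ff):", config.d_ff)             # see the Small row of the table above
```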
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is shown after the examples below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
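Independently of the framework-specific example scripts above, the checkpoint can be loaded with the standard Transformers API. A minimal sketch (assuming PyTorch and a recent `transformers` version; since this is a pretrained-only span-corruption checkpoint, the generation here is only a smoke test, not a useful prediction):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el16-dl2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# The raw checkpoint only knows the span-corruption pretraining task, so this
# call merely checks that the model loads and generates something.
inputs = tokenizer("The capital of France is <extra_id_0>.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```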
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el16-dl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL16-DL2 (Deep-Narrow version)
=================================================
T5-Efficient-SMALL-EL16-DL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el16-dl2 - is of model type Small with the following variations:
* el is 16
* dl is 2
It has 75.21 million parameters and thus requires *ca.* 300.83 MB of memory in full precision (*fp32*)
or 150.42 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL16-DL4 (Deep-Narrow version)
T5-Efficient-SMALL-EL16-DL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el16-dl4** - is of model type **Small** with the following variations:
- **el** is **16**
- **dl** is **4**
It has **83.6** million parameters and thus requires *ca.* **334.41 MB** of memory in full precision (*fp32*)
or **167.21 MB** of memory in half precision (*fp16* or *bf16*).
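Because the half-precision footprint is roughly half of the fp32 figure, the checkpoint can also be loaded directly with fp16 weights to save memory. A minimal sketch (assumes PyTorch and a recent `transformers` version; *bf16* works analogously where supported):

```python
import torch
from transformers import T5ForConditionalGeneration

# Load the checkpoint with half-precision weights.
model = T5ForConditionalGeneration.from_pretrained(
    "google/t5-efficient-small-el16-dl4",
    torch_dtype=torch.float16,
)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f} M parameters, about {n_params * 2 / 1e6:.2f} MB in fp16")
```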
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
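In span-based MLM (span corruption), contiguous spans of the input are replaced by sentinel tokens and the target lists each sentinel followed by the text it replaced. A hedged illustration of what one such training pair looks like with this checkpoint's tokenizer (the sentence and span choice are invented for illustration):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-el16-dl4")

# Corrupted input: masked spans are replaced by sentinel tokens.
source = "The quick brown <extra_id_0> jumps over the <extra_id_1> dog."
# Target: each sentinel followed by the span it replaced, closed by a final sentinel.
target = "<extra_id_0> fox <extra_id_1> lazy <extra_id_2>"

inputs = tokenizer(source, return_tensors="pt")
labels = tokenizer(target, return_tensors="pt").input_ids
print(inputs.input_ids.shape, labels.shape)
```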
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a bare-bones training-step sketch follows the examples below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
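Beyond the example scripts above, a bare-bones fine-tuning step with plain PyTorch looks roughly as follows. This is only a sketch: the input/target pair, learning rate, and single-example batch are placeholders, not a recommended training recipe.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el16-dl4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One illustrative seq2seq training step (summarization-style input/target).
batch = tokenizer(
    ["summarize: The quick brown fox jumps over the lazy dog."],
    return_tensors="pt",
    padding=True,
)
labels = tokenizer(["A fox jumps over a dog."], return_tensors="pt").input_ids

loss = model(
    input_ids=batch.input_ids,
    attention_mask=batch.attention_mask,
    labels=labels,
).loss
loss.backward()
optimizer.step()
optimizer.zero_grad()
```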
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el16-dl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL16-DL4 (Deep-Narrow version)
=================================================
T5-Efficient-SMALL-EL16-DL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el16-dl4 - is of model type Small with the following variations:
* el is 16
* dl is 4
It has 83.6 million parameters and thus requires *ca.* 334.41 MB of memory in full precision (*fp32*)
or 167.21 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL16-DL8 (Deep-Narrow version)
T5-Efficient-SMALL-EL16-DL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el16-dl8** - is of model type **Small** with the following variations:
- **el** is **16**
- **dl** is **8**
It has **100.39** million parameters and thus requires *ca.* **401.57 MB** of memory in full precision (*fp32*)
or **200.78 MB** of memory in half precision (*fp16* or *bf16*).
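As a quick sanity check on these figures, the memory estimate is simply the parameter count times the bytes per parameter (4 bytes in fp32, 2 bytes in fp16/bf16). The short sketch below is only a back-of-the-envelope illustration of that calculation, not an exact accounting of the serialized checkpoint:

```python
# Back-of-the-envelope memory estimate for the reported parameter count.
# Illustrative only; the serialized checkpoint size can differ slightly.
num_params = 100.39e6  # parameters reported for t5-efficient-small-el16-dl8

mb_fp32 = num_params * 4 / 1e6  # 4 bytes per parameter in full precision
mb_fp16 = num_params * 2 / 1e6  # 2 bytes per parameter in half precision

print(f"fp32: ~{mb_fp32:.1f} MB, fp16/bf16: ~{mb_fp16:.1f} MB")
# -> fp32: ~401.6 MB, fp16/bf16: ~200.8 MB
```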
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
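To make the span-based MLM (span-corruption) format concrete, the minimal sketch below shows what a corrupted input and its target look like with T5's sentinel tokens, using the Transformers API. It is only an illustration of the objective's input/output format, not the original pre-training pipeline:

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Checkpoint id as listed in this card; any T5 checkpoint uses the same sentinel scheme.
name = "google/t5-efficient-small-el16-dl8"
tokenizer = T5Tokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# Corrupted input: each masked span is replaced by a sentinel token <extra_id_i>.
input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt").input_ids
# Target: the sentinels followed by the spans they replaced, closed by a final sentinel.
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss  # standard denoising loss
```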
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
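Whichever example script you pick, loading this checkpoint follows the standard Transformers pattern. The sketch below is a minimal illustration, not a full training loop: it shows the loading step and a single supervised seq2seq forward pass, with a made-up task prefix and placeholder texts.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

name = "google/t5-efficient-small-el16-dl8"
tokenizer = AutoTokenizer.from_pretrained(name)
model = T5ForConditionalGeneration.from_pretrained(name)

# Placeholder seq2seq pair; in practice the example scripts above build these from a dataset.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)  # the returned loss is what fine-tuning minimizes
print(float(outputs.loss))
```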
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el16-dl8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL16-DL8 (Deep-Narrow version)
=================================================
T5-Efficient-SMALL-EL16-DL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-el16-dl8 - is of model type Small with the following variations:
* el is 16
* dl is 8
It has 100.39 million parameters and thus requires *ca.* 401.57 MB of memory in full precision (*fp32*)
or 200.78 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL16 (Deep-Narrow version)
T5-Efficient-SMALL-EL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el16** - is of model type **Small** with the following variations:
- **el** is **16**
It has **92.0** million parameters and thus requires *ca.* **367.99 MB** of memory in full precision (*fp32*)
or **183.99 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short loading sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
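As with the other checkpoints in this series, the model loads through the standard Transformers classes. The sketch below (assuming PyTorch is installed) simply loads the checkpoint named further down in this card and counts its parameters as a rough cross-check against the figure reported above; the exact count may differ slightly depending on how tied embeddings are counted.

```python
from transformers import T5ForConditionalGeneration

# Load the pretrained-only checkpoint and count trainable parameters.
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-el16")
num_params = sum(p.numel() for p in model.parameters())
print(f"{num_params / 1e6:.1f}M parameters")  # expected to be close to the 92.0M reported above
```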
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL16 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-EL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-el16 - is of model type Small with the following variations:
* el is 16
It has 92.0 million parameters and thus requires *ca.* 367.99 MB of memory in full precision (*fp32*)
or 183.99 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL2 (Deep-Narrow version)
T5-Efficient-SMALL-EL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el2** - is of model type **Small** with the following variations:
- **el** is **2**
It has **47.93** million parameters and thus requires *ca.* **191.72 MB** of memory in full precision (*fp32*)
or **95.86 MB** of memory in half precision (*fp16* or *bf16*).
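These memory figures follow directly from the parameter count. As a quick sanity check, here is a back-of-the-envelope sketch assuming 4 bytes per parameter in *fp32* and 2 bytes in *fp16*/*bf16* (optimizer and activation memory are ignored):

```python
# Rough memory estimate from the parameter count alone
# (assumes 4 bytes/param in fp32, 2 bytes/param in fp16/bf16).
params = 47.93e6                # parameters of t5-efficient-small-el2
fp32_mb = params * 4 / 1e6      # ~191.72 MB
fp16_mb = params * 2 / 1e6      # ~95.86 MB
print(f"fp32: {fp32_mb:.2f} MB, fp16/bf16: {fp16_mb:.2f} MB")
```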
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
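For intuition, span-based MLM (span corruption) replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans. The following is a minimal illustration of the input/target format only; the exact C4 preprocessing pipeline differs in how spans are sampled and in the corruption rate:

```python
# Illustration of T5-style span corruption (sketch, not the exact pretraining pipeline).
original        = "Thank you for inviting me to your party last week ."
# Contiguous spans are dropped and replaced by sentinel tokens <extra_id_N>:
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
# The target reproduces only the dropped spans, delimited by the same sentinels:
target          = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```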
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is shown after the list of examples):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
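As a common starting point for the examples above, here is a minimal sketch of loading this checkpoint with the `transformers` library; the summarization input/target pair is purely illustrative, since the checkpoint itself is pretrained-only:

```python
# Minimal loading sketch; actual fine-tuning should follow one of the example scripts above.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-el2")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-el2")

inputs = tokenizer("summarize: The paper studies deep-narrow scaling of T5.", return_tensors="pt")
labels = tokenizer("Deep-narrow T5 scaling.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss  # seq2seq training loss for one example
```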
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL2 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-EL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-el2 - is of model type Small with the following variations:
* el is 2
It has 47.93 million parameters and thus requires *ca.* 191.72 MB of memory in full precision (*fp32*)
or 95.86 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL32 (Deep-Narrow version)
T5-Efficient-SMALL-EL32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el32** - is of model type **Small** with the following variations:
- **el** is **32**
It has **142.36** million parameters and thus requires *ca.* **569.44 MB** of memory in full precision (*fp32*)
or **284.72 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short configuration sanity check is shown after the list of examples):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
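Since this checkpoint differs from the standard Small configuration only in its encoder depth, it can be worth verifying the encoder/decoder layer counts before fine-tuning. The following minimal sketch uses standard `T5Config` attributes; the expected values follow from this card, i.e. *el* = 32 and the default Small decoder depth of 6:

```python
# Quick sanity check of the deep-narrow configuration.
from transformers import T5Config

config = T5Config.from_pretrained("google/t5-efficient-small-el32")
print(config.num_layers)          # encoder blocks (el), expected 32
print(config.num_decoder_layers)  # decoder blocks (dl), expected 6
```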
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL32 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-EL32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
---------------------------------
This model checkpoint - t5-efficient-small-el32 - is of model type Small with the following variations:
* el is 32
It has 142.36 million parameters and thus requires *ca.* 569.44 MB of memory in full precision (*fp32*)
or 284.72 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL4 (Deep-Narrow version)
T5-Efficient-SMALL-EL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el4** - is of model type **Small** with the following variations:
- **el** is **4**
It has **54.23** million parameters and thus requires *ca.* **216.9 MB** of memory in full precision (*fp32*)
or **108.45 MB** of memory in half precision (*fp16* or *bf16*).
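The memory figures above follow directly from the parameter count. The sketch below is not part of the original card and simply assumes 4 bytes per parameter in full precision and 2 bytes in half precision:

```python
# Back-of-the-envelope memory estimate for t5-efficient-small-el4
# (assumes 4 bytes/parameter in fp32 and 2 bytes/parameter in fp16/bf16).
n_params = 54.23e6                                # reported parameter count
print(f"fp32: {n_params * 4 / 1e6:.1f} MB")       # prints ~216.9 MB
print(f"fp16/bf16: {n_params * 2 / 1e6:.1f} MB")  # prints ~108.5 MB
```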
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is also shown after the example lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
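As mentioned above, a minimal loading sketch (an illustration added here, not taken from the original card) using the standard T5 classes from `transformers` with a PyTorch backend could look like this:

```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el4"

# Load the pretrained-only checkpoint; it still needs task-specific fine-tuning.
tokenizer = T5TokenizerFast.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Sanity check: run a forward pass with a dummy input/target pair.
batch = tokenizer("summarize: T5 is an encoder-decoder model.", return_tensors="pt")
labels = tokenizer("T5 is a seq2seq model.", return_tensors="pt").input_ids
loss = model(**batch, labels=labels).loss
print(loss)  # untuned: the checkpoint has only seen the span-corruption objective
```

One of the fine-tuning scripts listed above can then be pointed at this checkpoint to adapt it to a downstream task.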
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL4 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-EL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el4 - is of model type Small with the following variations:
* el is 4
It has 54.23 million parameters and thus requires *ca.* 216.9 MB of memory in full precision (*fp32*)
or 108.45 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL48 (Deep-Narrow version)
T5-Efficient-SMALL-EL48 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el48** - is of model type **Small** with the following variations:
- **el** is **48**
It has **192.72** million parameters and thus requires *ca.* **770.89 MB** of memory in full precision (*fp32*)
or **385.44 MB** of memory in half precision (*fp16* or *bf16*).
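As a quick check (a sketch added here, not part of the original card), the same bytes-per-parameter arithmetic reproduces these numbers:

```python
# Memory estimate for t5-efficient-small-el48
# (assumes 4 bytes/parameter in fp32 and 2 bytes/parameter in fp16/bf16).
n_params = 192.72e6                               # reported parameter count
print(f"fp32: {n_params * 4 / 1e6:.1f} MB")       # prints ~770.9 MB
print(f"fp16/bf16: {n_params * 2 / 1e6:.1f} MB")  # prints ~385.4 MB
```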
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch follows the example lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
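As mentioned above, a minimal loading sketch (an illustration added here, not from the original card) would differ from other T5-efficient checkpoints only in the checkpoint name:

```python
from transformers import T5TokenizerFast, T5ForConditionalGeneration

# Pretrained-only checkpoint with a deep (48-block) encoder; fine-tune before practical use.
tokenizer = T5TokenizerFast.from_pretrained("google/t5-efficient-small-el48")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-el48")

# The config should reflect the EL48 variation: 48 encoder blocks, 6 decoder blocks.
print(model.config.num_layers, model.config.num_decoder_layers)
```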
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported potentially in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el48
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL48 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-EL48 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el48 - is of model type Small with the following variations:
* el is 48
It has 192.72 million parameters and thus requires *ca.* 770.89 MB of memory in full precision (*fp32*)
or 385.44 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical usage and are lacking a more detailed description. Those checkpoints are kept here as they might be ported potentially in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL64 (Deep-Narrow version)
T5-Efficient-SMALL-EL64 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el64** - is of model type **Small** with the following variations:
- **el** is **64**
It has **243.08** million parameters and thus requires *ca.* **972.34 MB** of memory in full precision (*fp32*)
or **486.17 MB** of memory in half precision (*fp16* or *bf16*).
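The memory figures follow directly from the parameter count. Below is a minimal sketch (not part of the original card) that reproduces them, assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and 1 MB = 10^6 bytes; the small differences to the numbers above come from rounding of the parameter count.

```python
# Rough checkpoint-size estimate from the parameter count (assumption: 4 bytes
# per parameter in fp32, 2 bytes in fp16/bf16, 1 MB = 10**6 bytes).
def checkpoint_size_mb(num_params: float, bytes_per_param: int) -> float:
    return num_params * bytes_per_param / 1e6

params = 243.08e6  # t5-efficient-small-el64
print(f"fp32: {checkpoint_size_mb(params, 4):.2f} MB")  # ~972.3 MB
print(f"fp16: {checkpoint_size_mb(params, 2):.2f} MB")  # ~486.2 MB
```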
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
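For illustration only (this pair is a hypothetical example, not taken from the card), the span-based MLM objective replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the dropped spans as the target sequence:

```python
# Hypothetical span-corrupted training pair; <extra_id_n> are the sentinel
# tokens provided by the T5 tokenizer.
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week."
target_spans    = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```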
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model (a minimal loading sketch follows the list below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
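As a minimal loading sketch (our assumption, not an official example), the checkpoint can be instantiated with the `transformers` Auto classes before being plugged into one of the fine-tuning scripts above; the input/target pair below is only a toy placeholder.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-el64")
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-efficient-small-el64")

# Toy summarization-style training step: encode an input/target pair and
# compute the seq2seq loss that a fine-tuning loop would backpropagate.
inputs = tokenizer(
    "summarize: The quick brown fox jumps over the lazy dog.",
    return_tensors="pt",
)
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss
print(loss.item())
```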
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical usage and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el64
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL64 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-EL64 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-el64 - is of model type Small with the following variations:
* el is 64
It has 243.08 million parameters and thus requires *ca.* 972.34 MB of memory in full precision (*fp32*)
or 486.17 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical usage and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL8-DL1 (Deep-Narrow version)
T5-Efficient-SMALL-EL8-DL1 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-el8-dl1** - is of model type **Small** with the following variations:
- **el** is **8**
- **dl** is **1**
It has **45.83** million parameters and thus requires *ca.* **183.32 MB** of memory in full precision (*fp32*)
or **91.66 MB** of memory in half precision (*fp16* or *bf16*).
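Because this checkpoint uses an asymmetric encoder/decoder depth, the *el*/*dl* settings surface directly in the Hugging Face configuration. A minimal inspection sketch (our assumption, not part of the original card):

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-small-el8-dl1")
print(config.num_layers)          # encoder depth (el), expected: 8
print(config.num_decoder_layers)  # decoder depth (dl), expected: 1
```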
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical usage and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el8-dl1
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL8-DL1 (Deep-Narrow version)
================================================
T5-Efficient-SMALL-EL8-DL1 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-el8-dl1 - is of model type Small with the following variations:
* el is 8
* dl is 1
It has 45.83 million parameters and thus requires *ca.* 183.32 MB of memory in full precision (*fp32*)
or 91.66 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder and decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical usage and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL8-DL2 (Deep-Narrow version)
T5-Efficient-SMALL-EL8-DL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el8-dl2** - is of model type **Small** with the following variations:
- **el** is **8**
- **dl** is **2**
It has **50.03** million parameters and thus requires *ca.* **200.11 MB** of memory in full precision (*fp32*)
or **100.05 MB** of memory in half precision (*fp16* or *bf16*).
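As a quick, back-of-the-envelope check of these figures (assuming 4 bytes per parameter in full precision, 2 bytes in half precision, and MB = 10^6 bytes):

```python
# Sanity check of the memory figures above (assumption: 4 bytes/param in fp32,
# 2 bytes/param in fp16/bf16; small rounding differences are expected).
params = 50.03e6
print(f"fp32: ~{params * 4 / 1e6:.2f} MB")  # roughly 200 MB
print(f"fp16: ~{params * 2 / 1e6:.2f} MB")  # roughly 100 MB
```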
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
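To illustrate the objective, the sketch below shows roughly what a single span-corrupted training example looks like; the sentence is an invented placeholder and the sentinel tokens follow the `<extra_id_N>` convention used by the Transformers T5 tokenizer.

```python
# Illustrative sketch of span-based MLM (span corruption) as used for T5-style
# pre-training. The sentence is a made-up placeholder, not training data.
original = "Thank you for inviting me to your party last week."

# Randomly selected spans are replaced by sentinel tokens in the encoder input ...
model_input = "Thank you <extra_id_0> to your party <extra_id_1> week."

# ... and the decoder target reconstructs exactly those dropped spans, in order.
target = "<extra_id_0> for inviting me <extra_id_1> last <extra_id_2>"
```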
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading sketch is also shown after the example lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
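As a complement to the example scripts above, here is a minimal, hedged PyTorch sketch of loading the checkpoint and computing one seq2seq training loss; the text pair is invented, and a real fine-tuning run would iterate over a proper dataset with an optimizer and scheduler.

```python
# Minimal sketch (not an official fine-tuning script): load the pretrained
# checkpoint and compute a single seq2seq training loss on a toy example.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el8-dl2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Invented summarization pair; replace with a real dataset when fine-tuning.
enc = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**enc, labels=labels).loss
loss.backward()  # in a real training loop this is followed by an optimizer step
print(float(loss))
```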
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el8-dl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL8-DL2 (Deep-Narrow version)
================================================
T5-Efficient-SMALL-EL8-DL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el8-dl2 - is of model type Small with the following variations:
* el is 8
* dl is 2
It has 50.03 million parameters and thus requires *ca.* 200.11 MB of memory in full precision (*fp32*)
or 100.05 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL8-DL4 (Deep-Narrow version)
T5-Efficient-SMALL-EL8-DL4 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el8-dl4** - is of model type **Small** with the following variations:
- **el** is **8**
- **dl** is **4**
It has **58.42** million parameters and thus requires *ca.* **233.69 MB** of memory in full precision (*fp32*)
or **116.84 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (see also the loading sketch after the example lists below):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
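The same minimal loading sketch shown for the EL8-DL2 variant applies here, with only the checkpoint id changed; it is an illustrative snippet with an invented text pair, not an official training script.

```python
# Minimal sketch: load this checkpoint and compute one seq2seq loss (toy data).
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-el8-dl4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

enc = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids
print(float(model(**enc, labels=labels).loss))
```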
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el8-dl4
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL8-DL4 (Deep-Narrow version)
================================================
T5-Efficient-SMALL-EL8-DL4 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el8-dl4 - is of model type Small with the following variations:
* el is 8
* dl is 4
It has 58.42 million parameters and thus requires *ca.* 233.69 MB of memory in full precision (*fp32*)
or 116.84 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-EL8 (Deep-Narrow version)
T5-Efficient-SMALL-EL8 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-el8** - is of model type **Small** with the following variations:
- **el** is **8**
It has **66.82** million parameters and thus requires *ca.* **267.26 MB** of memory in full precision (*fp32*)
or **133.63 MB** of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
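As a small sketch (assuming this checkpoint ships a standard `T5Config` on the Hub), the encoder/decoder depths of an *el* variation can be read directly from the Transformers config:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("google/t5-efficient-small-el8")
print(config.num_layers)          # encoder depth (el); expected to be 8 for this checkpoint
print(config.num_decoder_layers)  # decoder depth (dl); expected to stay at the Small default of 6
```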
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
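The span-corruption format can be illustrated with the checkpoint's tokenizer. The snippet below is only an illustrative sketch (it assumes the hosted tokenizer is available and does not reproduce the actual pretraining pipeline): dropped spans are replaced by sentinel tokens in the input and reconstructed in the target.

```python
# Illustrative sketch of the span-based MLM (span-corruption) format, not the real pipeline.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-el8")

# Dropped spans are replaced by sentinel tokens in the input ...
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week."
# ... and the target reconstructs the dropped spans, separated by the same sentinels.
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"

print(tokenizer(corrupted_input).input_ids)
print(tokenizer(target).input_ids)
```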
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples of how to fine-tune the model; a minimal loading sketch is also shown after the examples below:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
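As noted above, the linked example scripts are the recommended starting point. Purely as a minimal sketch (assuming `transformers` and `torch` are installed and the checkpoint can be downloaded from the Hub), loading the checkpoint and computing a seq2seq loss looks as follows; the text pair is a toy placeholder, not a real dataset:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "google/t5-efficient-small-el8"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Toy summarization-style pair; real fine-tuning requires a proper dataset and training loop.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in an actual fine-tuning loop
print(float(outputs.loss))
```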
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-el8
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-EL8 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-EL8 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-el8 - is of model type Small with the following variations:
* el is 8
It has 66.82 million parameters and thus requires *ca.* 267.26 MB of memory in full precision (*fp32*)
or 133.63 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples of how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-FF1000 (Deep-Narrow version)
T5-Efficient-SMALL-FF1000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-ff1000** - is of model type **Small** with the following variations:
- **ff** is **1000**
It has **47.94** million parameters and thus requires *ca.* **191.75 MB** of memory in full precision (*fp32*)
or **95.88 MB** of memory in half precision (*fp16* or *bf16*).
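The memory figures follow directly from the parameter count (4 bytes per parameter in full precision, 2 bytes in half precision). A rough sanity-check sketch, assuming the checkpoint loads with the standard Transformers API:

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-ff1000")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f} M parameters")         # expected to be close to 47.94 M
print(f"{n_params * 4 / 1e6:.2f} MB in fp32")       # ~191.75 MB at 4 bytes per parameter
print(f"{n_params * 2 / 1e6:.2f} MB in fp16/bf16")  # ~95.88 MB at 2 bytes per parameter
```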
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples of how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-ff1000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-FF1000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-FF1000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-ff1000 - is of model type Small with the following variations:
* ff is 1000
It has 47.94 million parameters and thus requires *ca.* 191.75 MB of memory in full precision (*fp32*)
or 95.88 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples of how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-FF12000 (Deep-Narrow version)
T5-Efficient-SMALL-FF12000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-ff12000** - is of model type **Small** with the following variations:
- **ff** is **12000**
It has **186.35** million parameters and thus requires *ca.* **745.4 MB** of memory in full precision (*fp32*)
or **372.7 MB** of memory in half precision (*fp16* or *bf16*).
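The quoted figures follow directly from the parameter count (4 bytes per parameter in fp32, 2 bytes in fp16/bf16, weights only). A minimal back-of-the-envelope sketch of that arithmetic:
```python
# Back-of-the-envelope memory estimate from the parameter count quoted above.
# Assumes 4 bytes/param (fp32) and 2 bytes/param (fp16/bf16); weights only,
# ignoring optimizer states, activations and framework overhead.
params = 186.35e6
print(f"fp32:      {params * 4 / 1e6:.1f} MB")  # ~745.4 MB
print(f"fp16/bf16: {params * 2 / 1e6:.1f} MB")  # ~372.7 MB
```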
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
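For illustration, span corruption replaces random spans of the input with sentinel tokens and asks the model to reconstruct the dropped spans. A hand-constructed sketch of the input/target format (illustrative only, not the exact preprocessing pipeline used for this checkpoint):
```python
# Illustrative T5-style span corruption (not the exact pre-training pipeline):
# random spans are replaced with sentinel tokens in the input, and the target
# lists the dropped spans, each preceded by its sentinel.
original = "Thank you for inviting me to your party last week ."
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```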
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
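Independently of the example scripts above, a minimal sketch (assuming PyTorch and a recent `transformers` release) for loading this checkpoint and computing a seq2seq loss as a starting point for a custom fine-tuning loop; the summarization pair below is only a placeholder:
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Pretrained-only checkpoint: it still has to be fine-tuned before it produces
# useful task-specific outputs.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-ff12000")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-ff12000")

# One placeholder training step: encode an (input, target) pair and compute
# the standard cross-entropy loss on the decoder outputs.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids
loss = model(**inputs, labels=labels).loss
loss.backward()
```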
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-ff12000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-FF12000 (Deep-Narrow version)
================================================
T5-Efficient-SMALL-FF12000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-ff12000 - is of model type Small with the following variations:
* ff is 12000
It has 186.35 million parameters and thus requires *ca.* 745.4 MB of memory in full precision (*fp32*)
or 372.7 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-FF3000 (Deep-Narrow version)
T5-Efficient-SMALL-FF3000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details of the model architecture
This model checkpoint - **t5-efficient-small-ff3000** - is of model type **Small** with the following variations:
- **ff** is **3000**
It has **73.1** million parameters and thus requires *ca.* **292.42 MB** of memory in full precision (*fp32*)
or **146.21 MB** of memory in half precision (*fp16* or *bf16*).
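As with the other checkpoints in this series, the quoted memory figures are simply the parameter count times the bytes per parameter. A quick sanity-check sketch:
```python
# Sanity check of the memory figures quoted above (weights only; ignores
# optimizer states, activations and framework overhead).
params = 73.1e6
print(f"fp32 (4 bytes/param):      {params * 4 / 1e6:.1f} MB")  # ~292.4 MB
print(f"fp16/bf16 (2 bytes/param): {params * 2 / 1e6:.1f} MB")  # ~146.2 MB
```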
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
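Beyond the example scripts above, a quick smoke test is possible even before fine-tuning: because the checkpoint was trained with span corruption, it can fill in sentinel tokens. A minimal sketch (assuming PyTorch and a recent `transformers` release):
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the pretrained-only checkpoint and let it fill in the sentinel spans.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-ff3000")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-ff3000")

input_ids = tokenizer("The <extra_id_0> walks in <extra_id_1> park",
                      return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```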
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-ff3000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-FF3000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-FF3000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details of the model architecture
--------------------------
This model checkpoint - t5-efficient-small-ff3000 - is of model type Small with the following variations:
* ff is 3000
It has 73.1 million parameters and thus requires *ca.* 292.42 MB of memory in full precision (*fp32*)
or 146.21 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-FF6000 (Deep-Narrow version)
T5-Efficient-SMALL-FF6000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-ff6000** - is of model type **Small** with the following variations:
- **ff** is **6000**
It has **110.85** million parameters and thus requires *ca.* **443.41 MB** of memory in full precision (*fp32*)
or **221.71 MB** of memory in half precision (*fp16* or *bf16*).
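As a quick sanity check, these memory figures follow directly from the parameter count: 4 bytes per parameter in full precision and 2 bytes in half precision. A minimal sketch of the arithmetic (illustrative only; it ignores activations, gradients and optimizer states):

```python
# Rough weight-memory estimate for t5-efficient-small-ff6000 (illustrative only).
n_params = 110.85e6           # parameters as stated above

mb_fp32 = n_params * 4 / 1e6  # 4 bytes per parameter in fp32
mb_fp16 = n_params * 2 / 1e6  # 2 bytes per parameter in fp16/bf16

print(f"fp32: ~{mb_fp32:.1f} MB, fp16/bf16: ~{mb_fp16:.1f} MB")
# -> fp32: ~443.4 MB, fp16/bf16: ~221.7 MB
```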
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
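The variations in the table map directly onto fields of the checkpoint's configuration, so they can be inspected programmatically. A minimal sketch, assuming `transformers` is installed:

```python
from transformers import T5Config

config = T5Config.from_pretrained("google/t5-efficient-small-ff6000")

# The abbreviations above correspond to these config fields.
print(config.num_layers, config.num_decoder_layers)   # el / dl (encoder / decoder depth)
print(config.d_model, config.d_kv, config.num_heads)  # dm, kv, nh
print(config.d_ff)                                     # ff, i.e. 6000 for this checkpoint
```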
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
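For illustration, the span-based MLM objective drops random spans from the input, replaces each span with a sentinel token, and trains the model to reconstruct the dropped spans in order. A minimal sketch of the input/target format (the sentence and the chosen spans are made up for this example; the sentinel tokens `<extra_id_0>`, `<extra_id_1>`, ... are part of the standard T5 vocabulary):

```python
# Illustrative span-corruption example (not the exact pre-processing pipeline).
original = "Thank you for inviting me to your party last week ."

# Two spans are removed from the input and replaced by sentinel tokens ...
corrupted_input = "Thank you <extra_id_0> me to your party <extra_id_1> week ."

# ... and the target lists the removed spans, each preceded by its sentinel
# and closed by a final sentinel token.
target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```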
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to see how to fine-tune the model (a minimal loading sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
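Independently of the framework, the checkpoint can be loaded with the standard seq2seq classes under its Hub id `google/t5-efficient-small-ff6000`. A minimal loading sketch (the example input is made up; since the checkpoint is pretrained-only, the raw output merely reflects the span-corruption objective and is not useful without fine-tuning):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-ff6000"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Feed a span-corrupted sentence, mirroring the pre-training objective.
inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park .", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```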
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-ff6000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-FF6000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-FF6000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-ff6000 - is of model type Small with the following variations:
* ff is 6000
It has 110.85 million parameters and thus requires *ca.* 443.41 MB of memory in full precision (*fp32*)
or 221.71 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to see how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-FF9000 (Deep-Narrow version)
T5-Efficient-SMALL-FF9000 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-ff9000** - is of model type **Small** with the following variations:
- **ff** is **9000**
It has **148.6** million parameters and thus requires *ca.* **594.41 MB** of memory in full precision (*fp32*)
or **297.2 MB** of memory in half precision (*fp16* or *bf16*).
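The parameter count stated above can also be read off the loaded checkpoint directly. A minimal sketch, assuming `transformers` and `torch` are installed:

```python
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-ff9000")

n_params = model.num_parameters()
print(f"{n_params / 1e6:.1f}M parameters")       # roughly 148.6M
print(f"~{n_params * 4 / 1e6:.0f} MB in fp32")   # roughly 594 MB
```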
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-value projection matrices are tied |
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to see how to fine-tune the model (a minimal tokenization sketch follows the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
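Whichever framework and example script you choose, fine-tuning reduces to preparing (input text, target text) pairs for the seq2seq objective. A minimal sketch of the tokenization step for a single, made-up summarization example (the `summarize:` prefix mirrors the original T5 convention but is not required for this checkpoint; `text_target=` needs a reasonably recent `transformers` version):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-ff9000")

document = "summarize: The quick brown fox jumped over the lazy dog in the park ."
summary = "A fox jumped over a dog."

# Encoder inputs and decoder labels for one seq2seq training example.
model_inputs = tokenizer(document, max_length=512, truncation=True)
labels = tokenizer(text_target=summary, max_length=64, truncation=True)
model_inputs["labels"] = labels["input_ids"]
```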
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-ff9000
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-FF9000 (Deep-Narrow version)
===============================================
T5-Efficient-SMALL-FF9000 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-ff9000 - is of model type Small with the following variations:
* ff is 9000
It has 148.6 million parameters and thus requires *ca.* 594.41 MB of memory in full precision (*fp32*)
or 297.2 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint specifies no separate *el* or *dl*, then both the number of encoder layers and the number of decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples to see how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-KV128 (Deep-Narrow version)
T5-Efficient-SMALL-KV128 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-kv128** - is of model type **Small** with the following variations:
- **kv** is **128**
It has **79.4** million parameters and thus requires *ca.* **317.58 MB** of memory in full precision (*fp32*)
or **158.79 MB** of memory in half precision (*fp16* or *bf16*).
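These figures follow almost directly from the parameter count, since full precision stores 4 bytes per parameter and half precision 2 bytes. A minimal sketch of that arithmetic (weights only; activations, gradients, and optimizer state during fine-tuning are not included):

```python
# Rough weight-memory estimate from the reported parameter count.
num_params = 79.4e6  # parameters of t5-efficient-small-kv128

fp32_mb = num_params * 4 / 1e6  # 4 bytes/param -> ~317.6 MB
fp16_mb = num_params * 2 / 1e6  # 2 bytes/param -> ~158.8 MB

print(f"fp32: ~{fp32_mb:.1f} MB, fp16/bf16: ~{fp16_mb:.1f} MB")
```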
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
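For illustration, the span-corruption objective replaces contiguous spans of the input with sentinel tokens and trains the model to reconstruct the masked spans. A minimal sketch of that input/target format using the `transformers` API (the sentences and masked spans are purely illustrative and not taken from C4):

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/t5-efficient-small-kv128")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-kv128")

# Masked spans become sentinel tokens in the input; the target lists the
# masked spans, each preceded by the corresponding sentinel token.
input_text = "The <extra_id_0> walks in <extra_id_1> park"
target_text = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
labels = tokenizer(target_text, return_tensors="pt").input_ids

loss = model(input_ids=input_ids, labels=labels).loss  # MLM-style training loss
```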
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-kv128
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-KV128 (Deep-Narrow version)
==============================================
T5-Efficient-SMALL-KV128 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-kv128 - is of model type Small with the following variations:
* kv is 128
It has 79.4 million parameters and thus requires *ca.* 317.58 MB of memory in full precision (*fp32*)
or 158.79 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-KV16 (Deep-Narrow version)
T5-Efficient-SMALL-KV16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-kv16** - is of model type **Small** with the following variations:
- **kv** is **16**
It has **46.37** million parameters and thus requires *ca.* **185.46 MB** of memory in full precision (*fp32*)
or **92.73 MB** of memory in half precision (*fp16* or *bf16*).
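The *kv* variation can be checked directly on the checkpoint's configuration, which exposes the key/value projection dimension as `d_kv`. A minimal sketch (the expected values follow from the **kv** variation above and the Small row of the table below):

```python
from transformers import T5Config

config = T5Config.from_pretrained("google/t5-efficient-small-kv16")

# For this checkpoint: d_model=512, num_heads=8, d_ff=2048 (Small), d_kv=16.
print(config.d_model, config.num_heads, config.d_ff, config.d_kv)
```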
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
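Before turning to the full example scripts listed below, the following minimal sketch shows what a single supervised fine-tuning step looks like with the `transformers` seq2seq API; the input/target strings are purely illustrative:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("google/t5-efficient-small-kv16")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-kv16")

# One (illustrative) summarization-style training example.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

outputs = model(**inputs, labels=labels)
outputs.loss.backward()  # seq2seq cross-entropy loss, as used by the example scripts
```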
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-kv16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-KV16 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-KV16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-kv16 - is of model type Small with the following variations:
* kv is 16
It has 46.37 million parameters and thus requires *ca.* 185.46 MB of memory in full precision (*fp32*)
or 92.73 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-KV256 (Deep-Narrow version)
T5-Efficient-SMALL-KV256 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-kv256** - is of model type **Small** with the following variations:
- **kv** is **256**
It has **117.14** million parameters and thus requires *ca.* **468.58 MB** of memory in full precision (*fp32*)
or **234.29 MB** of memory in half precision (*fp16* or *bf16*).
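The figures above follow directly from the parameter count. As a rough, minimal sketch (assuming 4 bytes per parameter in *fp32*, 2 bytes in *fp16*/*bf16*, and counting weights only, i.e. no gradients, optimizer states or activations):

```python
# Weights-only memory estimate for t5-efficient-small-kv256
# (4 bytes/parameter in fp32, 2 bytes in fp16/bf16; 1 MB = 10**6 bytes).
n_params = 117.14e6  # parameter count quoted above

fp32_mb = n_params * 4 / 1e6  # ~468.6 MB
fp16_mb = n_params * 2 / 1e6  # ~234.3 MB
print(f"fp32: {fp32_mb:.1f} MB, fp16/bf16: {fp16_mb:.1f} MB")
```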
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, both the encoder depth and the decoder depth correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
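To illustrate the objective, a span-corruption training pair replaces dropped spans in the input with sentinel tokens and asks the model to reconstruct them. The snippet below is only a sketch of that text-to-text format; the exact corruption rate and span lengths used during the original C4 pre-training may differ:

```python
# Illustrative span-corruption (masked span) pair in T5's text-to-text format.
# Sentinel tokens (<extra_id_0>, <extra_id_1>, ...) mark the dropped spans.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-kv256")

inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park", return_tensors="pt")
labels = tokenizer("<extra_id_0> cute dog <extra_id_1> the <extra_id_2>", return_tensors="pt")
print(inputs.input_ids[0].tolist())
print(labels.input_ids[0].tolist())
```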
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal loading and training sketch is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
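Beyond the full example scripts above, a minimal sketch of loading this checkpoint and taking a single seq2seq training step could look as follows (the toy summarization pair, task prefix and learning rate are purely illustrative and not taken from the paper):

```python
# Minimal fine-tuning sketch: load the pretrained checkpoint and run one
# gradient step on a toy summarization pair.
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-kv256"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer(
    "summarize: The paper studies how depth and width trade off under a fixed parameter budget.",
    return_tensors="pt",
)
labels = tokenizer("Depth matters more than width.", return_tensors="pt").input_ids

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = model(**inputs, labels=labels).loss  # standard seq2seq cross-entropy
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")
```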
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-kv256
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-KV256 (Deep-Narrow version)
==============================================
T5-Efficient-SMALL-KV256 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-kv256 - is of model type Small with the following variations:
* kv is 256
It has 117.14 million parameters and thus requires *ca.* 468.58 MB of memory in full precision (*fp32*)
or 234.29 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, both the encoder depth and the decoder depth correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-KV32 (Deep-Narrow version)
T5-Efficient-SMALL-KV32 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-kv32** - is of model type **Small** with the following variations:
- **kv** is **32**
It has **51.08** million parameters and thus requires *ca.* **204.34 MB** of memory in full precision (*fp32*)
or **102.17 MB** of memory in half precision (*fp16* or *bf16*).
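One way to double-check the quoted parameter count is to load the checkpoint and count tensor elements (a minimal sketch, assuming `transformers` and `torch` are installed and the checkpoint can be downloaded):

```python
# Sanity-check of the quoted parameter count for t5-efficient-small-kv32.
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-kv32")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f} M parameters")  # expected to be close to 51.08 M
```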
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint does not specify *el* or *dl*, both the encoder depth and the decoder depth correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a text-to-text classification sketch is shown after the list):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
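The text-classification examples need a small adaptation because an encoder-decoder checkpoint predicts label *text* rather than a class logit. Below is a minimal sketch of casting a classification example into the text-to-text format; the prompt wording and label strings are illustrative choices, not a fixed convention:

```python
# Sketch: cast a classification example into text-to-text form for an
# encoder-decoder checkpoint. Before fine-tuning, generate() will not yet
# produce meaningful labels, since this is a pretrained-only checkpoint.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-kv32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

text = "classify sentiment: this movie was a pleasant surprise"
label = "positive"  # target is the label rendered as text

inputs = tokenizer(text, return_tensors="pt")
labels = tokenizer(label, return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss          # minimized during fine-tuning
generated = model.generate(**inputs, max_new_tokens=3)
print(loss.item(), tokenizer.decode(generated[0], skip_special_tokens=True))
```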
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might still be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-kv32
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-KV32 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-KV32 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-kv32 - is of model type Small with the following variations:
* kv is 32
It has 51.08 million parameters and thus requires *ca.* 204.34 MB of memory in full precision (*fp32*)
or 102.17 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint does not specify *el* or *dl*, both the encoder depth and the decoder depth correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*TensorFlow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might still be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-NL16 (Deep-Narrow version)
T5-Efficient-SMALL-NL16 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally be more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-nl16** - is of model type **Small** with the following variations:
- **nl** is **16**
It has **133.97** million parameters and thus requires *ca.* **535.88 MB** of memory in full precision (*fp32*)
or **267.94 MB** of memory in half precision (*fp16* or *bf16*).
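These figures follow directly from the parameter count (4 bytes per parameter in full precision, 2 bytes in half precision). A minimal back-of-the-envelope sketch in Python, where `133.97e6` is simply the parameter count quoted above:

```python
# Rough memory estimate from the parameter count (bytes per parameter x count).
num_params = 133.97e6  # t5-efficient-small-nl16, as quoted above

fp32_mb = num_params * 4 / 1e6  # full precision: 4 bytes per parameter
fp16_mb = num_params * 2 / 1e6  # half precision (fp16/bf16): 2 bytes per parameter

print(f"fp32: ~{fp32_mb:.2f} MB, fp16/bf16: ~{fp16_mb:.2f} MB")
# fp32: ~535.88 MB, fp16/bf16: ~267.94 MB
```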
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
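For orientation, these abbreviations roughly correspond to the standard `T5Config` fields in `transformers`. A minimal sketch for this *Small*/*nl16* variant, with values read off the tables above (for illustration only; the published checkpoint already ships with its own config):

```python
from transformers import T5Config

# Small architecture with the depth (nl) raised from 6 to 16; values from the tables above.
config = T5Config(
    num_layers=16,          # nl / el: encoder depth
    num_decoder_layers=16,  # dl: decoder depth (defaults to nl when not set explicitly)
    d_model=512,            # dm: dimension of the embedding / hidden vectors
    d_kv=32,                # kv: key/value projection dimension per head
    num_heads=8,            # nh: number of attention heads
    d_ff=2048,              # ff: feed-forward (intermediate) dimension
)
print(config)
```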
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
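Concretely, span corruption drops contiguous spans from the input, replaces each with a sentinel token, and trains the decoder to reproduce the dropped spans behind their sentinels. A schematic example (the sentence is made up; the sentinel naming follows the T5 convention):

```python
# Span-corruption (MLM) objective used for pre-training, schematically:
original = "Thank you for inviting me to your party last week."

# Encoder input: corrupted text, dropped spans replaced by sentinel tokens.
encoder_input = "Thank you <extra_id_0> me to your party <extra_id_1> week."

# Decoder target: the dropped spans, each introduced by its sentinel.
decoder_target = "<extra_id_0> for inviting <extra_id_1> last <extra_id_2>"
```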
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a minimal training-step sketch follows these links):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
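Whichever example you pick, the starting point is the same: load the pretrained checkpoint and feed it (input, label) pairs. A minimal sketch of one training step, assuming `transformers` (with sentencepiece) and a PyTorch backend; the summarization pair below is made up, and in practice the batch would come from your dataset inside a training loop or `Trainer`:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-nl16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# One seq2seq training step: encode input and target, take the LM loss, backprop.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

loss = model(**inputs, labels=labels).loss
loss.backward()  # optimizer.step() etc. belong to your training loop
```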
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-nl16
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-NL16 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-NL16 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally be more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-nl16 - is of model type Small with the following variations:
* nl is 16
It has 133.97 million parameters and thus requires *ca.* 535.88 MB of memory in full precision (*fp32*)
or 267.94 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-NL2 (Deep-Narrow version)
T5-Efficient-SMALL-NL2 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally be more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
## Details model architecture
This model checkpoint - **t5-efficient-small-nl2** - is of model type **Small** with the following variations:
- **nl** is **2**
It has **31.14** million parameters and thus requires *ca.* **124.57 MB** of memory in full precision (*fp32*)
or **62.28 MB** of memory in half precision (*fp16* or *bf16*).
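The parameter count and footprint can also be checked directly against the loaded checkpoint; a small sketch, assuming `transformers` with a PyTorch backend (the printed numbers should land close to the figures above):

```python
import torch
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-nl2")

n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params / 1e6:.2f}M")             # ~31.14M
print(f"fp32 footprint: ~{n_params * 4 / 1e6:.2f} MB")  # ~124.57 MB

# Casting to half precision roughly halves the footprint.
model = model.to(torch.bfloat16)
```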
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
where the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model (a short sketch for running the raw, not-yet-fine-tuned checkpoint follows these links):
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
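Until it has been fine-tuned, the raw checkpoint can only perform the span-filling task it was pretrained on. A minimal sketch that runs the pretrained-only model on a sentinel-masked sentence, assuming `transformers` with a PyTorch backend (the generated output is merely illustrative and not expected to be useful):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "google/t5-efficient-small-nl2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Ask the model to fill in the masked spans, mirroring the pre-training objective.
inputs = tokenizer("The <extra_id_0> walks in <extra_id_1> park.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```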
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues) as they might be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-nl2
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-NL2 (Deep-Narrow version)
============================================
T5-Efficient-SMALL-NL2 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally be more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-nl2 - is of model type Small with the following variations:
* nl is 2
It has 31.14 million parameters and thus requires *ca.* 124.57 MB of memory in full precision (*fp32*)
or 62.28 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
where the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then the number of encoder layers and the number of decoder layers both correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here as they might be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |
null | null |
transformers
|
# T5-Efficient-SMALL-NL20 (Deep-Narrow version)
T5-Efficient-SMALL-NL20 is a variation of [Google's original T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html) following the [T5 model architecture](https://huggingface.co/docs/transformers/model_doc/t5).
It is a *pretrained-only* checkpoint and was released with the
paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)**
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a **Deep-Narrow** model architecture is favorable for **downstream** performance compared to other model architectures
of similar parameter count.
To quote the paper:
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
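For a quick sanity check (a sketch assuming the standard `transformers` configuration attributes, not something stated in the card), the depth of this checkpoint can be read directly from its configuration:

```python
from transformers import AutoConfig

# num_layers is the encoder depth, num_decoder_layers the decoder depth
# (both 20 for this nl20 variant).
config = AutoConfig.from_pretrained("google/t5-efficient-small-nl20")
print(config.num_layers, config.num_decoder_layers)
```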
## Details model architecture
This model checkpoint - **t5-efficient-small-nl20** - is of model type **Small** with the following variations:
- **nl** is **20**
It has **163.35** million parameters and thus requires *ca.* **653.4 MB** of memory in full precision (*fp32*)
or **326.7 MB** of memory in half precision (*fp16* or *bf16*).
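As a rough back-of-the-envelope check (my own arithmetic, not part of the original card), these memory figures follow directly from the parameter count multiplied by the bytes per parameter:

```python
# Rough memory estimate: parameters x bytes per parameter (using 1 MB = 1e6 bytes).
n_params = 163.35e6            # parameters of t5-efficient-small-nl20
print(n_params * 4 / 1e6)      # fp32: ~653.4 MB (4 bytes per parameter)
print(n_params * 2 / 1e6)      # fp16/bf16: ~326.7 MB (2 bytes per parameter)
```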
A summary of the *original* T5 model architectures can be seen here:
| Model | nl (el/dl) | ff | dm | kv | nh | #Params|
| ----| ---- | ---- | ---- | ---- | ---- | ----|
| Tiny | 4/4 | 1024 | 256 | 32 | 4 | 16M|
| Mini | 4/4 | 1536 | 384 | 32 | 8 | 31M|
| Small | 6/6 | 2048 | 512 | 32 | 8 | 60M|
| Base | 12/12 | 3072 | 768 | 64 | 12 | 220M|
| Large | 24/24 | 4096 | 1024 | 64 | 16 | 738M|
| Xl | 24/24 | 16384 | 1024 | 128 | 32 | 3B|
| XXl | 24/24 | 65536 | 1024 | 128 | 128 | 11B|
whereas the following abbreviations are used:
| Abbreviation | Definition |
| ----| ---- |
| nl | Number of transformer blocks (depth) |
| dm | Dimension of embedding vector (output vector of transformers block) |
| kv | Dimension of key/value projection matrix |
| nh | Number of attention heads |
| ff | Dimension of intermediate vector within transformer block (size of feed-forward projection matrix) |
| el | Number of transformer blocks in the encoder (encoder depth) |
| dl | Number of transformer blocks in the decoder (decoder depth) |
| sh | Signifies that attention heads are shared |
| skv | Signifies that key-values projection matrices are tied |
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
## Pre-Training
The checkpoint was pretrained on the [Colossal, Cleaned version of Common Crawl (C4)](https://huggingface.co/datasets/c4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
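To illustrate the objective (a hand-made example; the sentinel tokens `<extra_id_0>`, `<extra_id_1>`, ... are real entries of the T5 vocabulary, but the sentence itself is invented), span corruption replaces contiguous spans of the input with sentinels and asks the model to reconstruct the dropped spans:

```python
# Hypothetical span-corruption pair in the T5 MLM format.
original = "The quick brown fox jumps over the lazy dog"
inputs   = "The quick <extra_id_0> fox jumps <extra_id_1> the lazy dog"   # spans replaced by sentinels
targets  = "<extra_id_0> brown <extra_id_1> over <extra_id_2>"            # sentinels followed by the dropped spans
```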
## Fine-Tuning
**Note**: This model is a **pretrained** checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
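As a minimal sketch of what fine-tuning starts from (assuming the standard `transformers` seq2seq API; the training pair below is made up), the checkpoint can be loaded and given a single labeled example like this:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the pretrained-only checkpoint; it still needs task-specific fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("google/t5-efficient-small-nl20")
model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-small-nl20")

# One hypothetical summarization-style training pair.
batch = tokenizer("summarize: The quick brown fox jumps over the lazy dog.", return_tensors="pt")
labels = tokenizer("A fox jumps over a dog.", return_tensors="pt").input_ids

# The forward pass with labels returns the cross-entropy loss used during fine-tuning.
loss = model(input_ids=batch.input_ids, attention_mask=batch.attention_mask, labels=labels).loss
loss.backward()
```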
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/pytorch/summarization)
- [Question Answering](https://github.com/huggingface/transformers/blob/master/examples/pytorch/question-answering/run_seq2seq_qa.py)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/pytorch/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/tensorflow/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
- [Summarization](https://github.com/huggingface/transformers/tree/master/examples/flax/summarization)
- [Text Classification](https://github.com/huggingface/transformers/tree/master/examples/flax/text-classification) - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
## Downstream Performance
TODO: Add table if available
## Computational Complexity
TODO: Add table if available
## More information
We strongly recommend that the reader go carefully through the original paper **[Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers](https://arxiv.org/abs/2109.10686)** to get a more nuanced understanding of this model checkpoint.
As explained in the following [issue](https://github.com/google-research/google-research/issues/986#issuecomment-1035051145), checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept [here](https://huggingface.co/NewT5SharedHeadsSharedKeyValues), as they might potentially be ported in the future.
|
{"language": ["en"], "license": "apache-2.0", "tags": ["deep-narrow"], "datasets": ["c4"], "inference": false}
|
text2text-generation
|
google/t5-efficient-small-nl20
|
[
"transformers",
"pytorch",
"tf",
"jax",
"t5",
"text2text-generation",
"deep-narrow",
"en",
"dataset:c4",
"arxiv:2109.10686",
"license:apache-2.0",
"autotrain_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[
"2109.10686"
] |
[
"en"
] |
TAGS
#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us
|
T5-Efficient-SMALL-NL20 (Deep-Narrow version)
=============================================
T5-Efficient-SMALL-NL20 is a variation of Google's original T5 following the T5 model architecture.
It is a *pretrained-only* checkpoint and was released with the
paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers
by *Yi Tay, Mostafa Dehghani, Jinfeng Rao, William Fedus, Samira Abnar, Hyung Won Chung, Sharan Narang, Dani Yogatama, Ashish Vaswani, Donald Metzler*.
In a nutshell, the paper indicates that a Deep-Narrow model architecture is favorable for downstream performance compared to other model architectures
of similar parameter count.
To quote the paper:
>
> We generally recommend a DeepNarrow strategy where the model’s depth is preferentially increased
> before considering any other forms of uniform scaling across other dimensions. This is largely due to
> how much depth influences the Pareto-frontier as shown in earlier sections of the paper. Specifically, a
> tall small (deep and narrow) model is generally more efficient compared to the base model. Likewise,
> a tall base model might also generally more efficient compared to a large model. We generally find
> that, regardless of size, even if absolute performance might increase as we continue to stack layers,
> the relative gain of Pareto-efficiency diminishes as we increase the layers, converging at 32 to 36
> layers. Finally, we note that our notion of efficiency here relates to any one compute dimension, i.e.,
> params, FLOPs or throughput (speed). We report all three key efficiency metrics (number of params,
> FLOPS and speed) and leave this decision to the practitioner to decide which compute dimension to
> consider.
>
>
>
To be more precise, *model depth* is defined as the number of transformer blocks that are stacked sequentially.
A sequence of word embeddings is therefore processed sequentially by each transformer block.
Details model architecture
--------------------------
This model checkpoint - t5-efficient-small-nl20 - is of model type Small with the following variations:
* nl is 20
It has 163.35 million parameters and thus requires *ca.* 653.4 MB of memory in full precision (*fp32*)
or 326.7 MB of memory in half precision (*fp16* or *bf16*).
A summary of the *original* T5 model architectures can be seen here:
whereas the following abbreviations are used:
If a model checkpoint has no specific *el* or *dl*, then both the number of encoder and decoder layers correspond to *nl*.
Pre-Training
------------
The checkpoint was pretrained on the Colossal, Cleaned version of Common Crawl (C4) for 524288 steps using
the span-based masked language modeling (MLM) objective.
Fine-Tuning
-----------
Note: This model is a pretrained checkpoint and has to be fine-tuned for practical usage.
The checkpoint was pretrained in English and is therefore only useful for English NLP tasks.
You can follow one of the following examples on how to fine-tune the model:
*PyTorch*:
* Summarization
* Question Answering
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*Tensorflow*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
*JAX/Flax*:
* Summarization
* Text Classification - *Note*: You will have to slightly adapt the training example here to make it work with an encoder-decoder model.
Downstream Performance
----------------------
TODO: Add table if available
Computational Complexity
------------------------
TODO: Add table if available
More information
----------------
We strongly recommend that the reader go carefully through the original paper Scale Efficiently: Insights from Pre-training and Fine-tuning Transformers to get a more nuanced understanding of this model checkpoint.
As explained in the following issue, checkpoints including the *sh* or *skv*
model architecture variations have *not* been ported to Transformers, as they are probably of limited practical use and lack a more detailed description. Those checkpoints are kept here, as they might potentially be ported in the future.
|
[] |
[
"TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
79
] |
[
"passage: TAGS\n#transformers #pytorch #tf #jax #t5 #text2text-generation #deep-narrow #en #dataset-c4 #arxiv-2109.10686 #license-apache-2.0 #autotrain_compatible #has_space #text-generation-inference #region-us \n"
] |
[
-0.04254638031125069,
0.11769898980855942,
-0.003979119006544352,
0.09715647995471954,
0.05742545425891876,
-0.018159177154302597,
0.12103479355573654,
0.16874375939369202,
-0.08899904787540436,
-0.018448568880558014,
0.15077979862689972,
0.14261233806610107,
0.018900517374277115,
0.11473057419061661,
-0.06461337953805923,
-0.20998615026474,
0.05409939959645271,
0.010116108693182468,
-0.06038738042116165,
0.10815446078777313,
0.1137012466788292,
-0.05059779807925224,
0.08365850895643234,
-0.03556648641824722,
-0.10376870632171631,
0.024480491876602173,
0.03199612349271774,
-0.109830342233181,
0.0907052755355835,
0.05546672269701958,
0.022783510386943817,
0.07769568264484406,
-0.012909478507936,
-0.08137255907058716,
0.03527992591261864,
0.06325751543045044,
-0.07716087251901627,
0.10825901478528976,
0.07234556972980499,
0.02387293055653572,
0.056406013667583466,
0.012652678415179253,
-0.03781942278146744,
0.04453824833035469,
-0.09842909127473831,
-0.1612836718559265,
-0.07658370584249496,
0.06557530909776688,
0.01654265820980072,
0.09187588840723038,
0.026537200435996056,
0.17466366291046143,
-0.0776221975684166,
0.07682986557483673,
0.1788335144519806,
-0.3719708323478699,
-0.012459357269108295,
0.039841342717409134,
0.0223538801074028,
0.09488000720739365,
-0.03147624060511589,
0.025146549567580223,
0.0564388707280159,
0.016575537621974945,
0.08057033270597458,
-0.046517904847860336,
-0.302586168050766,
0.05843212082982063,
-0.08216454088687897,
-0.06840910762548447,
0.32611581683158875,
0.003956862725317478,
0.04188018664717674,
0.00798291340470314,
-0.12394081056118011,
-0.02163282223045826,
0.0195434782654047,
0.0032748719677329063,
0.032767634838819504,
0.08525314182043076,
0.044923122972249985,
-0.07708670198917389,
-0.15070722997188568,
-0.019160594791173935,
-0.2269735038280487,
-0.013622233644127846,
-0.0002429143787594512,
0.0777311772108078,
-0.1344832181930542,
0.06718359142541885,
0.03819330781698227,
-0.14985118806362152,
0.05865996703505516,
-0.053782738745212555,
0.07812971621751785,
0.036263518035411835,
-0.04736582189798355,
-0.05754081904888153,
0.11616957187652588,
0.15345695614814758,
0.02261003479361534,
-0.01643592305481434,
-0.08811437338590622,
0.09266255050897598,
-0.026918523013591766,
0.04341743513941765,
-0.09748063236474991,
-0.00868972484022379,
0.12478282302618027,
-0.05117316544055939,
0.0747646689414978,
-0.07778767496347427,
-0.1664166897535324,
-0.06328913569450378,
0.040551841259002686,
0.09510618448257446,
0.1051187664270401,
0.056759487837553024,
-0.023194726556539536,
-0.030397512018680573,
0.07955987006425858,
-0.09160031378269196,
-0.005603222642093897,
0.025780677795410156,
-0.026714367792010307,
0.01941857486963272,
0.053136687725782394,
0.026423322036862373,
-0.1023850366473198,
-0.040010347962379456,
-0.07218720763921738,
-0.03220491111278534,
-0.014276556670665741,
-0.09352828562259674,
0.09258051216602325,
-0.041519396007061005,
-0.010359606705605984,
-0.164694681763649,
-0.15345411002635956,
0.034552332013845444,
0.030709778890013695,
-0.01925623044371605,
-0.11409030109643936,
0.03517638146877289,
-0.07223428785800934,
0.0524461530148983,
-0.062394801527261734,
0.060283415019512177,
-0.07230525463819504,
0.07424701750278473,
-0.11293698847293854,
0.06025610864162445,
-0.18567818403244019,
0.05149489641189575,
-0.14390838146209717,
-0.011732026003301144,
-0.0034681179095059633,
-0.0026324933860450983,
-0.03497390076518059,
0.12001846730709076,
-0.07861626148223877,
-0.01586638018488884,
-0.026480574160814285,
-0.017366811633110046,
0.011687102727591991,
0.11682283878326416,
-0.20636002719402313,
-0.029771776869893074,
0.11065364629030228,
-0.07490658015012741,
-0.20088379085063934,
0.09287188947200775,
0.023517878726124763,
0.05051181837916374,
0.029591629281640053,
0.23399794101715088,
0.06159195676445961,
0.011682100594043732,
0.05208359658718109,
0.12429419904947281,
-0.08257151395082474,
-0.13571636378765106,
0.08883780241012573,
-0.05126161873340607,
-0.03595305606722832,
0.023069968447089195,
0.030777042731642723,
0.06358307600021362,
-0.0025246499571949244,
-0.08910442888736725,
-0.06094134971499443,
-0.036478474736213684,
0.05339744687080383,
-0.021482067182660103,
0.07859036326408386,
-0.06669094413518906,
-0.013309878297150135,
-0.03104024939239025,
0.0503816194832325,
0.02133384346961975,
0.050026122480630875,
-0.020997153595089912,
0.09722495824098587,
-0.01566113904118538,
0.05979827046394348,
-0.13485535979270935,
-0.031013233587145805,
0.02129344269633293,
0.06124376505613327,
0.03970795497298241,
0.11202684789896011,
0.03720906749367714,
-0.04102879762649536,
0.014936007559299469,
-0.005811616312712431,
0.08045772463083267,
0.027414752170443535,
-0.06145905330777168,
-0.15039584040641785,
0.04751947149634361,
-0.025240251794457436,
0.014741656370460987,
-0.032690294086933136,
0.03091658465564251,
0.014925787225365639,
0.08182704448699951,
-0.010361547581851482,
0.1031244769692421,
0.027928920462727547,
-0.029726244509220123,
-0.06268949061632156,
-0.017913520336151123,
0.07720652967691422,
0.029109550639986992,
-0.047446779906749725,
0.20497991144657135,
-0.14903467893600464,
0.2931467890739441,
0.18341439962387085,
-0.16676819324493408,
0.08604627847671509,
0.012637350708246231,
-0.027452420443296432,
-0.02700759470462799,
0.04312296211719513,
-0.0002312481083208695,
-0.019762849435210228,
-0.023677224293351173,
0.14846713840961456,
-0.11485148221254349,
-0.018409807235002518,
-0.01506184134632349,
-0.04416215419769287,
0.004082539584487677,
0.07273254543542862,
0.09416019171476364,
-0.18429860472679138,
0.166764497756958,
0.3547271490097046,
-0.01291155256330967,
0.13091467320919037,
-0.08387233316898346,
-0.05125956982374191,
0.027284277603030205,
-0.0168942678719759,
-0.03401607647538185,
0.0034082334022969007,
-0.03649473935365677,
0.0381108783185482,
0.11398807168006897,
0.03455577418208122,
0.036516156047582626,
-0.1135324090719223,
-0.044499821960926056,
0.004434969276189804,
-0.042860690504312515,
-0.011679698713123798,
0.07209854573011398,
0.01750793494284153,
0.13443109393119812,
-0.023565933108329773,
-0.07931440323591232,
0.10756431519985199,
0.011662446893751621,
-0.08525500446557999,
0.15606653690338135,
-0.19391785562038422,
-0.25945115089416504,
-0.09803029894828796,
-0.04224206879734993,
-0.07426200807094574,
-0.015451968647539616,
0.11215955764055252,
-0.05802702158689499,
-0.027517355978488922,
-0.07662458717823029,
-0.06253543496131897,
-0.019675450399518013,
0.017338739708065987,
-0.04889349639415741,
0.03979690000414848,
0.02850809134542942,
-0.145213782787323,
-0.009166746400296688,
0.010719900019466877,
-0.048146460205316544,
0.08297647535800934,
-0.030946021899580956,
0.07520534098148346,
0.16089968383312225,
0.007302524987608194,
0.020920345559716225,
-0.0317072756588459,
0.13318248093128204,
-0.007346819620579481,
0.038225483149290085,
0.25266408920288086,
0.002947733737528324,
0.05588952451944351,
0.14811481535434723,
0.005307730287313461,
-0.02390916459262371,
0.007619152776896954,
-0.04094370827078819,
-0.0423840768635273,
-0.2602440118789673,
-0.09032062441110611,
-0.10900454968214035,
0.12619061768054962,
0.04594874009490013,
0.0908038541674614,
0.1265239119529724,
0.03578295186161995,
-0.021931469440460205,
0.05439893156290054,
-0.008785656653344631,
0.047103401273489,
0.23059585690498352,
-0.017776379361748695,
0.11084158718585968,
-0.09735196828842163,
-0.01949990727007389,
0.1463218629360199,
0.05132850259542465,
0.08437740802764893,
0.02501460164785385,
0.1643269807100296,
0.044249627739191055,
0.1991615742444992,
0.05997395142912865,
0.10893027484416962,
0.040593549609184265,
-0.00022699873079545796,
-0.05014732480049133,
-0.07905711233615875,
-0.006585677154362202,
0.02599361166357994,
-0.04385573789477348,
-0.06813118606805801,
-0.04009837657213211,
-0.029320431873202324,
0.07329276204109192,
0.1426927149295807,
0.06962130218744278,
-0.16768327355384827,
0.023342620581388474,
0.035327453166246414,
0.003905769670382142,
-0.054488979279994965,
0.07075091451406479,
0.051872335374355316,
-0.08004653453826904,
0.005893490742892027,
-0.00951821357011795,
0.11690004169940948,
0.032685536891222,
0.06957677751779556,
-0.04686485603451729,
-0.0030130723025649786,
0.018698105588555336,
0.11896239966154099,
-0.29746487736701965,
0.1698092669248581,
0.010544862598180771,
-0.06227630749344826,
-0.09902402758598328,
-0.00588437682017684,
0.054542094469070435,
0.11532649397850037,
0.055573537945747375,
0.012330549769103527,
-0.031553976237773895,
0.044118281453847885,
-0.12396524101495743,
0.05786973983049393,
0.022624468430876732,
-0.003157981438562274,
-0.05245694890618324,
-0.058386072516441345,
-0.003038334660232067,
0.03928688168525696,
0.15380634367465973,
-0.05484093353152275,
-0.10479079186916351,
0.08026454597711563,
0.08152925223112106,
-0.0521031878888607,
-0.03465885668992996,
-0.04786438122391701,
-0.07334746420383453,
0.16139961779117584,
0.0038780034519732,
-0.05938153713941574,
-0.09951896220445633,
-0.055427975952625275,
0.08259836584329605,
-0.06442441791296005,
0.0473603680729866,
-0.06371449679136276,
-0.003896909300237894,
-0.03224900737404823,
-0.23246993124485016,
0.14722178876399994,
-0.09396586567163467,
-0.08036700636148453,
-0.025796381756663322,
0.09547383338212967,
-0.14977768063545227,
0.045967645943164825,
0.004525578115135431,
0.014795131050050259,
-0.11366591602563858,
-0.08210735023021698,
-0.008047297596931458,
-0.022639119997620583,
0.0016309976344928145,
-0.010441416874527931,
-0.08192944526672363,
-0.06162133440375328,
0.04252227395772934,
-0.028142213821411133,
0.2735251784324646,
0.1964879333972931,
-0.10515137016773224,
0.18773908913135529,
0.16352398693561554,
-0.04658016934990883,
-0.29743441939353943,
-0.11019548028707504,
-0.12996283173561096,
-0.04663108289241791,
0.04086287319660187,
-0.1585066020488739,
0.050556350499391556,
0.05106563866138458,
-0.0750616118311882,
0.09970050305128098,
-0.27465879917144775,
-0.08723904937505722,
0.17700892686843872,
-0.02031525783240795,
0.22051025927066803,
-0.14184220135211945,
-0.07186169177293777,
-0.04820942506194115,
-0.12139235436916351,
0.17245012521743774,
-0.2380591630935669,
0.06899460405111313,
-0.029217194765806198,
-0.008685052394866943,
0.007933665066957474,
-0.018741577863693237,
0.0885966345667839,
-0.061949390918016434,
0.062338586896657944,
-0.13977846503257751,
0.02662869542837143,
0.14730112254619598,
-0.024196898564696312,
0.07244403660297394,
-0.18847183883190155,
0.05778153985738754,
-0.08966945111751556,
0.03430340811610222,
-0.07842088490724564,
0.06134519353508949,
-0.011852484196424484,
-0.08546144515275955,
-0.045877229422330856,
-0.04529910162091255,
0.026898114010691643,
-0.019824014976620674,
0.14342354238033295,
0.051433585584163666,
0.07276804000139236,
0.21056409180164337,
0.08166195452213287,
-0.10135101526975632,
0.03430200740695,
-0.06402049213647842,
-0.06924732774496078,
0.07884487509727478,
-0.22806335985660553,
0.03744830563664436,
0.08266454935073853,
-0.04226420447230339,
0.04946140572428703,
0.052993033081293106,
-0.016860216856002808,
-0.05304180830717087,
0.12007668614387512,
-0.1866276115179062,
-0.004738918971270323,
-0.042245570570230484,
0.07792022824287415,
-0.00271809846162796,
0.013094721361994743,
0.13443593680858612,
-0.022345639765262604,
-0.02478000894188881,
0.04490942135453224,
0.029744263738393784,
-0.040930189192295074,
0.05754847079515457,
0.08785868436098099,
0.006388948764652014,
-0.12671534717082977,
0.14946810901165009,
0.06800685077905655,
-0.04305084049701691,
0.033907532691955566,
0.177393838763237,
-0.10773411393165588,
-0.14268213510513306,
0.06296763569116592,
0.0801798552274704,
-0.08126277476549149,
-0.05839155986905098,
-0.05308040976524353,
-0.09010263532400131,
0.06573419272899628,
0.0998682901263237,
0.06007777154445648,
0.06004951521754265,
-0.00596815999597311,
-0.09923981875181198,
0.03332768380641937,
0.06295625120401382,
-0.04058876633644104,
0.008060302585363388,
-0.10473950207233429,
0.045695576816797256,
-0.023916956037282944,
0.12450843304395676,
-0.047944530844688416,
-0.004756059497594833,
-0.07930737733840942,
-0.014147129841148853,
-0.16143319010734558,
-0.03533610329031944,
-0.015655232593417168,
-0.031050140038132668,
-0.011654541827738285,
-0.037266362458467484,
-0.057942964136600494,
0.014022091403603554,
-0.11481281369924545,
-0.04800979420542717,
-0.03402962535619736,
0.0725952535867691,
-0.12232708185911179,
-0.049896348267793655,
0.05211661756038666,
0.002104011829942465,
0.15336973965168,
0.05356777831912041,
-0.0653446763753891,
0.08119096606969833,
-0.13408999145030975,
-0.12971603870391846,
0.051719389855861664,
0.06432921439409256,
0.043826624751091,
0.04869408160448074,
-0.02088754065334797,
0.07829948514699936,
-0.040209803730249405,
0.018000219017267227,
0.02288087271153927,
-0.10663481801748276,
-0.01837669312953949,
-0.058192621916532516,
-0.08236733078956604,
-0.042992256581783295,
-0.04716731980443001,
0.05688028410077095,
0.03358939290046692,
0.1315603405237198,
-0.02463277243077755,
0.04496198892593384,
-0.1234222799539566,
0.011819138191640377,
-0.037856608629226685,
-0.15462663769721985,
-0.15649162232875824,
-0.03842226043343544,
0.0035742870531976223,
-0.0517486073076725,
0.139879047870636,
0.04080154001712799,
-0.13167423009872437,
0.03628997504711151,
0.10010296106338501,
0.01368250884115696,
0.014776254072785378,
0.23225435614585876,
0.04998161271214485,
-0.018728168681263924,
-0.06183699518442154,
0.024647627025842667,
0.030662257224321365,
0.045684292912483215,
0.09585291892290115,
0.098447784781456,
0.0817013680934906,
0.07843443006277084,
0.05855492502450943,
-0.06770267337560654,
-0.061010636389255524,
-0.04361327737569809,
0.040224719792604446,
0.13395404815673828,
-0.024432659149169922,
0.033528950065374374,
0.1924162358045578,
0.0037106177769601345,
0.009423993527889252,
-0.04152887314558029,
-0.017733968794345856,
-0.14542320370674133,
-0.1614256650209427,
-0.07786951959133148,
-0.1044926568865776,
-0.05325547605752945,
-0.07906834036111832,
0.04594474658370018,
0.056503795087337494,
0.0563795305788517,
-0.061329182237386703,
0.01636219583451748,
0.02679457701742649,
-0.10182344168424606,
0.03694610297679901,
-0.03469136357307434,
0.006349622271955013,
-0.04007522016763687,
-0.038214363157749176,
-0.025946468114852905,
0.006463898345828056,
-0.036759473383426666,
0.025820419192314148,
0.035642098635435104,
0.05230894312262535,
-0.13891200721263885,
-0.10057123005390167,
-0.026119988411664963,
-0.0025455462746322155,
0.004877444822341204,
0.16928228735923767,
0.05569462478160858,
-0.004741322714835405,
0.08905092626810074,
0.22920583188533783,
-0.061281468719244,
-0.1312342882156372,
-0.049911003559827805,
0.09233519434928894,
-0.0064275385811924934,
0.017880141735076904,
-0.005018583498895168,
-0.020604806020855904,
-0.0725020170211792,
0.22346173226833344,
0.3339496850967407,
-0.15270039439201355,
0.031814269721508026,
-0.005658406764268875,
0.008391441777348518,
0.0135980024933815,
0.13416580855846405,
0.12073924392461777,
0.13796046376228333,
-0.07346078753471375,
-0.003171829506754875,
-0.043010156601667404,
0.022048220038414,
-0.1703283041715622,
0.08756586164236069,
0.030671700835227966,
-0.1008283868432045,
0.01701170764863491,
0.05697766691446304,
-0.08465685695409775,
0.1192784383893013,
-0.11409296840429306,
-0.15792979300022125,
-0.09335266798734665,
-0.01994713582098484,
0.14672261476516724,
0.05890607833862305,
0.022856898605823517,
-0.05014454200863838,
0.014295718632638454,
0.01602124236524105,
-0.02176312357187271,
-0.18699631094932556,
-0.015018213540315628,
0.09635619819164276,
-0.11417221277952194,
0.1020013615489006,
0.0028105208184570074,
0.06466713547706604,
0.09407299011945724,
0.062339868396520615,
-0.11728609353303909,
0.07787041366100311,
0.02412484958767891,
-0.029193071648478508,
0.0010519442148506641,
-0.08994314819574356,
0.0010372919496148825,
-0.05440681427717209,
0.07694657146930695,
-0.04158562794327736,
0.008002988062798977,
0.06277894228696823,
-0.014922311529517174,
-0.024803679436445236,
-0.015568822622299194,
-0.048738058656454086,
0.07509893923997879,
0.044263169169425964,
-0.049780312925577164,
-0.03658917546272278,
-0.08053407073020935,
-0.055552564561367035,
0.02343747951090336,
-0.18435139954090118,
-0.06214624270796776,
0.00457345973700285,
-0.04034147039055824,
0.025593820959329605,
0.04375770315527916,
-0.12093635648488998,
-0.02164360322058201,
-0.09835075587034225,
-0.010218717157840729,
-0.16810116171836853,
0.06120442599058151,
0.11138508468866348,
-0.0270224679261446,
0.02055194601416588,
0.037483636289834976,
0.010804940015077591,
0.03337160125374794,
-0.112313412129879,
-0.07110311090946198
] |